How to use nn.Dropout to randomly drop neurons
Preface:
Both PyTorch and TensorFlow provide a dropout layer (nn.Dropout in PyTorch). Its usage is introduced below.
1. torch.nn.Dropout
Documentation:
r"""During training, randomly zeroes some of the elements of the input
tensor with probability :attr:`p` using samples from a Bernoulli
distribution. Each channel will be zeroed out independently on every forward
call.
This has proven to be an effective technique for regularization and
preventing the co-adaptation of neurons as described in the paper
`Improving neural networks by preventing co-adaptation of feature
detectors`_ .
Furthermore, the outputs are scaled by a factor of :math:`\frac{1}{1-p}` during
training. This means that during evaluation the module simply computes an
identity function.
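The mechanics described in the docstring (zero with probability p, scale survivors by 1/(1-p), identity at evaluation) can be sketched in plain Python. This is only an illustration of the "inverted dropout" idea, not PyTorch's actual implementation:

```python
import random

def dropout(x, p=0.5, training=True):
    """Inverted dropout: during training, zero each element with
    probability p (Bernoulli sample) and scale survivors by 1/(1-p);
    at evaluation time, return the input unchanged (identity)."""
    if not training or p == 0.0:
        return list(x)  # evaluation mode: identity function
    scale = 1.0 / (1.0 - p)
    # keep each element with probability 1-p, otherwise zero it
    return [xi * scale if random.random() >= p else 0.0 for xi in x]

random.seed(0)
print(dropout([1.0, 2.0, 3.0, 4.0], p=0.5))   # survivors are doubled, the rest are 0.0
print(dropout([1.0, 2.0], p=0.5, training=False))  # [1.0, 2.0]
```

Scaling by 1/(1-p) during training keeps the expected value of each element unchanged, which is exactly why the module can be a plain identity function at evaluation time.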
Rough translation:
During training, some elements of the input are randomly zeroed with probability p, using samples from a Bernoulli distribution; each channel is zeroed independently on every forward call. This has proven to be an effective regularization technique that prevents the co-adaptation of neurons, as described in the paper "Improving neural networks by preventing co-adaptation of feature detectors". In addition, the outputs are scaled by 1/(1-p) during training, so during evaluation the module simply computes the identity function.
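A minimal usage sketch of the module itself (assuming PyTorch is installed); note that train() and eval() switch the dropout behavior described above:

```python
import torch
import torch.nn as nn

drop = nn.Dropout(p=0.5)  # each element is zeroed with probability 0.5

x = torch.ones(8)
drop.train()       # training mode: dropout active, survivors scaled by 1/(1-p)
y_train = drop(x)  # elements are either 0.0 or 2.0

drop.eval()        # evaluation mode: identity function
y_eval = drop(x)   # equal to x
```

In a full model, calling model.train() or model.eval() propagates the mode to every Dropout submodule, so forgetting to call model.eval() before inference is a common source of noisy predictions.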
