Tags: python, deep-learning, pytorch, dropout

Dropout Layer with zero dropping rate


I'm having trouble understanding a certain aspect of dropout layers in PyTorch.

As stated in the PyTorch documentation, the constructor signature is torch.nn.Dropout(p=0.5, inplace=False), where p is the probability of an element being zeroed.

What does this layer do when choosing p=0? Does it change its input in any way?


Solution

  • Dropout with p=0 is equivalent to the identity operation: no elements are zeroed, and the 1/(1 - p) rescaling applied during training equals 1, so the input passes through unchanged (see the sketch below).

    In fact, this is exactly how a Dropout module behaves when set to eval mode. As the PyTorch documentation puts it:

    During evaluation the module simply computes an identity function.
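As a quick check, here is a minimal sketch (assuming a standard PyTorch install) that verifies both points: Dropout(p=0) leaves its input unchanged even in training mode, and any Dropout module acts as the identity in eval mode.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
x = torch.randn(3, 4)

# p=0: nothing is zeroed and the 1/(1 - p) scaling is 1,
# so the output equals the input even in training mode.
drop_zero = nn.Dropout(p=0.0)
drop_zero.train()
print(torch.equal(drop_zero(x), x))  # True

# For comparison: in eval mode, Dropout is an identity for any p.
drop_half = nn.Dropout(p=0.5)
drop_half.eval()
print(torch.equal(drop_half(x), x))  # True
```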