
Changing Dropout value during training


How can I change the dropout probability during training? For example:

Dropout= [0.1, 0.2, 0.3]

I tried passing it as a list, but I couldn't make it work.


Solution

  • To change the dropout probability during training, you should use the functional version, i.e. torch.nn.functional.dropout.

    The input arguments to the functional version of dropout are:

    • the input tensor
    • the dropout probability (which you can alter)
    • a boolean indicating whether the model is in training mode (you can pass self.training)
    • and a flag to indicate if you want the operation to be performed in place.

    Thus, you can alter the dropout probability in your forward method, according to your needs.

    For example, you can do the following in your forward method (a fuller, self-contained sketch follows at the end of this answer):

    
    def forward(self, x):
        ...
        # apply some layers to the input
        h = self.my_layers(x)

        # set the value of p (e.g. based on the current epoch or step)
        p = self.get_value_for_p()

        # apply dropout with the new p; self.training disables it in eval mode
        h = torch.nn.functional.dropout(h, p, self.training)
        ...
    

    More on the functional version of dropout can be found here: https://pytorch.org/docs/stable/nn.functional.html#dropout-functions
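
    For completeness, here is a minimal, self-contained sketch of how the list from the question ([0.1, 0.2, 0.3]) could drive the dropout probability, one value per epoch. The layer sizes and the set_epoch helper are illustrative assumptions, not part of any fixed API:

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class Net(nn.Module):
        def __init__(self, dropout_schedule=(0.1, 0.2, 0.3)):
            super().__init__()
            self.fc1 = nn.Linear(784, 256)
            self.fc2 = nn.Linear(256, 10)
            self.dropout_schedule = dropout_schedule
            self.p = dropout_schedule[0]  # current dropout probability

        def set_epoch(self, epoch):
            # pick the probability for this epoch, clamping to the last entry
            idx = min(epoch, len(self.dropout_schedule) - 1)
            self.p = self.dropout_schedule[idx]

        def forward(self, x):
            h = torch.relu(self.fc1(x))
            # functional dropout lets p vary between calls;
            # self.training makes it a no-op in eval mode
            h = F.dropout(h, p=self.p, training=self.training)
            return self.fc2(h)

    model = Net()
    for epoch in range(3):
        model.set_epoch(epoch)            # p becomes 0.1, then 0.2, then 0.3
        out = model(torch.randn(8, 784))  # dummy batch, for illustration only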