How can I change Dropout during training? For example
Dropout= [0.1, 0.2, 0.3]
I tried passing it as a list, but I couldn't make it work.
To change the dropout probability during training, you should use the functional
version, i.e. torch.nn.functional.dropout.
The functional version takes the input tensor, the probability p, and a training
flag (pass self.training so dropout is only applied in training mode).
Thus, you can alter the dropout probability in your forward
method, according to your needs.
For example, you can do in your forward
method:
def forward(self, x):
    ...
    # apply some layers to the input
    h = self.my_layers(x)
    # set the value of p
    p = self.get_value_for_p()
    # apply dropout with the new p
    h = torch.nn.functional.dropout(h, p, self.training)
    ...
More on the functional version of dropout here: https://pytorch.org/docs/stable/nn.functional.html#dropout-functions
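To make this concrete, here is a minimal, self-contained sketch stepping through the schedule [0.1, 0.2, 0.3] from the question, one value per epoch. The module, the `dropout_schedule` attribute, and the `get_value_for_p` helper are illustrative names, not part of any PyTorch API:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ScheduledDropoutNet(nn.Module):
    def __init__(self, dropout_schedule=(0.1, 0.2, 0.3)):
        super().__init__()
        self.fc1 = nn.Linear(10, 20)
        self.fc2 = nn.Linear(20, 2)
        self.dropout_schedule = dropout_schedule
        self.current_epoch = 0  # updated from the training loop

    def get_value_for_p(self):
        # clamp to the last value once the schedule is exhausted
        idx = min(self.current_epoch, len(self.dropout_schedule) - 1)
        return self.dropout_schedule[idx]

    def forward(self, x):
        h = torch.relu(self.fc1(x))
        # p can change on every call with the functional version;
        # self.training ensures dropout is disabled in eval mode
        p = self.get_value_for_p()
        h = F.dropout(h, p, self.training)
        return self.fc2(h)

model = ScheduledDropoutNet()
for epoch in range(3):
    model.current_epoch = epoch  # dropout uses p = 0.1, then 0.2, then 0.3
    model.train()
    out = model(torch.randn(4, 10))
```

Note that with nn.Dropout (the module version) p is fixed at construction time, which is why the functional form is the natural fit for a schedule like this.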