Tags: python, pytorch, random-seed, dropout

PyTorch non-deterministic dropout


I'm trying to make the output of a BLSTM deterministic. After investigating, it turned out that my dropout layer creates non-deterministic dropout masks, so I researched how to fix the random seed in PyTorch. I found this page and other suggestions, and although I put everything into my code, it did not help. Here is my code:

import sys 
import random 
import datetime as dt 

import numpy as np 
import torch 

torch.manual_seed(42) 
torch.cuda.manual_seed(42) 
np.random.seed(42)
random.seed(42) 
torch.backends.cudnn.deterministic = True  

ex = torch.ones(10) 
torch.nn.functional.dropout(ex, p=0.5, training=True)                  
# Out[29]: tensor([0., 0., 2., 0., 0., 0., 0., 0., 2., 2.])

torch.nn.functional.dropout(ex, p=0.5, training=True)                  
# Out[30]: tensor([0., 2., 0., 2., 2., 0., 0., 2., 2., 2.])

How can I get deterministic output from dropout for the same input?


Solution

  • Every time you want the same output, you need to reset the random seed again:

    >>> import torch
    
    >>> torch.manual_seed(42)
    <torch._C.Generator object at 0x127cd9170>
    
    >>> ex = torch.ones(10)
    
    >>> torch.nn.functional.dropout(ex, p=0.5, training=True)
    tensor([0., 0., 2., 2., 2., 2., 2., 0., 2., 0.])
    
    >>> torch.manual_seed(42)
    <torch._C.Generator object at 0x127cd9170>
    
    >>> torch.nn.functional.dropout(ex, p=0.5, training=True)
    tensor([0., 0., 2., 2., 2., 2., 2., 0., 2., 0.])
    

    In general, though, you may want to keep resetting all of those random seeds, since there are many different sources of randomness when building neural nets in Python; a small helper like the sketch below can bundle them.
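
    As a rough sketch of that advice (the helper name seed_everything and the reproducibility check are mine, not part of the original answer; the cuDNN flags only matter on GPU and do not by themselves make dropout reproducible), you can wrap the reseeding in one function and call it right before each operation whose output should match:

    import random

    import numpy as np
    import torch

    def seed_everything(seed: int = 42) -> None:
        """Reset the common RNGs so the next stochastic ops are reproducible."""
        random.seed(seed)                 # Python's built-in RNG
        np.random.seed(seed)              # NumPy's global RNG
        torch.manual_seed(seed)           # CPU generator (recent PyTorch also seeds CUDA here)
        torch.cuda.manual_seed_all(seed)  # all CUDA devices; a no-op without a GPU
        torch.backends.cudnn.deterministic = True   # prefer deterministic cuDNN kernels
        torch.backends.cudnn.benchmark = False

    # Re-seed immediately before each call whose output should match.
    ex = torch.ones(10)
    seed_everything(42)
    first = torch.nn.functional.dropout(ex, p=0.5, training=True)
    seed_everything(42)
    second = torch.nn.functional.dropout(ex, p=0.5, training=True)
    assert torch.equal(first, second)

    If resetting the global RNG everywhere feels heavy-handed, an alternative (not part of the answer above) is to draw the dropout mask from a dedicated torch.Generator, e.g. torch.rand(ex.shape, generator=g), and apply it to the input yourself, which keeps the reproducibility local to the dropout.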