python · deep-learning · pytorch

Freezing Majority of Layers Except Specific Layers in PyTorch


I have a model that I trained on a dataset. I then changed its last two layers and want to fine-tune it. In other words, I want to freeze all the layers of the model except those two layers (self.conv_6 and self.sigmoid) and train the model on that dataset again.

This is my model:

import torch.nn as nn

class model(nn.Module):
    def __init__(self, pretrained=False):
        super(model, self).__init__()

        self.conv_1 = nn.Conv3d(1024, 1024, kernel_size=(3,1,1), stride=(2,1,1), padding=(1,0,0))
        self.conv_2 = nn.Conv3d(1024, 1024, kernel_size=(3,1,1), stride=(2,1,1), padding=(1,0,0))
        self.conv_3 = nn.Conv3d(1024, 1024, kernel_size=(3,1,1), stride=(2,1,1), padding=(1,0,0))
        self.conv_4 = nn.Conv3d(1024, 1024, kernel_size=(3,1,1), stride=(2,1,1), padding=(1,0,0))
        self.conv_5 = nn.Conv3d(1024, 1024, kernel_size=(3,1,1), stride=(2,1,1), padding=(1,0,0))
        self.conv_6 = nn.Conv3d(1024, 1024, kernel_size=(3,1,1), stride=(2,1,1), padding=(1,0,0))

        self.sigmoid = nn.Sigmoid()

    def forward(self, x):
        x = self.conv_1(x)
        x = self.conv_2(x)
        x = self.conv_3(x)
        x = self.conv_4(x)
        x = self.conv_5(x)

        x = self.conv_6(x)
        y = self.sigmoid(x)
        return y

How can I freeze all the layers except those two?


Solution

  • You can freeze the whole module and then unfreeze the layers you want, all with requires_grad_:

    >>> f = model()
    >>> f.requires_grad_(False)
    >>> f.conv_6.requires_grad_(True)
    

    By the way, nn.Sigmoid has no parameters, so there is nothing to freeze or unfreeze for it.
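
    If you want to confirm which parameters will actually be updated and build the optimizer accordingly, here is a minimal sketch (the optimizer and learning rate are arbitrary choices for illustration, not part of the original answer):

    >>> import torch
    >>> f = model()
    >>> f.requires_grad_(False)        # freeze every parameter in the module
    >>> f.conv_6.requires_grad_(True)  # unfreeze only the last conv layer
    >>> # only conv_6's parameters remain trainable
    >>> [n for n, p in f.named_parameters() if p.requires_grad]
    ['conv_6.weight', 'conv_6.bias']
    >>> # pass just the trainable parameters to the optimizer
    >>> optimizer = torch.optim.Adam(
    ...     (p for p in f.parameters() if p.requires_grad), lr=1e-4)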