Tags: python, keras, keras-layer

What is the right way to freeze a convolutional layer in Keras?


I want to freeze a single convolutional layer in my model, and I did it by passing trainable=False to the convolutional layer, similar to how it is done for a Dense layer: Dense(32, trainable=False).

from keras.layers import Dense, Dropout, Activation, Flatten
from keras.models import Sequential
from keras.layers.normalization import BatchNormalization
from keras.layers import Conv2D, MaxPooling2D, ZeroPadding2D, GlobalAveragePooling2D

model = Sequential()
model.add(Conv2D(32, (3, 3), input_shape=(28,28,1)))
model.add(BatchNormalization(axis=-1))
model.add(Activation('relu'))
model.add(Conv2D(32, (3, 3)))
model.add(BatchNormalization(axis=-1))
model.add(Activation('relu'))
model.add(MaxPooling2D(pool_size=(2,2)))

model.add(Conv2D(64, (3, 3), trainable=False))  # frozen layer
model.add(BatchNormalization(axis=-1))
model.add(Activation('relu'))
model.add(Conv2D(64, (3, 3)))
model.add(BatchNormalization(axis=-1))
model.add(Activation('relu'))
model.add(MaxPooling2D(pool_size=(2,2)))
model.add(Flatten())

# Fully connected layer
model.add(Dense(512))
model.add(BatchNormalization())
model.add(Activation('relu'))
model.add(Dropout(0.2))
model.add(Dense(10))

model.add(Activation('softmax'))
model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])

The model seems to compile without any error, but when I checked the Keras docs, Conv2D doesn't seem to have a parameter named trainable. Is my way of freezing a convolutional layer valid, and what is happening here?


Solution

  • Yes, it's valid and correct.

    The trainable parameter is handled by the base Layer class, which every Keras layer (including Conv2D) inherits from. Setting trainable=False excludes that layer's weights from the model's trainable parameters, so they are not updated by gradient descent during training.
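
    As a quick sanity check (a minimal sketch assuming the Sequential model built above, where the frozen Conv2D ends up at index 7 of model.layers), you can confirm that the frozen layer's weights are excluded from training:

    # The Conv2D added with trainable=False is the 8th layer (index 7) of the
    # Sequential model defined in the question.
    frozen_layer = model.layers[7]

    print(frozen_layer.trainable)                    # False
    print(len(frozen_layer.trainable_weights))       # 0
    print(len(frozen_layer.non_trainable_weights))   # 2 (kernel + bias)

    # The frozen kernel and bias are also missing from the model-level list
    # that the optimizer uses:
    print(len(model.trainable_weights), len(model.non_trainable_weights))

    # You can equally freeze a layer after the model has been built by setting
    # the attribute directly, but the change only takes effect once you
    # compile the model again:
    model.layers[0].trainable = False
    model.compile(loss='categorical_crossentropy', optimizer='adam',
                  metrics=['accuracy'])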