Tags: python, keras, keras-layer

Popping upper layers in keras


Suppose I have the following pretrained model:

from keras.models import Sequential
from keras.layers import Dense

model = Sequential()
model.add(Dense(3, activation='relu', input_dim=5))
model.add(Dense(1))
model.compile(loss='mse', optimizer='adam')

When I run the following data (X) through it, I get the shape I expect:

import numpy as np
X = np.random.rand(20, 5)
model.predict(X).shape

giving the shape (20, 1).

However, for transfer learning purposes I wish to pop the top layer and run the same data through the remaining layers.

model.layers.pop()
model.summary()
>>>
Layer (type)                 Output Shape              Param #   
=================================================================
dense_3 (Dense)              (None, 3)                 18        
=================================================================
Total params: 18
Trainable params: 18
Non-trainable params: 0

Judging by model.summary(), model.layers.pop() seems to have removed the top layer. However, model.predict(X).shape still returns (20, 1) and not (20, 3) as expected.
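A quick check suggests why (this is my reading of the behaviour): pop() only shortens the Python list of layers, while the output tensor that predict() runs through is left untouched:

print(len(model.layers))  # 1 -- the Dense(1) layer is gone from the list
print(model.output)       # but this is still the Dense(1) layer's output tensor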

Question: How do I correctly pop off the last few layers? This is an artificial example; in my actual case I need to delete the last 3 layers.


Solution

  • Found the answer here: https://github.com/keras-team/keras/issues/8909

    The following is the approach that works. Unfortunately a second model has to be created, and, as noted in that GitHub issue, @Eric's answer no longer seems to work.

    from keras.models import Model

    model.layers.pop()                                    # drop Dense(1) from the layer list
    model2 = Model(model.input, model.layers[-1].output)  # rebuild a model ending at Dense(3)
    model2.predict(X).shape                               # now (20, 3)
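
    For the real case of deleting the last 3 layers, the same pattern should generalize without calling pop() at all: pick the output tensor of the last layer you want to keep and rebuild from there. A minimal sketch, assuming a model with more than 3 layers (the index -4 marks the last kept layer and is illustrative):

    from keras.models import Model

    # Everything after model.layers[-4] (i.e. the final 3 layers) is dropped
    truncated = Model(model.input, model.layers[-4].output)
    truncated.predict(X)  # outputs of the 4th-from-last layer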