Tags: tensorflow, keras, deep-learning, neural-network

tf.keras.Sequential.pop() can't remove the last layer


I have written a very basic script to test the pop() method. Even though both models use the same layers, the two models are independent of each other. Here is the code:

import tensorflow as tf

if __name__ == "__main__":
    x = tf.ones((3, 3))
    x = tf.cast(x, tf.float32)

    ####################################################################
    # Three layers, shared by both models below
    layer1 = tf.keras.layers.Dense(2, activation="relu", name="nlayer1")
    layer2 = tf.keras.layers.Dense(3, activation="relu", name="nlayer2")
    layer3 = tf.keras.layers.Dense(4, name="nlayer3")

    ####################################################################
    model_1 = tf.keras.Sequential([layer1, layer2, layer3])
    y1 = model_1(x)
    ####################################################################

    ####################################################################
    model_2 = tf.keras.Sequential(name="model_2")
    model_2.add(layer1)
    model_2.add(layer2)
    model_2.add(layer3)
    y2 = model_2(x)

    ####################################################################
    print('y1= ', y1)
    print('y2= ', y2)
    input('Press Enter to continue...')
    ####################################################################

    print('<<<<<<<<<<<<<<<<<<<<<<<The summary of model_2>>>>>>>>>>>>>>>>>>>>>>>>')
    model_2.summary()
    print('*****************Removing the last layer of model_2******************')
    model_2.pop()
    model_2.summary()

What's wrong with trying to remove a layer from a model that doesn't depend on another model? The error that gets thrown looks like this:

ValueError: Graph disconnected: cannot obtain value for tensor KerasTensor(type_spec=TensorSpec(shape=(3, 3), dtype=tf.float32, name='nlayer1_input'), name='nlayer1_input', description="created by layer 'nlayer1_input'") at layer "nlayer1". The following previous layers were accessed without issue: []

Does anyone have any idea what may be wrong?

Kind Regards,



Solution

  • You can't share layers between models and expect pop() to remove one successfully: the shared layers keep the graph connections they acquired in model_1, so when pop() rebuilds model_2 it traces back to model_1's input tensor ('nlayer1_input' in the error) and reports the graph as disconnected. You have to add copies of the layers to your second model:

    ####################################################################
    model_2 = tf.keras.Sequential(name="model_2")
    model_2.add(tf.keras.layers.Dense.from_config(layer1.get_config()))
    model_2.add(tf.keras.layers.Dense.from_config(layer2.get_config()))
    model_2.add(tf.keras.layers.Dense.from_config(layer3.get_config()))
    y2 = model_2(x)
    

    Output after pop

    >>> model_2.summary()
    Model: "model_2"
    _________________________________________________________________
     Layer (type)                Output Shape              Param #   
    =================================================================
     nlayer1 (Dense)             (3, 2)                    8         
                                                                     
     nlayer2 (Dense)             (3, 3)                    9         
                                                                     
    =================================================================
    Total params: 17
    Trainable params: 17
    Non-trainable params: 0
    _________________________________________________________________
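
    Two caveats worth adding, beyond the original answer. First, from_config copies only a layer's configuration, not its weights, so the layers in the rebuilt model_2 start from a fresh random initialization. If you also want to carry over the shared layers' current weights, a minimal sketch (reusing layer1..layer3 and x from the question, and assuming model_1 has already been called so the source layers are built) is to transfer them with get_weights/set_weights:

    # Sketch, not part of the original answer: clone the layers by config,
    # then copy the weights over once the clones have been built.
    model_2 = tf.keras.Sequential(name="model_2")
    for src in (layer1, layer2, layer3):
        model_2.add(tf.keras.layers.Dense.from_config(src.get_config()))
    y2 = model_2(x)  # building the model creates the clones' weight tensors
    for src, dst in zip((layer1, layer2, layer3), model_2.layers):
        dst.set_weights(src.get_weights())
    model_2.pop()    # succeeds: model_2 shares no layers with model_1

    Second, tf.keras.models.clone_model builds a structural copy of an entire model out of brand-new layers, which can then be popped safely; a minimal sketch, assuming model_1 and x as defined in the question:

    # Sketch: clone the whole model instead of copying layers one by one.
    model_3 = tf.keras.models.clone_model(model_1)
    model_3(x)         # build the clone
    model_3.pop()      # succeeds: the clone's layers belong only to model_3
    model_3.summary()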