Tags: python, tensorflow, keras, heatmap, transfer-learning

Access to last convolutional layer in transfer learning


I'm trying to get heatmaps from a computer-vision model that already classifies images correctly, but I'm running into some difficulties. This is the model summary:

model.summary()
Model: "model_4"

Layer (type)                 Output Shape              Param #   
=================================================================
input_9 (InputLayer)         [(None, 512, 512, 1)]     0         
_________________________________________________________________
conv2d_4 (Conv2D)            (None, 512, 512, 3)       30        
_________________________________________________________________
densenet121 (Functional)     (None, 1024)              7037504   
_________________________________________________________________
dense_4 (Dense)              (None, 100)               102500    
_________________________________________________________________
dropout_4 (Dropout)          (None, 100)               0         
_________________________________________________________________
predictions (Dense)          (None, 2)                 202       
=================================================================
Total params: 7,140,236
Trainable params: 7,056,588
Non-trainable params: 83,648

As part of the standard process to create a heatmap, I know I have to access the last convolutional layer in the model, which in this case is a layer inside the DenseNet121, but I cannot find a way to access the layers belonging to densenet121.

Right now I've been using the conv2d_4 layer to run some tests, but I feel it's not the right choice because that layer comes before all the transfer-learning work done by DenseNet.

I also looked up "Functional" layers in the official Keras documentation but couldn't find them, so I guess it's not a layer: it's the whole DenseNet model embedded there, but I cannot find a way to access it.

By the way, here is the model construction, since it may help to answer this:

from tensorflow.keras.applications.densenet import DenseNet121
from tensorflow.keras.layers import Conv2D, Dense, Dropout, Input
from tensorflow.keras.models import Model

num_classes = 2
input_tensor = Input(shape=(IMG_SIZE, IMG_SIZE, 1))
# Map the 1-channel input to the 3 channels DenseNet expects
x = Conv2D(3, (3, 3), padding='same')(input_tensor)
x = DenseNet121(include_top=False, classes=2, pooling="avg", weights="imagenet")(x)

x = Dense(100)(x)
x = Dropout(0.45)(x)
predictions = Dense(num_classes, activation='softmax', name="predictions")(x)
model = Model(inputs=input_tensor, outputs=predictions)

Solution

  • I found you can call .get_layer() twice to access layers inside the functional DenseNet model embedded in the "main" model.

    In this case I can use model.get_layer('densenet121').summary() to inspect all the layers inside the embedded model, and then access any of them with this code: model.get_layer('densenet121').get_layer('xxxxx')
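To show the nested access in action, here is a minimal, runnable sketch that rebuilds the question's model and computes a tiny Grad-CAM-style heatmap from the last convolutional activation. Several details are assumptions, not part of the original post: `weights=None` replaces the ImageNet weights so nothing is downloaded, `IMG_SIZE` is shrunk to 64 so it runs quickly, a random array stands in for a real image, class index 0 is chosen arbitrarily, and the name `'relu'` for the final activation inside tf.keras's DenseNet121 may differ across TensorFlow versions (check `model.get_layer('densenet121').summary()` to confirm).

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras.applications.densenet import DenseNet121
from tensorflow.keras.layers import Conv2D, Dense, Dropout, Input
from tensorflow.keras.models import Model

IMG_SIZE = 64          # reduced from 512 so the sketch runs quickly
num_classes = 2

# Rebuild the question's model (weights=None avoids the ImageNet download).
input_tensor = Input(shape=(IMG_SIZE, IMG_SIZE, 1))
x = Conv2D(3, (3, 3), padding='same')(input_tensor)
x = DenseNet121(include_top=False, pooling="avg", weights=None)(x)
x = Dense(100)(x)
x = Dropout(0.45)(x)
predictions = Dense(num_classes, activation='softmax', name="predictions")(x)
model = Model(inputs=input_tensor, outputs=predictions)

# Reach inside the embedded functional model with a second get_layer() call.
densenet = model.get_layer('densenet121')
last_conv = densenet.get_layer('relu')   # final activation before avg pooling
                                         # (name assumed; verify via summary())

# Split the graph at that layer: one model maps the DenseNet input to both
# the conv feature maps and the pooled features.
grad_model = Model(densenet.input, [last_conv.output, densenet.output])

img = np.random.rand(1, IMG_SIZE, IMG_SIZE, 1).astype('float32')
with tf.GradientTape() as tape:
    rgb = model.layers[1](img)            # the 1->3 channel Conv2D
    conv_maps, pooled = grad_model(rgb)   # conv_maps: (1, 2, 2, 1024)
    y = pooled
    for layer in model.layers[3:]:        # Dense -> Dropout -> predictions
        y = layer(y)
    class_score = y[:, 0]                 # class 0, chosen arbitrarily

# Grad-CAM: weight each feature map by the mean gradient of the class score,
# sum over channels, and keep only the positive contributions.
grads = tape.gradient(class_score, conv_maps)
channel_weights = tf.reduce_mean(grads, axis=(1, 2))          # (1, 1024)
cam = tf.nn.relu(tf.reduce_sum(
    conv_maps * channel_weights[:, None, None, :], axis=-1))  # (1, 2, 2)
```

For a real heatmap you would upscale `cam` back to the input resolution (e.g. with `tf.image.resize`) and overlay it on the image; the coarse 2x2 map here is just because of the tiny 64x64 input.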