Tags: tensorflow, keras, pre-trained-model, mobilenet

How to add an extra layer on top of a pretrained MobileNetV3 model?


I have a model that uses a pre-trained MobileNetV3Large backbone and concatenates intermediate outputs like a U-Net architecture. That part works fine. But I want to combine model1 with this model2: model1 contains just a batch normalization and a dropout layer, which I want to add on top of model2. I have tried many things, but nothing works properly. Any ideas?

Model 2

from tensorflow.keras.applications import MobileNetV3Large
from tensorflow.keras.layers import (Input, Conv2D, Conv2DTranspose, Concatenate,
                                     BatchNormalization, Dropout)
from tensorflow.keras.models import Model, Sequential

inputs = Input((256, 256, 3))

# Pre-trained MobileNetV3Large encoder (no classification head)
mobilenet = MobileNetV3Large(include_top=False, weights="imagenet", input_tensor=inputs)

# Rename the two intermediate layers so they can be fetched by a stable name
mobilenet.layers[89]._name = "relu_3"
mobilenet.layers[196]._name = "relu_4"

l4 = mobilenet.get_layer("relu_3").output       # skip connection
b_layer = mobilenet.get_layer("relu_4").output  # bottleneck

# U-Net-style decoder: upsample the bottleneck and concatenate the skip connection
up = Conv2DTranspose(256, (2, 2), strides=2, padding="same")(b_layer)
up = Concatenate()([up, l4])
conv = Conv2D(256, (3, 3), activation="relu", padding="same")(up)
conv = Conv2D(256, (3, 3), activation="relu", padding="same")(conv)

# Output: single-channel sigmoid map
outputs = Conv2D(1, 1, padding="same", activation="sigmoid")(conv)

model2 = Model(inputs, outputs)
model2.summary()

Model 1

inputs = Input((256, 256, 3))

# Just BatchNormalization followed by Dropout; no Sequential wrapper is needed
x = BatchNormalization()(inputs)
x = Dropout(0.5)(x)
outputs1 = x

model1 = Model(inputs, outputs1)
model1.summary()

Solution

  • You can stack the models functionally by calling one model on the output of
    the other. With the input shapes defined above, model1 (which expects a
    (256, 256, 3) tensor) has to come first:

    inputs = Input((256, 256, 3))
    x = model1(inputs)              # BatchNormalization + Dropout
    outputs = model2(x)             # MobileNetV3 U-Net
    model3 = Model(inputs, outputs)

  • You can also stack the models sequentially. Because model2 branches at the
    Concatenate layer, its layers cannot be copied one by one into a Sequential
    model, but the whole models can be nested as layers:

    model3 = Sequential([model1, model2])
    model3.build((None, 256, 256, 3))

    Either variant can be verified with the sanity check sketched below.
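
As a quick sanity check (a minimal sketch, assuming the model1/model2 definitions above; the random batch is purely illustrative), push a dummy image through the combined model and inspect the result:

    import numpy as np

    # Illustrative smoke test: one random 256x256 RGB "image"
    dummy = np.random.rand(1, 256, 256, 3).astype("float32")
    pred = model3.predict(dummy)

    print(pred.shape)   # spatial size depends on which MobileNetV3 layers were tapped
    model3.summary()    # model1 and model2 show up as two nested sub-models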