When doing transfer learning in Keras 2, the application constructors accept both "input_shape" and "input_tensor" arguments. I have only ever used input_tensor and never input_shape. I think input_tensor alone is enough, and I don't know when input_shape should be used. How should the two be used?
I passed input_tensor and input_shape at the same time with different values, and only the value of input_tensor was used; input_shape was ignored.
from keras.applications.vgg16 import VGG16
from keras.layers import Input, Flatten, Dense
from keras.models import Sequential, Model

# Both input_tensor and input_shape are passed, with different sizes
vgg16_model = VGG16(include_top=False, weights='imagenet',
                    input_tensor=Input(shape=(150, 150, 3)),
                    input_shape=(224, 224, 3))

top_model = Sequential()
top_model.add(Flatten(input_shape=vgg16_model.output_shape[1:]))
top_model.add(Dense(256, activation='relu'))
top_model.add(Dense(1, activation='sigmoid'))

model = Model(inputs=vgg16_model.input, outputs=top_model(vgg16_model.output))
model.summary()
Layer (type)                 Output Shape              Param #
=================================================================
input_6 (InputLayer)         (None, 150, 150, 3)       0
_________________________________________________________________
block1_conv1 (Conv2D)        (None, 150, 150, 64)      1792
_________________________________________________________________
block1_conv2 (Conv2D)        (None, 150, 150, 64)      36928
_________________________________________________________________
block1_pool (MaxPooling2D)   (None, 75, 75, 64)        0
_________________________________________________________________
block2_conv......
I expected to get an error from this code, but there was none, and the model accepted inputs of shape (150, 150, 3); input_shape=(224, 224, 3) was simply ignored.
Can you give me a little help? Thanks.
The VGG16 code probably just forgets to check whether both arguments were passed. It doesn't make sense to pass both, of course.
Use input_shape when you want the model to create its own input layer automatically with that size. Use input_tensor when you already have a tensor that you want to be the input. You can pass any tensor as input_tensor; it is meant for using the outputs of other models/layers as the input of VGG16. Naturally you can also pass a plain Input tensor as you did; there is no reason for the code to complain, since it received a tensor.
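As a minimal sketch of the two cases (assuming Keras 2 with the standalone keras package; the rescaling Lambda below is just a stand-in for "some other tensor you already have"):

from keras.applications.vgg16 import VGG16
from keras.layers import Input, Lambda

# Case 1: input_shape -- VGG16 creates its own InputLayer with this size
model_a = VGG16(include_top=False, weights='imagenet', input_shape=(150, 150, 3))

# Case 2: input_tensor -- reuse an existing tensor as the model's input,
# e.g. the output of a preprocessing layer or of another model
raw = Input(shape=(150, 150, 3))
scaled = Lambda(lambda t: t / 255.0)(raw)  # placeholder preprocessing step
model_b = VGG16(include_top=False, weights='imagenet', input_tensor=scaled)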
The only issue is that the author of the code forgot the check "if both arguments are given, raise an error".
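Something along these lines inside the application constructor would do it (a hypothetical sketch, not actual Keras source code):

def _check_input_args(input_shape, input_tensor):
    # Hypothetical guard: reject the ambiguous case where both are given
    if input_shape is not None and input_tensor is not None:
        raise ValueError('Pass either input_shape or input_tensor, not both.')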