I am building a CNN in Keras (with a TensorFlow backend) that has the following structure:
from tensorflow.keras.layers import (MaxPooling2D, LayerNormalization, LeakyReLU,
                                     Dropout, Conv2D, Softmax, Reshape, Dense)
from tensorflow.keras.models import Model, load_model

# Create the Second Model in Ensemble
def createModel(self, model_input, n_outputs, first_session=True):
    if first_session != True:
        model = load_model('ideal_model.hdf5')
        return model

    # Define Input Layer
    inputs = model_input
    # Define Max Pooling Layer
    conv = MaxPooling2D(pool_size=(3, 3), padding='same')(inputs)
    # Define Layer Normalization Layer (applied to the pooled output)
    conv = LayerNormalization()(conv)
    # Define Leaky ReLU Layer
    conv = LeakyReLU(alpha=0.1)(conv)
    # Define Dropout Layer
    conv = Dropout(0.2)(conv)
    # Define First Conv2D Layer
    conv = Conv2D(filters=64,
                  kernel_size=(3, 3),
                  activation='relu',
                  padding='same',
                  strides=(3, 2))(conv)
    conv = Dropout(0.3)(conv)
    # Define Second Conv2D Layer
    conv = Conv2D(filters=32,
                  kernel_size=(5, 5),
                  activation='relu',
                  padding='same',
                  strides=(3, 2))(conv)
    conv = Dropout(0.3)(conv)
    # Define Softmax Layer
    conv = Softmax(axis=1)(conv)
    # Define Reshape Layer
    conv = Reshape((conv._keras_shape[1]*conv._keras_shape[2]*conv._keras_shape[3],))(conv)
    # Define Sigmoid Dense Layer
    conv = Dense(64, activation='sigmoid')(conv)
    # Define Output Layer
    outputs = Dense(n_outputs, activation='softmax')(conv)
    # Create Model
    model = Model(inputs, outputs)
    model.summary()

    return model
Currently I am running into trouble with the Reshape layer that flattens the tensor: I would like to avoid hard-coding the dimensions of the previous layer's output into the Reshape layer, if possible. (Note: Flatten layers are not supported by the kernels on the FPGA where the program will ultimately run, so I cannot use them.) The above code produces the following error:
AttributeError: 'Tensor' object has no attribute '_keras_shape'
This occurs because I had to import the layers from tensorflow.keras.layers (as opposed to keras.layers) due to the LayerNormalization layer at the beginning of the model architecture.
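For what it's worth, the attribute really does seem to be specific to standalone Keras; a minimal check along these lines (with a placeholder input size, purely for illustration) shows what I mean:

from tensorflow.keras.layers import Input, Conv2D

x = Input(shape=(64, 64, 1))       # placeholder input size, just for illustration
y = Conv2D(8, (3, 3), padding='same')(x)
print(hasattr(y, '_keras_shape'))  # prints False for tensors built from tensorflow.keras.layers
print(y.shape)                     # prints something like (None, 64, 64, 8)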
So, I was wondering whether there is a way to get the output shape of a specific tensorflow.keras.layers layer before compiling the model, for example via conv.shape or maybe tf.shape(conv).
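To make the intent concrete, this is the kind of thing I am hoping is valid (just a sketch; I have not confirmed that the static shape is fully defined at this point, and I suspect tf.shape(conv) gives a dynamic, run-time shape tensor rather than the plain integers that Reshape's target_shape expects):

# Sketch: use the static shape attached to the symbolic tensor instead of ._keras_shape
shape = conv.shape  # e.g. TensorShape([None, h, w, c])
conv = Reshape((int(shape[1]) * int(shape[2]) * int(shape[3]),))(conv)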