Tags: tensorflow, keras, neural-network, conv-neural-network, lstm

ValueError: Input 0 of layer lstm is incompatible with the layer: expected ndim=3, found ndim=4. Full shape received: (None, 32, 24, 7)


I am still figuring out how to resolve this error. Because I set the first layer's input shape to input_shape=(BATCH_SIZE, N_PAST, N_FEATURES), I get this error with both LSTM and GRU layers:

    model = tf.keras.models.Sequential([
        tf.keras.layers.LSTM(64, return_sequences=True, input_shape=(BATCH_SIZE,N_PAST, N_FEATURES)),
        tf.keras.layers.Dense(N_FEATURES)
    ])

    model.summary()

    optimizer =  tf.keras.optimizers.SGD(lr=1e-8, momentum=0.9)
    model.compile(
        loss="mse",
        optimizer=optimizer,
        metrics=["mae"]
    )
    model.fit(
        train_set, validation_data=valid_set,validation_steps=100, epochs=100
    )

Solution

  • There is never a need to give the model a fixed value for the batch-size dimension; TensorFlow handles that dimension dynamically based on the shape of the data it receives.

    So in the construction of the model, pass only the time and feature dimensions to the first layer:

    tf.keras.layers.LSTM(64, return_sequences=True, input_shape=(N_PAST, N_FEATURES))
    

    When executing summary(), this layer should report an input shape of (None, N_PAST, N_FEATURES).
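
    For reference, a minimal sketch of the corrected model. The values of N_PAST and N_FEATURES here (24 and 7) are taken from the shape in the error message; substitute whatever your windowing code actually produces:

    import tensorflow as tf

    N_PAST = 24      # time steps per window (assumed from the error: (None, 32, 24, 7))
    N_FEATURES = 7   # features per time step (assumed from the error)

    # input_shape deliberately omits the batch dimension; Keras adds it as None.
    model = tf.keras.models.Sequential([
        tf.keras.layers.LSTM(64, return_sequences=True,
                             input_shape=(N_PAST, N_FEATURES)),
        tf.keras.layers.Dense(N_FEATURES)
    ])

    model.summary()  # LSTM layer reports output shape (None, 24, 64)

    optimizer = tf.keras.optimizers.SGD(learning_rate=1e-8, momentum=0.9)
    model.compile(loss="mse", optimizer=optimizer, metrics=["mae"])

    Make sure the dataset you pass to fit() yields batches of shape (batch, N_PAST, N_FEATURES); if it yields an extra leading dimension (as in the error above, (None, 32, 24, 7)), the batching in the dataset pipeline needs to be fixed, not the model's input_shape.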