I am using Keras with TensorFlow as the backend and get an incompatibility error when stacking LSTM layers:
model = Sequential()
model.add(LSTM(64, input_dim = 1))
model.add(Dropout(0.2))
model.add(LSTM(16))
The following error is raised:
Traceback (most recent call last):
File "train_lstm_model.py", line 36, in <module>
model.add(LSTM(16))
File "/home/***/anaconda2/lib/python2.7/site-packages/keras/models.py", line 332, in add
output_tensor = layer(self.outputs[0])
File "/home/***/anaconda2/lib/python2.7/site-packages/keras/engine/topology.py", line 529, in __call__
self.assert_input_compatibility(x)
File "/home/***/anaconda2/lib/python2.7/site-packages/keras/engine/topology.py", line 469, in assert_input_compatibility
str(K.ndim(x)))
ValueError: Input 0 is incompatible with layer lstm_2: expected ndim=3, found ndim=2
How can I fix this problem?
Keras version: 1.2.2, TensorFlow version: 0.12
An LSTM layer expects each sample to have shape (len_of_sequences, nb_of_features). Including the batch dimension, the data actually fed to the layer therefore has shape (batch_size, len_of_sequences, nb_of_features), i.e. ndim=3. The error is raised by your second LSTM layer (lstm_2): the first LSTM uses the default return_sequences=False, so it outputs only its last hidden state, a 2-D tensor of shape (batch_size, 64). That is the ndim=3 vs ndim=2 mismatch in the message.
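The shape behaviour can be illustrated with plain NumPy (a shape-only sketch, not the actual LSTM math; the batch size, sequence length, and unit counts below are invented for the example):

```python
import numpy as np

def toy_rnn(x, units, return_sequences=False):
    """Shape-only stand-in for an RNN layer: runs a trivial
    recurrence over the time axis of a (batch, time, features) input."""
    batch_size, timesteps, nb_of_features = x.shape
    h = np.zeros((batch_size, units))  # hidden state
    outputs = []
    for t in range(timesteps):
        # trivial "recurrence": mix the current timestep into the state
        h = np.tanh(h + x[:, t, :].sum(axis=1, keepdims=True))
        outputs.append(h)
    if return_sequences:
        return np.stack(outputs, axis=1)  # (batch, time, units) -> ndim=3
    return h                              # (batch, units)       -> ndim=2

x = np.random.rand(8, 10, 1)  # (batch_size, len_of_sequences, nb_of_features)
print(toy_rnn(x, 64).shape)                         # (8, 64)     -- what lstm_2 received
print(toy_rnn(x, 64, return_sequences=True).shape)  # (8, 10, 64) -- what lstm_2 needs
```

With return_sequences=False the time axis disappears, which is exactly why the second LSTM sees a 2-D tensor.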
To make the first LSTM layer return the full sequence (so that the second LSTM receives 3-D input), change its definition to:
model.add(LSTM(64, input_shape=(len_of_sequences, nb_of_features), return_sequences=True))
or:
model.add(LSTM(64, input_dim=nb_of_features, input_length=len_of_sequences, return_sequences=True))
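Putting it together, a stacked version of your model could look like this (a sketch only; len_of_sequences = 10 is an invented value, since the question does not state the real sequence length):

```python
from keras.models import Sequential
from keras.layers import LSTM, Dropout

len_of_sequences = 10  # hypothetical: the real sequence length is not given
nb_of_features = 1

model = Sequential()
# return_sequences=True keeps the time axis, so the next LSTM gets 3-D input
model.add(LSTM(64, input_shape=(len_of_sequences, nb_of_features),
               return_sequences=True))
model.add(Dropout(0.2))
# the last LSTM keeps the default return_sequences=False -> 2-D output
model.add(LSTM(16))
model.summary()
```

Only the layers whose output feeds another recurrent layer need return_sequences=True; the final LSTM can keep the default.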