Challenge
How do I define the `input_shape` of a `TimeDistributed` layer when it raises `received: (None, 1)`? How should the `(None, 1)` in the error message be read?
Basic Code
n_features = 1  # values per time step
n_seq = 2       # subsequences per sample
n_steps = 2     # time steps per subsequence
# reshape to (samples, subsequences, steps, features)
X = X.reshape((X.shape[0], n_seq, n_steps, n_features))
model.add(TimeDistributed(Conv1D(64, 1, activation='relu'), input_shape=(n_seq, n_steps, n_features)))
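As a sketch of what the reshape produces (pure NumPy; the toy series of 8 values is a hypothetical stand-in for the real `X`):

```python
import numpy as np

# Hypothetical input: 2 samples, each a window of 4 values.
X = np.arange(8, dtype="float32").reshape((2, 4))

n_features = 1
n_seq = 2
n_steps = 2

# Split each 4-step window into 2 subsequences of 2 steps,
# each step carrying a single feature.
X = X.reshape((X.shape[0], n_seq, n_steps, n_features))

print(X.shape)  # (2, 2, 2, 1)
```

Each sample is now a short "video" of `n_seq` frames, where every frame is a `(n_steps, n_features)` array that the wrapped `Conv1D` can consume.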
Input Shape Error
--> 175 '`TimeDistributed` Layer should be passed an `input_shape ` '
176 f'with at least 3 dimensions, received: {input_shape}')
177 # Don't enforce the batch or time dimension.
ValueError: `TimeDistributed` Layer should be passed an `input_shape ` with at least 3 dimensions, received: (None, 1)
Imports
from keras.models import Sequential
from keras.layers import LSTM
from keras.layers import Dense
from keras.layers import Flatten
from keras.layers import TimeDistributed
# `keras.layers.convolutional` was removed in newer Keras; import from keras.layers instead
from keras.layers import Conv1D
from keras.layers import MaxPooling1D
Code Ref
This code comes from an online sample on RNNs.
Solved
The fix was to pass `None` as the first entry of `input_shape`, leaving the sequence (time) dimension unspecified so `TimeDistributed` does not fix the number of subsequences:

input_shape=(None, n_steps, n_features))
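Why a 3-D `input_shape` is required: `TimeDistributed` iterates over the first (time) axis and hands each slice to the wrapped layer, so the shape must hold at least the time axis plus the inner layer's own 2-D input `(n_steps, n_features)`. A `(None, 1)` shape leaves nothing for `Conv1D` to convolve over. A pure-NumPy sketch of that semantics, with a hypothetical `inner_layer` standing in for `Conv1D`:

```python
import numpy as np

def inner_layer(subseq):
    # Stand-in for Conv1D: reduces one (n_steps, n_features) slice
    # to a single value, purely for illustration.
    return subseq.sum()

# (samples, n_seq, n_steps, n_features) -- the shape after the reshape above
X = np.arange(8, dtype="float32").reshape((2, 2, 2, 1))

# TimeDistributed walks axis 1 (n_seq) and applies the wrapped
# layer to each (n_steps, n_features) slice independently.
out = np.array([[inner_layer(sub) for sub in sample] for sample in X])

print(out.shape)  # (2, 2): one result per subsequence per sample
```

Because the loop length can vary, the time axis can be declared as `None`, which is exactly what `input_shape=(None, n_steps, n_features)` does.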