I am a beginner with TF/Keras/ML, and I am working on my first non-guided project. The idea is to create an RNN that can forecast the "Movement" of a given stock (which I currently define as the open price being higher or lower than the close price) for a given day. My plan is to train the RNN to predict a given day's price movement based on the actual price data and a whole bunch of technical indicators.
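As a rough illustration, the label is built something like this (the column names, the dummy values, and the 1-for-up convention are stand-ins for my real dataframe):

import numpy as np
import pandas as pd

# dummy OHLC-style data standing in for the real price history
df = pd.DataFrame({'Open':  [10.0, 11.2, 10.8],
                   'Close': [10.5, 10.9, 11.1]})

# binary "Movement" label: 1 if the day closed above its open, else 0
df['Movement'] = (df['Close'] > df['Open']).astype(int)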
That binary "Movement" column is what I want the model to output for each day.
I then attempt to create two TimeseriesGenerator objects, where the scaled raw data is passed in as the data and the ideal output shown above is passed in as the targets. I want the model to take in all of this information and output a category that tells me its predicted price movement. Additionally, I would like the model to forecast this price movement for future times.
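The generator setup looks roughly like this (the variables are dummy stand-ins for my real data; note that both are still Pandas DataFrames at this point):

import numpy as np
import pandas as pd
from tensorflow.keras.preprocessing.sequence import TimeseriesGenerator

# stand-ins for the real scaled features and 0/1 movement labels
scaled_train = pd.DataFrame(np.random.rand(500, 15))            # 500 days, 15 indicators
train_targets = pd.DataFrame(np.random.randint(0, 2, (500, 1))) # 0/1 movement per day

length = 20  # number of past days fed in per sample
train_generator = TimeseriesGenerator(scaled_train, train_targets,
                                      length=length, batch_size=32)
x, y = train_generator[0]  # this is where the error appears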
The actual model itself is fairly simple: a few LSTM layers that feed into dense layers, with a final output layer of one neuron that I want to use to determine the category.
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense, Dropout

model = Sequential()
# input_shape only needs to be given on the first layer
model.add(LSTM(2000, input_shape=(length, scaled_train.shape[1]), return_sequences=True))
model.add(Dropout(0.2))
model.add(LSTM(1000, return_sequences=True))
model.add(Dropout(0.2))
model.add(LSTM(500))
model.add(Dropout(0.2))
# model.add(Dense(1000))
model.add(Dense(250, activation='relu'))
# sigmoid squashes the single output to a probability, which is
# what binary_crossentropy expects
model.add(Dense(1, activation='sigmoid'))
model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])
The error I get when doing all this is a non-descriptive KeyError, which happens either when calling fit_generator on the model or when attempting to get a given input/output batch from the generator itself.
I think I have a misunderstanding as to what the TimeseriesGenerator is actually doing behind the scenes. What is the problem with my approach and how can I correct it to achieve my goal?
You are passing Pandas DataFrames. The KeyError is raised at the point of access: TimeseriesGenerator indexes its data and targets by integer position, and indexing a DataFrame with an array of integer positions is treated as a column lookup. Convert the DataFrames to NumPy arrays with the df.to_numpy() method (see discussion).
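A minimal sketch of the fix, assuming your features and labels live in DataFrames called scaled_train and train_targets as in your setup above:

from tensorflow.keras.preprocessing.sequence import TimeseriesGenerator

# .to_numpy() strips the Pandas index, so the generator's integer
# indexing hits row positions instead of column labels
train_data = scaled_train.to_numpy()
train_labels = train_targets.to_numpy()

train_generator = TimeseriesGenerator(train_data, train_labels,
                                      length=length, batch_size=32)
x, y = train_generator[0]  # no KeyError now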