Tags: python, tensorflow, keras, lstm, recurrent-neural-network

Error in reverse scaling outputs predicted by a LSTM RNN


I used an LSTM model to predict the future open price of a stock. The data was preprocessed and the model was built and trained without any errors, and I used StandardScaler to scale the values in the DataFrame. But when retrieving predictions from the model and calling scaler.inverse_transform() on them, I got the following error.

ValueError: non-broadcastable output operand with shape (59,1) doesn't match the broadcast shape (59,4)

The complete code is a Jupyter notebook too large to show directly, so I have uploaded it to a git repository.
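The error can be reproduced in a few lines (a minimal sketch; the array contents are random stand-ins, only the shapes match the question):

```python
import numpy as np
from sklearn.preprocessing import StandardScaler

# Fit the scaler on a 4-feature DataFrame, as in the question (251 rows, 4 columns)
scaler = StandardScaler()
scaler.fit(np.random.rand(251, 4))

# The model's predictions have only 1 column
preds = np.random.rand(59, 1)

try:
    scaler.inverse_transform(preds)  # raises ValueError
except ValueError as e:
    print(e)  # exact message varies with the scikit-learn version
```

Internally, `inverse_transform` multiplies by `scale_` and adds `mean_`, both of shape `(4,)`, in place; NumPy cannot broadcast that result back into a `(59, 1)` array, hence the error.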


Solution

  • This happens because the model predicts output with shape (59, 1), but the scaler was fit on a (251, 4) DataFrame. scaler.inverse_transform only accepts input with the same number of features the scaler was fit on. Either fit a new scaler on data with the shape of your y values, or change the model's final Dense layer to output 4 values instead of 1.

    Old Code - Shape (n,1)

    trainY.append(df_for_training_scaled[i + n_future - 1:i + n_future, 0])

    Updated Code - Shape (n,4) - use all 4 outputs

    trainY.append(df_for_training_scaled[i + n_future - 1:i + n_future,:])
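If you would rather keep the model's single-column output, the first option above can be sketched like this (variable names such as `df_values` and `y_scaler` are illustrative; the real notebook uses the DataFrame from the repository):

```python
import numpy as np
from sklearn.preprocessing import StandardScaler

df_values = np.random.rand(251, 4)      # stand-in for the unscaled DataFrame values
scaler = StandardScaler().fit(df_values)

preds_scaled = np.random.rand(59, 1)    # model output, shape (59, 1)

# Fix A: fit a separate scaler on just the target (open-price) column,
# so inverse_transform expects exactly 1 feature
y_scaler = StandardScaler().fit(df_values[:, 0:1])
preds = y_scaler.inverse_transform(preds_scaled)   # shape (59, 1), no error

# Fix B (alternative): pad the predictions to 4 columns, inverse-transform
# with the original scaler, then keep only the target column.
# This assumes the target is column 0 of the DataFrame.
padded = np.concatenate([preds_scaled, np.zeros((59, 3))], axis=1)
preds_b = scaler.inverse_transform(padded)[:, 0:1]  # shape (59, 1)
```

Fix A is usually the cleaner choice: it keeps the model unchanged and ties the inverse scaling to the one column you are actually predicting.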