
Why invert predictions in an LSTM-RNN?


Here: https://machinelearningmastery.com/time-series-prediction-lstm-recurrent-neural-networks-python-keras/, under the section "LSTM Network for Regression", the author inverts the predictions inside the LSTM-RNN code. If I remove those lines, the resulting predictions are useless; the model does not predict anything meaningful. So my question is: what does the code that inverts the predictions actually do, and why is it used?


Solution

  • In time series forecasting, the raw data generally have large values. For example, in load forecasting, the load value at each moment is on the order of tens of thousands. To speed up the convergence of the model, we generally need to normalize the original data, for example by using MinMaxScaler to scale all values into the range [0, 1].

    It is worth noting that after normalizing the data, the values predicted by the model will also lie in the range [0, 1] (if the model converges well). These predictions cannot be used directly (a real load value cannot lie in [0, 1]), so we need to inverse-normalize the prediction results, that is, call inverse_transform.
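A minimal sketch of this round trip, using scikit-learn's MinMaxScaler. The raw load values and the "model predictions" below are made-up illustration data, not output from the tutorial's actual LSTM:

```python
import numpy as np
from sklearn.preprocessing import MinMaxScaler

# Hypothetical raw load values on the order of tens of thousands.
raw = np.array([[21000.0], [23500.0], [22000.0], [25000.0], [24000.0]])

# Fit the scaler and map everything into [0, 1] before training.
scaler = MinMaxScaler(feature_range=(0, 1))
scaled = scaler.fit_transform(raw)

# Pretend the trained LSTM emitted these predictions on the scaled axis.
scaled_pred = np.array([[0.6], [0.9]])

# inverse_transform maps them back to the original load scale:
# min + fraction * (max - min), e.g. 21000 + 0.6 * 4000 = 23400.
pred = scaler.inverse_transform(scaled_pred)
print(pred)  # [[23400.] [24600.]]
```

Without the inverse_transform step, the outputs would stay in [0, 1] and look meaningless when compared against the real-scale targets, which is exactly the symptom described in the question.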