keras · lstm · keras-layer

Is there a relation between the number of LSTM units and the length of the sequence to be trained?


I have programmed a Keras neural network to train on sequences. Does the choice of the number of LSTM units in Keras depend on the length of the sequence?


Solution

  • There is no set rule for determining the number of units from your input; the number of LSTM units and the sequence length are independent hyperparameters.

    More units make the model more expressive. Generally speaking, a longer look-back period means more timesteps per sample to learn from, so a more complex model is often better suited to fitting your data.

    Personally, I like to use the number of timesteps in each sample as my number of units, and I decrease this number as I move deeper into the network.
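To make this concrete, here is a minimal sketch of the idea above: the number of units is a free choice (here following the heuristic of starting near the timestep count and shrinking in deeper layers), and the same network even accepts sequences of different lengths. The shapes, unit counts, and data below are illustrative assumptions, not values from the question.

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

features = 4  # assumed number of features per timestep

# Variable-length input: the time dimension is None, so the unit counts
# below clearly cannot depend on the sequence length.
model = keras.Sequential([
    layers.Input(shape=(None, features)),
    layers.LSTM(32, return_sequences=True),  # first layer: more units
    layers.LSTM(16),                         # deeper layer: fewer units
    layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")

# Feed the same model sequences of two different lengths.
short_batch = np.random.rand(8, 30, features).astype("float32")
long_batch = np.random.rand(8, 50, features).astype("float32")
print(model.predict(short_batch, verbose=0).shape)  # (8, 1)
print(model.predict(long_batch, verbose=0).shape)   # (8, 1)
```

Both batches produce the same output shape, which is the point: units control model capacity, not sequence length.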