Tags: python, keras, keras-layer

What is the difference between dropout layer and dropout parameter in any keras layer


What is the difference between the Dropout layer and the dropout and recurrent_dropout parameters in Keras? Do they all serve the same purpose?

Example:

model.add(Dropout(0.2))  # layer
model.add(LSTM(100, dropout=0.2, recurrent_dropout=0.2))  # parameters

Solution

  • Yes, they serve the same purpose. The dropout parameter applies dropout to the layer's inputs, before that layer's linear transformation (multiplication by the weights and addition of the bias). A Dropout layer is more flexible: you can place it anywhere in the model, for example before an activation layer.

    recurrent_dropout has the same functionality but works in a different direction: whereas ordinary dropout is applied between a layer's input and output, recurrent_dropout is applied to the recurrent state passed between timesteps.
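To make the distinction concrete, here is a minimal NumPy sketch of a toy recurrent step (not the actual Keras implementation). It uses inverted dropout, where surviving units are scaled by 1/(1-rate) so the expected activation is unchanged; the `dropout_mask` helper and the single-layer recurrence are illustrative assumptions. The input mask plays the role of the `dropout` parameter (masking x_t) and the recurrent mask plays the role of `recurrent_dropout` (masking h_{t-1}); like Keras, the same masks are reused at every timestep.

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout_mask(shape, rate, rng):
    """Inverted dropout: zero units with probability `rate`,
    scale survivors by 1/(1-rate) so the expected sum is unchanged."""
    keep = rng.random(shape) >= rate
    return keep.astype(float) / (1.0 - rate)

# Toy recurrent step: h_t = tanh(W x_t + U h_{t-1})
units, features, timesteps = 4, 3, 5
W = rng.normal(size=(units, features))
U = rng.normal(size=(units, units))
x = rng.normal(size=(timesteps, features))

rate = 0.2
# `dropout` masks the inputs x_t; `recurrent_dropout` masks h_{t-1}.
input_mask = dropout_mask((features,), rate, rng)      # between input and output
recurrent_mask = dropout_mask((units,), rate, rng)     # between timesteps

h = np.zeros(units)
for t in range(timesteps):
    h = np.tanh(W @ (x[t] * input_mask) + U @ (h * recurrent_mask))

print(h.shape)
```

A standalone `Dropout(0.2)` layer before the LSTM would correspond to masking x_t only; it cannot touch the hidden-to-hidden connection, which is exactly what `recurrent_dropout` is for.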