Tags: machine-learning, neural-network, keras, dropout

Applying dropout to input layer in LSTM network (Keras)


Is it possible to apply dropout to the input layer of an LSTM network in Keras?

If this is my model:

model = Sequential()
model.add(LSTM(10, input_shape=(look_back, input_length), return_sequences=False))
model.add(Dense(1))

The goal is to achieve the effect of:

model = Sequential()
model.add(Dropout(0.5, input_shape=(look_back, input_length)))
model.add(LSTM(10, return_sequences=False))
model.add(Dense(1))

(Note that in a Sequential model the first layer must carry the input shape, so it moves from the LSTM onto the Dropout layer.)

Solution

  • You can use the Keras Functional API, in which your model would be written as:

    from keras.layers import Input, Dropout, LSTM, Dense
    from keras.models import Model

    inputs = Input(shape=(look_back, input_length))
    x = Dropout(0.5)(inputs)
    x = LSTM(10, return_sequences=False)(x)
    
    

    Then define your output layer, for example:

    predictions = Dense(10, activation='softmax')(x)
    

    and then build the model:

    model = Model(inputs=inputs, outputs=predictions)
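
What `Dropout(0.5)` does to the LSTM's inputs at training time is inverted dropout: it zeroes a random half of the input entries and scales the survivors by 1 / (1 - rate) so the expected activation is unchanged. A minimal NumPy sketch of that behaviour (the function name and toy shapes here are illustrative, not part of the Keras API):

```python
import numpy as np

def input_dropout(batch, rate=0.5, seed=None):
    """Inverted dropout as applied at training time:
    zero a fraction `rate` of the entries and scale the
    survivors by 1 / (1 - rate) to preserve the expected value."""
    rng = np.random.default_rng(seed)
    mask = rng.random(batch.shape) >= rate  # True = keep the entry
    return batch * mask / (1.0 - rate)

# A toy batch shaped (samples, look_back, input_length), as the LSTM expects.
batch = np.ones((2, 3, 4))
dropped = input_dropout(batch, rate=0.5, seed=0)

# Every entry is now either 0.0 (dropped) or 2.0 (kept and rescaled),
# so the expected value per entry is still 1.0.
assert set(np.unique(dropped)).issubset({0.0, 2.0})
```

At inference time Keras disables the layer entirely, so the inputs pass through unscaled; the division by `1 - rate` during training is what makes that possible.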