Recently, I tried to use “tf.contrib.rnn.LayerNormBasicLSTMCell”, but I don't know what the argument “dropout_keep_prob” means.
Then I looked at the documentation provided by Google. Their explanation is “unit Tensor or float between 0 and 1 representing the recurrent dropout probability value. If float and 1.0, no dropout will be applied.”
But I don't know the difference between “recurrent dropout” and “dropout”.
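For reference, this is roughly how I construct the cell (a minimal sketch assuming TensorFlow 1.x, where tf.contrib is still available; num_units=128 and the keep probability are just example values):

    import tensorflow as tf  # TensorFlow 1.x, where tf.contrib exists

    # Layer-normalized LSTM cell with recurrent dropout.
    # dropout_keep_prob=0.9 keeps each unit of the candidate cell update
    # with probability 0.9 during training (1.0 disables dropout).
    cell = tf.contrib.rnn.LayerNormBasicLSTMCell(
        num_units=128,
        dropout_keep_prob=0.9)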
Recurrent dropout is a regularization method for recurrent neural networks. Dropout is applied to the updates of the LSTM memory cells, i.e., it drops out the input/update gate of the LSTM. For more information you can refer here.
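To make the contrast concrete, here is a minimal sketch (again assuming TensorFlow 1.x; the hyperparameter values are arbitrary) of recurrent dropout applied inside the cell versus standard dropout applied around the cell with DropoutWrapper:

    import tensorflow as tf  # TensorFlow 1.x

    # Recurrent dropout: applied inside the cell to the candidate update
    # (the new information written to the memory cell c_t), so the
    # recurrent computation itself is regularized.
    recurrent_dropout_cell = tf.contrib.rnn.LayerNormBasicLSTMCell(
        num_units=128, dropout_keep_prob=0.9)

    # Standard dropout: applied outside the cell to the inputs and/or
    # outputs at each time step; the memory cell c_t is left untouched.
    standard_dropout_cell = tf.nn.rnn_cell.DropoutWrapper(
        tf.nn.rnn_cell.BasicLSTMCell(num_units=128),
        input_keep_prob=0.9,
        output_keep_prob=0.9)

In short: standard dropout zeroes out activations flowing between layers (inputs/outputs of the cell at each time step), while recurrent dropout zeroes out part of what gets written into the cell state, which is why it is safe to apply across time steps without erasing the memory.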