You can apply recurrent dropout to the standard LSTM or GRU layers in Keras by passing the recurrent_dropout argument to the layer.
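For reference, a minimal sketch of what this looks like with the standard layers (the input shape is made up for illustration):

```python
from keras.models import Sequential
from keras.layers import LSTM, Dense

model = Sequential()
# dropout masks the layer's inputs; recurrent_dropout masks the
# recurrent state carried between timesteps.
model.add(LSTM(64, dropout=0.2, recurrent_dropout=0.2,
               input_shape=(100, 32)))
model.add(Dense(1, activation='sigmoid'))
```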
CuDNNLSTM and CuDNNGRU are LSTM and GRU layers backed by NVIDIA's cuDNN library, so they require a CUDA-capable GPU. Their main advantage is speed: they can be roughly 10 times faster during training. However, they lack some of the conveniences of the standard LSTM and GRU layers, namely the ability to pass dropout or recurrent dropout values.
While we can insert regular Dropout layers between layers of the model, there appears to be no equivalent way to apply recurrent dropout.
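For example, this sketch shows the distinction (and, as the answer below warns, such between-layer dropout is not a substitute for recurrent dropout):

```python
from keras.models import Sequential
from keras.layers import CuDNNLSTM, Dropout, Dense

model = Sequential()
model.add(CuDNNLSTM(64, return_sequences=True, input_shape=(100, 32)))
model.add(Dropout(0.2))   # regular dropout between layers: possible
model.add(CuDNNLSTM(64))  # but the layer has no recurrent_dropout argument
model.add(Dense(1, activation='sigmoid'))

# CuDNNLSTM(64, recurrent_dropout=0.2)  # would raise a TypeError
```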
My question is therefore: how can one add recurrent dropout to CuDNNGRU or CuDNNLSTM in Keras?
I don't think we can, since it is not supported even at the lower level (i.e. in cuDNN itself). From François Chollet, the creator of Keras:
Recurrent dropout is not implemented in cuDNN RNN ops at the cuDNN level, so we can't have it in Keras.
The dropout option in the cuDNN API is not recurrent dropout (unlike what is in Keras), so it is basically useless (regular dropout doesn't work with RNNs).
Actually using such dropout in a stacked RNN will wreck training.