Apologies if I'm missing something obvious; I have an LSTM-related question.
In Keras, the LSTM layer (https://keras.io/layers/recurrent/#lstm) has two dropout-related arguments:
dropout: Float between 0 and 1. Fraction of the units to drop for the linear transformation of the inputs.
recurrent_dropout: Float between 0 and 1. Fraction of the units to drop for the linear transformation of the recurrent state.
If my understanding is correct, the Gluon LSTM `dropout` argument doesn't correspond to either of these:
dropout (float, default 0) – If non-zero, introduces a dropout layer on the outputs of each RNN layer except the last layer.
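To make the question concrete, here is a minimal numpy sketch of where I believe each of the three arguments applies. This is my own simplified illustration (a plain tanh RNN step, no gates), not actual Keras or Gluon code; the per-sequence mask reuse mimics how I understand Keras applies its dropout across time steps:

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout_mask(shape, rate):
    # Inverted dropout: zero units with probability `rate`, rescale the rest.
    keep = (rng.random(shape) >= rate).astype(float)
    return keep / (1.0 - rate)

# Toy dimensions, purely illustrative.
T, n_in, n_hid = 5, 4, 3
W_x = rng.standard_normal((n_in, n_hid)) * 0.1   # input-to-hidden weights
W_h = rng.standard_normal((n_hid, n_hid)) * 0.1  # recurrent weights

x = rng.standard_normal((T, n_in))
h = np.zeros(n_hid)

# Keras-style: masks drawn once per sequence, reused at every time step.
in_mask = dropout_mask((n_in,), rate=0.5)    # ~ Keras `dropout`
rec_mask = dropout_mask((n_hid,), rate=0.5)  # ~ Keras `recurrent_dropout`

outputs = []
for t in range(T):
    # `dropout` acts on the input before its linear transformation;
    # `recurrent_dropout` acts on h before the recurrent transformation.
    h = np.tanh((x[t] * in_mask) @ W_x + (h * rec_mask) @ W_h)
    outputs.append(h)
outputs = np.stack(outputs)

# Gluon-style `dropout`, by contrast, acts on the *outputs* of each
# stacked RNN layer except the last, i.e. between layers, not inside
# the cell's own computation:
layer_out = outputs * dropout_mask((T, n_hid), rate=0.5)
```

If that reading is right, the Gluon argument only matters for multi-layer LSTMs and leaves the per-step input/recurrent transformations untouched.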
Could anyone please shed some more light on this? Can we match the Keras functionality somehow?