I have implemented a minimal gated recurrent unit (MGU) based on this paper (https://arxiv.org/pdf/1603.09420.pdf) and found the performance and results to be very good. Are there any plans to add this as part of the RNN layers? I would also like to know whether it can be written in a way that allows it to be hybridized. I will keep working on it myself, but any help would be appreciated. My code is just an adaptation of the RNN-from-scratch example in chapter 5 of mxnet-the-straight-dope, plus what I have learned from the d2l.ai Dive into Deep Learning book.
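For reference, the MGU recurrence from the paper is small enough to sketch in plain NumPy. This is just an illustration of the math, not my actual Gluon code; the function and weight names (`mgu_step`, `W_f`, `U_f`, etc.) are my own:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def mgu_step(x, h_prev, W_f, U_f, b_f, W_h, U_h, b_h):
    """One MGU step (Zhou et al., 2016): a single forget gate f both
    gates the candidate state and blends old and new hidden states."""
    f = sigmoid(x @ W_f + h_prev @ U_f + b_f)               # forget gate
    h_tilde = np.tanh(x @ W_h + (f * h_prev) @ U_h + b_h)   # candidate state
    return (1.0 - f) * h_prev + f * h_tilde                 # new hidden state

# toy shapes: batch 2, input dim 3, hidden dim 4
rng = np.random.default_rng(0)
x = rng.standard_normal((2, 3))
h = np.zeros((2, 4))
W_f, W_h = rng.standard_normal((2, 3, 4))
U_f, U_h = rng.standard_normal((2, 4, 4))
b_f = np.zeros(4)
b_h = np.zeros(4)
h_next = mgu_step(x, h, W_f, U_f, b_f, W_h, U_h, b_h)
print(h_next.shape)  # (2, 4)
```

Compared to a GRU, the MGU merges the update and reset gates into the single gate `f`, which is what makes it "minimal".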
I am not aware of any plans to add MGUs as part of the RNN layers. That said, the MXNet community is always happy about new contributions, so feel free to open a pull request against the MXNet repo. To make the MGU cell hybridizable, you need to define it as a HybridRecurrentCell: https://github.com/apache/incubator-mxnet/blob/master/python/mxnet/gluon/rnn/rnn_cell.py#L315
Thanks for the reply. I will give this a try; I know I will learn a lot while working on it.