Padding zeros in embedding

In PyTorch, if we pass padding_idx=0 when creating an embedding, the output is padded with zeros whenever that index is encountered.

For example:
self.embedding = nn.Embedding(self.vocab_size, self.embedding_dim,
                              padding_idx=0)
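
To illustrate the behavior, looking up the padding index returns an all-zero vector (a minimal standalone demonstration; the sizes and input values here are illustrative):

import torch
import torch.nn as nn

embedding = nn.Embedding(10, 4, padding_idx=0)
x = torch.tensor([[1, 0, 3]])  # 0 is the padding index
out = embedding(x)             # shape (1, 3, 4)
print(out[0, 1])               # the row for index 0 is all zeros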

Is there a similar way to do this in MXNet without manually adding zeros to the embedding output?

I am not too familiar with this functionality of PyTorch, but there doesn't seem to be an equivalent feature in the MXNet Gluon Embedding API. You can create a [Feature Request] issue on GitHub if you would like this to be implemented.
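
Until such a feature exists, one workaround is to mask the output yourself: multiply the embedding result by a 0/1 mask computed from the input indices. A minimal sketch (this mask trick is a workaround, not a Gluon API; shapes and values are illustrative):

from mxnet import nd
from mxnet.gluon import nn

vocab_size, embedding_dim = 10, 4
embedding = nn.Embedding(vocab_size, embedding_dim)
embedding.initialize()

x = nd.array([[1, 0, 3]])            # 0 is the padding index
emb = embedding(x)                   # shape (1, 3, embedding_dim)
mask = (x != 0).expand_dims(axis=2)  # 1 for real tokens, 0 for padding
emb = nd.broadcast_mul(emb, mask)    # rows at the padding index become zeros

Since the masked positions are multiplied by zero, no gradient flows back to the embedding row through them, which roughly matches PyTorch's padding_idx behavior of not updating the padding row.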