That’s a really cool idea/trick! I learned some new stuff from it. Thanks Safrooze!
After reading your reply, I realized that my question was misleading; sorry about that. What I really want is this: after training the embedding layer, I want to add some new 'words' to the model while keeping the weights of the existing vocabulary unchanged.
In my case, say I use 2 latent dimensions and have three books (0, 1, 2), so the book_embedding_weight matrix has three rows. Now I want to introduce one new 'book'. I could create a new instance of the network, reload the old weights, and initialize the new row as zeros:
```
[[0.3, 0.4],  # book 0's weights, keep this unchanged
 [0.8, 0.5],  # book 1's weights, keep this unchanged
 [0.2, 0.3],  # book 2's weights, keep this unchanged
 [0.0, 0.0]]  # book 3, only learn this weight vector
```
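The weight-surgery step itself is straightforward; here is a minimal NumPy sketch (the values are the ones from the example above, and the names `old_weight` / `n_new` are just made up for illustration):

```python
import numpy as np

# Trained embedding weights for books 0, 1, 2 (2 latent dims each).
old_weight = np.array([[0.3, 0.4],
                       [0.8, 0.5],
                       [0.2, 0.3]])

n_new = 1  # number of new books to add

# Enlarged weight matrix: copy the old rows, zero-initialize the new ones.
new_weight = np.zeros((old_weight.shape[0] + n_new, old_weight.shape[1]))
new_weight[:old_weight.shape[0]] = old_weight

print(new_weight.shape)  # (4, 2); the last row is the new book's vector
```

In MXNet you would then create the new, larger Embedding layer and load this array into its weight parameter before training.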
When I train this network, I really only want to learn the weights for the new book. Ideally, the learned weights for the new book would end up on a scale comparable to those of the old books.
Is this even possible in mxnet? I would want to add new ‘users’ as well.
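One workaround for freezing individual rows (setting `grad_req='null'` freezes the whole parameter, not single rows) is to mask the gradient so only the new row is updated. A toy NumPy SGD sketch of the idea, not MXNet API; in Gluon the same mask could be applied to the embedding weight's gradient after `backward()`:

```python
import numpy as np

weight = np.array([[0.3, 0.4],
                   [0.8, 0.5],
                   [0.2, 0.3],
                   [0.0, 0.0]])  # row 3 is the new book

# Mask: 0.0 for frozen rows, 1.0 for the trainable row.
mask = np.zeros_like(weight)
mask[3] = 1.0

lr = 0.1
grad = np.ones_like(weight)  # stand-in for the gradient from backward()

# Masked SGD update: frozen rows receive zero gradient, so they never move.
weight -= lr * (grad * mask)

print(weight[0])  # unchanged: [0.3 0.4]
print(weight[3])  # updated:   [-0.1 -0.1]
```

The same masking trick would apply to a user embedding matrix when adding new users.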