Neural Collaborative Filtering for Personalized Ranking

https://d2l.ai/chapter_recommender-systems/neumf.html

Good example:)
Can we define a gluon.data.Dataset like the one below, so that negative sampling happens inside the training iterator instead of calling the sampling function once per epoch? I think this would simplify the training loop and reduce memory usage, especially on larger datasets such as MovieLens 20M.

def __len__(self):
    # one positive plus nb_neg negatives per observed interaction
    return len(self.data) * (self.nb_neg + 1)

def __getitem__(self, idx):
    if idx % (self.nb_neg + 1) == 0:
        # positive sample: return the observed (user, item) pair with label 1
        idx = idx // (self.nb_neg + 1)
        return self.data[idx][0], self.data[idx][1], np.ones(1, dtype=np.float32).item()
    else:
        # negative sample: rejection-sample an item the user has not interacted with;
        # np.random.randint avoids the per-item NDArray overhead of mx.nd.random.randint
        idx = idx // (self.nb_neg + 1)
        u = self.data[idx][0]
        j = int(np.random.randint(0, self.nb_items))
        while (u, j) in self.mat:
            j = int(np.random.randint(0, self.nb_items))
        return u, j, np.zeros(1, dtype=np.float32).item()
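For reference, here is a minimal self-contained sketch of the full dataset and how it would be indexed. The class name `InterleavedNegDataset` and the toy data are made up for illustration, and `random.randrange` stands in for an MXNet random call; since `gluon.data.DataLoader` only needs `__getitem__` and `__len__`, the same logic works unchanged when subclassing `gluon.data.Dataset`.

```python
import random

class InterleavedNegDataset:
    """Every (nb_neg + 1)-th index yields the observed positive pair;
    the remaining indices draw a fresh negative item, so negatives are
    resampled lazily each time the loader touches an index."""

    def __init__(self, data, mat, nb_items, nb_neg):
        self.data, self.mat = data, mat          # positive pairs, interaction set
        self.nb_items, self.nb_neg = nb_items, nb_neg

    def __len__(self):
        # one positive plus nb_neg negatives per observed interaction
        return len(self.data) * (self.nb_neg + 1)

    def __getitem__(self, idx):
        pos = idx // (self.nb_neg + 1)
        u, i = self.data[pos]
        if idx % (self.nb_neg + 1) == 0:
            return u, i, 1.0                     # positive sample
        j = random.randrange(self.nb_items)      # rejection-sample a negative
        while (u, j) in self.mat:
            j = random.randrange(self.nb_items)
        return u, j, 0.0

# toy interactions: users 0 and 1
pairs = [(0, 1), (0, 2), (1, 0)]
ds = InterleavedNegDataset(pairs, set(pairs), nb_items=10, nb_neg=4)
print(len(ds))                                   # 15 = 3 positives * (1 + 4)
u, j, label = ds[1]                              # a negative drawn for user 0
print(label, (u, j) in set(pairs))               # 0.0 False
```

Because a new negative is drawn on every access, shuffling plus this lazy indexing gives fresh negatives each epoch without materializing the full sampled dataset in memory.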

Thank you very much for providing the code, I will give it a test! If it helps, we will revise the section accordingly. :smiley: