# Neural Collaborative Filtering for Personalized Ranking

Good example:)

Can we define a gluon Dataset like the one below, so that negative sampling happens inside the training iterator instead of calling this function on every iteration? I think this may simplify the training loop and reduce memory usage, especially when dealing with larger datasets like MovieLens 20M and others.

```
def __getitem__(self, idx):
    # Every (nb_neg + 1)-th index maps to a positive interaction; the
    # rest are negatives drawn on the fly, so nothing extra is stored.
    if idx % (self.nb_neg + 1) == 0:
        idx = idx // (self.nb_neg + 1)
        return self.data[idx][0], self.data[idx][1], np.ones(1, dtype=np.float32).item()
    else:
        idx = idx // (self.nb_neg + 1)
        u = self.data[idx][0]
        j = mx.random.randint(0, self.nb_items).asnumpy().item()
        # Resample until we draw an item this user has not interacted with
        while (u, j) in self.mat:
            j = mx.random.randint(0, self.nb_items).asnumpy().item()
        return u, j, np.zeros(1, dtype=np.float32).item()
```
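For reference, here is a self-contained sketch of what a full Dataset wrapping this idea might look like. I've used plain NumPy for the random sampling instead of `mx.random` so it runs without MXNet installed; the class name `PRDataset` and the constructor signature are my own assumptions, while `nb_neg`, `nb_items`, `data`, and `mat` follow the snippet above.

```python
import numpy as np

class PRDataset:
    """Sketch (not the book's actual implementation): interleaves one
    positive interaction with nb_neg negatives sampled on the fly."""

    def __init__(self, users, items, nb_items, nb_neg):
        self.data = list(zip(users, items))   # positive (user, item) pairs
        self.mat = set(self.data)             # fast membership test
        self.nb_items = nb_items
        self.nb_neg = nb_neg

    def __len__(self):
        # One positive plus nb_neg negatives per observed interaction
        return len(self.data) * (self.nb_neg + 1)

    def __getitem__(self, idx):
        if idx % (self.nb_neg + 1) == 0:
            idx = idx // (self.nb_neg + 1)
            return self.data[idx][0], self.data[idx][1], np.float32(1.0)
        idx = idx // (self.nb_neg + 1)
        u = self.data[idx][0]
        j = np.random.randint(0, self.nb_items)
        while (u, j) in self.mat:             # resample seen items
            j = np.random.randint(0, self.nb_items)
        return u, j, np.float32(0.0)

# Tiny usage example: 2 users, 5 items, 3 negatives per positive
ds = PRDataset(users=[0, 1], items=[2, 4], nb_items=5, nb_neg=3)
u, i, label = ds[0]   # index 0 is a positive sample
print(u, i, label)    # 0 2 1.0
```

Because `__len__` reports `(nb_neg + 1)` samples per positive pair, a standard DataLoader can iterate over this Dataset directly and the negatives never need to be materialized up front.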


Thank you very much for providing the code, I will give it a test! If it helps, we will revise the section.


Thank you so much, I will try it.

Hi, I think the `num_users` in the `evaluate_ranking` function should be `num_items`.

```
def evaluate_ranking(net, test_input, seq, candidates, num_users, num_items,
                     ctx):
    ranked_list, ranked_items, hit_rate, auc = {}, {}, [], []
    # all_items = set([i for i in range(num_users)])
    all_items = set([i for i in range(num_items)])
```
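To illustrate why the candidate set must range over items rather than users (the numbers here are made up for illustration): whenever there are fewer users than items, any ground-truth item with an id above `num_users` could never appear in the candidate set, so it could never be counted as a hit.

```python
# Hypothetical sizes: fewer users than items
num_users, num_items = 3, 10

buggy_items = set(range(num_users))     # only item ids 0..2
correct_items = set(range(num_items))   # all item ids 0..9

ground_truth_item = 7
print(ground_truth_item in buggy_items)    # False: a hit is impossible
print(ground_truth_item in correct_items)  # True
```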

You have a typo: "eastimating" should be "estimating". Your typo did, however, make me hungry.