How to display loss while training?

Hi,

MXNet newbie here, I am playing around with the MNIST dataset.

During training, the epoch and accuracy are printed out.

How do I print out the loss?

I can see that the loss variable is a series of values, n × the batch size. Something like:

[ 2.46115017 1.46115136 1.46115017 …, 1.46115017 1.46115017
2.46112204]
<NDArray 33600 @gpu(0)>

Should I assume the last value ([-1]) to be the loss?

What is the correct way to compute the loss during training (and print it out)?

It should be the loss for each image in the batch, so just take the mean to get the average loss over the batch.
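
For example, assuming loss is the per-sample NDArray returned by a Gluon loss such as gluon.loss.SoftmaxCrossEntropyLoss, a minimal sketch:

batch_loss = loss.mean().asscalar()
# .mean() averages the per-sample losses into a single-element NDArray;
# .asscalar() copies that value back to a Python float so it can be printed
print('batch loss: {:.4f}'.format(batch_loss))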

Here is a working example:

import mxnet as mx
from mxnet import autograd

# net, loss_function, trainer, and train_loader are assumed to be
# defined earlier (e.g. a Gluon Block, SoftmaxCrossEntropyLoss,
# gluon.Trainer, and a gluon.data.DataLoader)
ctx = mx.gpu() if mx.context.num_gpus() > 0 else mx.cpu()

num_epochs = 10

for epoch in range(num_epochs):
    cum_loss = 0.0
    for inputs, labels in train_loader:
        # move the batch to the same context (CPU/GPU) as the model
        inputs = inputs.as_in_context(ctx)
        labels = labels.as_in_context(ctx)
        with autograd.record():
            outputs = net(inputs)
            # per-sample loss for this batch
            loss = loss_function(outputs, labels)
        # differentiate the loss with respect to all model parameters
        loss.backward()
        # update the parameters; pass the actual batch size so the
        # gradients are normalized correctly even for a short last batch
        trainer.step(batch_size=inputs.shape[0])
        # accumulate the mean loss of the current batch
        cum_loss += loss.mean()
    # len(train_loader) is the number of batches, so this is the
    # average batch loss over the epoch
    epoch_loss = cum_loss.asscalar() / len(train_loader)
    print('After epoch {}: Loss: {}'.format(epoch + 1, epoch_loss))
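
If you also want to see the loss while an epoch is still running, you can print a running average every few batches. A minimal sketch of the inner loop, assuming the same net, loss_function, trainer, train_loader, and ctx as above (the log_interval name is just for illustration):

log_interval = 100  # hypothetical: number of batches between prints
running_loss = 0.0
for i, (inputs, labels) in enumerate(train_loader):
    inputs = inputs.as_in_context(ctx)
    labels = labels.as_in_context(ctx)
    with autograd.record():
        loss = loss_function(net(inputs), labels)
    loss.backward()
    trainer.step(batch_size=inputs.shape[0])
    running_loss += loss.mean()
    if (i + 1) % log_interval == 0:
        # .asscalar() forces a synchronization with the device, so only
        # call it when actually printing
        print('batch {}: avg loss {:.4f}'.format(
            i + 1, running_loss.asscalar() / (i + 1)))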

Hope this helps.