SoftmaxOutput in gluon

Is there a way to use the SoftmaxOutput symbol as a loss within gluon?
More specifically, I want specific labels (say, -1) to be ignored within the cross-entropy loss, as is done in the object detection examples.

To make sure I understood the ask, are you looking to create a custom loss function? If so, that is fairly easy to do in gluon. See https://github.com/apache/incubator-mxnet/blob/master/python/mxnet/gluon/loss.py for examples.

Not really a custom loss function. The ideal case would be to reuse SoftmaxOutput from mxnet.symbol, if that is somehow possible, since it lets one pass the ignore_label parameter.
Otherwise, how do I create a mask for a custom loss and adjust the scaling accordingly?

Discarding certain labels can be done by passing a sample_weight tensor of all 1s, with the elements that correspond to ignored labels set to 0. However, I'd assume you want the final loss value to be computed only over the valid labels, which is not what you get with the SoftmaxCrossEntropyLoss in gluon, which simply computes the mean over the whole batch.
It looks like in TF one can pass the type of reduction (mean, sum, or weighted sum by non-zero weights) as a parameter. Something similar might be useful in gluon.
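
To illustrate the difference between the two reductions, here is a minimal sketch (the per-sample loss values are made up for the example):

import mxnet as mx

per_sample = mx.nd.array([0.0, 2.17, 2.38, 1.96])  # ignored sample already zeroed out
mask = mx.nd.array([0, 1, 1, 1])                   # 1 = valid label, 0 = ignored

mean_all = per_sample.mean()                # divides by 4, the full batch size
mean_valid = per_sample.sum() / mask.sum()  # divides by 3, the number of valid labels
print(mean_all, mean_valid)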

There is also no way in gluon to slice the input, or is there?
Something like the following Python/NumPy code:

import numpy as np
a = np.random.randn(12)
a[a < 0]  # boolean indexing: keep only the negative entries

There is an issue in MXNet, actively being worked on, to support advanced indexing. Please see the issue and the related PRs: https://github.com/apache/incubator-mxnet/issues/8084
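
In the meantime, a common workaround is element-wise masking. Here is a small sketch; note that it is not equivalent to true boolean slicing, since the result keeps the original shape with the non-matching entries zeroed out:

import mxnet as mx

a = mx.nd.random.normal(shape=(12,))
mask = a < 0               # 0/1 NDArray of the same shape as a
negatives_only = a * mask  # negative entries kept, everything else zeroed
num_negative = mask.sum()  # how many elements matched the condition
print(negatives_only, num_negative)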


This would actually be quite simple using the standard SoftmaxCrossEntropyLoss:

import mxnet as mx
from mxnet.gluon.loss import SoftmaxCrossEntropyLoss

x = mx.random.uniform(shape=(4, 10))  # batch of 4, 10 classes
y = mx.nd.array([-1, 0, 2, 5]).reshape((4, 1))
mask = y > -1  # 0 where the label is -1 (ignored), 1 elsewhere
loss = SoftmaxCrossEntropyLoss()
L = loss(x, y, mask)  # the mask is passed as sample_weight
print(L)
# [ 0.          2.17453551  2.37934875  1.95923615]
# <NDArray 4 @cpu(0)>
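
Note that the first element of L is 0 because of the mask. Calling L.mean() on this would still divide by the full batch size of 4, so if you want the average over the valid labels only, divide L.sum() by mask.sum() as discussed above.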
