Check failed: axis < ndim && axis >= -ndim axis 1 exceeds the input dimension of 1


Check failed: axis < ndim && axis >= -ndim axis 1 exceeds the input dimension of 1
When I use `mx.sym.sum()` as the input of `mx.sym.MakeLoss()`, it throws this error.
I don't know what causes it. Do you know?


Do you have sample code that reproduces the problem?


```
data = mx.symbol.Variable('data')

label = mx.symbol.Variable('softmax_label')

# (the rest of the network definition did not survive in the original post)
```
I use the MNIST dataset with data shape (-1, 1, 28, 28) and label shape (-1, 10).
After I use `MakeLoss`, when I run it, Python stops working and kills the kernel.
It does the same thing on both Ubuntu and Windows. The version is MXNet 1.1.0.
Thanks for your reply.


`MakeLoss` expects a vector as input (one loss value for each example in the batch), so you should sum along axis 1. Please pass `axis=1` as a parameter to the sum operator.
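To see the shape arithmetic this advice relies on, here is a plain NumPy sketch (not the MXNet API itself; the batch size, class count, and label indices are made up for illustration). Summing the elementwise cross-entropy along axis 1 gives one loss per example, while a full sum collapses the whole batch to a scalar:

```python
import numpy as np

batch, classes = 4, 10
# hypothetical predictions and one-hot labels, for illustration only
probs = np.full((batch, classes), 1.0 / classes)  # uniform softmax output
labels = np.eye(classes)[[0, 3, 7, 9]]            # one-hot labels

elementwise = -labels * np.log(probs)  # shape (4, 10)
per_example = elementwise.sum(axis=1)  # shape (4,): one loss value per example
total = elementwise.sum()              # shape (): a single scalar for the whole batch

print(per_example.shape)  # (4,)
print(total.shape)        # ()
```

The `per_example` vector is the shape `MakeLoss` is meant to receive, one entry per example in the batch.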


I'm sorry to say that after I do so, it behaves exactly the same as with no `axis=1`. I just want to do something like `mx.sym.softmax_cross_entropy()`, but when I use it, it says there are not enough parameters to call it. I saw it reported as a bug on GitHub and they fixed it, but I still can't use it in the latest version.


Just remove `mx.symbol.sum` from `out` and it will work:

```
# before
out = mx.symbol.MakeLoss(-mx.symbol.sum(label * mx.symbol.log(fc7)))
# after
out = mx.symbol.MakeLoss(-label * mx.symbol.log(fc7))
```

Actually, the thing is that `MakeLoss` takes a loss for each of the examples in the batch rather than a sum over the whole data. It is a bit silly that MXNet doesn't allow us to do so. One thing you can do to calculate the training loss later is:

```
loss = mx.symbol.MakeLoss(-label * mx.symbol.log(fc7))
cost = mx.symbol.mean(loss)
```

then minimize `loss` using the module's `forward` and `backward` methods, and evaluate `cost` separately.
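The split between the per-example loss and the reported cost can be sketched numerically (plain NumPy, with made-up per-example values, just to show the relationship):

```python
import numpy as np

# hypothetical per-example cross-entropy losses, i.e. what MakeLoss carries
loss = np.array([0.5, 1.2, 0.3, 0.9])  # one value per example in the batch

# the single scalar you actually monitor during training
cost = loss.mean()

print(cost)  # 0.725
```

Gradients flow through the per-example `loss`; `cost` is only a readout.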
Or if you are using the module's `fit` method, you can do something like this:

```
# (arguments to fit; the opening of the call was omitted in the original post)
    eval_data=eval_iter,
    optimizer='adam',
    optimizer_params={'learning_rate': 0.001},
    eval_metric='mse',  # <== this line will show you the overall loss (cost) on the training data
    num_epoch=10)
```


Oh, so I guess you fixed your issue, @mouryarishik, resolving the `Check failed: axis < ndim && axis >= -ndim axis 1 exceeds the input dimension of 1` error?


Yeah :slight_smile: I do find it a bit silly that MXNet does not allow us to do so.