Mxnet symbol how to set Loss(total) = 2*Loss_x + 1*Loss_y

#1

With mxnet symbol, how can I set Loss(total) = 2*Loss_x + 1*Loss_y?
Any advice will be appreciated, thanks

#2

You can use mx.sym.MakeLoss: https://mxnet.incubator.apache.org/versions/master/api/python/symbol/symbol.html#mxnet.symbol.MakeLoss

loss_total = 2*loss_x + loss_y
loss = mx.sym.MakeLoss(loss_total)

#3

Thanks for your reply.
What is the difference between "mx.sym.Group(2*loss_x, loss_y)" and
"loss_total = 2*loss_x + loss_y
loss = mx.sym.MakeLoss(loss_total)"?
Any advice will be appreciated, thanks

#4

Yes, you can in fact also use "mx.sym.Group" to group multiple loss layers together.

#5

mx.sym.Group(2*loss_x, loss_y) is not the same as 2*loss_x + loss_y.
In the second case you are adding two symbols, while with .Group you are making a symbolic list of the elements provided as arguments.
So .Group(2*loss_x, loss_y) = [2*loss_x, loss_y],
while the other is simply the single symbol 2*loss_x + loss_y.

#6

Thanks for your reply. For the backward pass, are the gradients the same for [2*loss_x, loss_y] and 2*loss_x + loss_y? I think they are, but I am not sure about that. Thanks

#7

Thanks for your reply. For the backward pass, are the gradients the same for [2*loss_x, loss_y] and 2*loss_x + loss_y? Thanks

#8

According to Custom Loss + L2 Regularization and https://github.com/apache/incubator-mxnet/issues/2677, using mx.sym.Group adds the objectives' gradients during backpropagation. So in your case (adding two weighted loss values) you can use either mx.sym.MakeLoss or mx.sym.Group.
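You can also sanity-check this claim numerically. Below is a minimal pure-Python sketch (not MXNet itself; loss_x and loss_y are made-up toy losses of a single parameter w) showing that the gradient of the explicit sum 2*loss_x + loss_y equals the sum of the individually scaled gradients, which is what backpropagating through the grouped pair [2*loss_x, loss_y] computes:

```python
# Two toy scalar losses of a single parameter w (hypothetical examples).
def loss_x(w):
    return (w - 1.0) ** 2

def loss_y(w):
    return (w - 3.0) ** 2

def grad(f, w, eps=1e-6):
    # Central finite difference approximation of df/dw.
    return (f(w + eps) - f(w - eps)) / (2 * eps)

w = 0.5

# Gradient of the explicit sum 2*loss_x + loss_y ...
g_sum = grad(lambda w: 2 * loss_x(w) + loss_y(w), w)

# ... equals the sum of the individual gradients, which is what
# backpropagating through the grouped losses [2*loss_x, loss_y] adds up.
g_grouped = 2 * grad(loss_x, w) + grad(loss_y, w)

assert abs(g_sum - g_grouped) < 1e-3
```

So for training purposes the two formulations update the weights the same way; the difference is only that .Group exposes the two loss values as separate outputs, while MakeLoss on the sum exposes a single combined value.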

#9

Thanks @NRauschmayr for clarifying my misunderstanding, and sorry to @hdjsjyl for providing a wrong explanation.