How to make a custom operator that just modifies the backward pass's in_grad value

While reading the paper "LDMNet: Low Dimensional Manifold Regularized Neural Networks",

I came across a different regularization term, which is computed from the output features of a middle-layer ReLU. I know it is not like weight decay, so I have to modify the backward gradient, but how do I do this? Can I make a custom operator that treats the regularization term as another loss function, as part of a group of loss functions? And can I just let the custom operator's forward function pass the input through unchanged, since I only need the gradient? I really hope somebody can help me with this!

Hey @mxnetwqs,
Have a look at this custom operator documentation:


You can write a custom operator that simply forwards the input unchanged in the forward pass and adds a custom value to the gradient in the backward pass.
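Here is a minimal sketch of that idea using MXNet's Python `CustomOp` API. The operator name `grad_regularizer`, the `coeff` parameter, and the simple L2-style extra gradient (`coeff * in_data[0]`) are placeholders for illustration only; you would replace that term with the gradient of your own regularizer with respect to the layer's output features.

```python
import mxnet as mx

class GradRegularizer(mx.operator.CustomOp):
    """Identity in the forward pass; adds an extra term to in_grad in the backward pass."""
    def __init__(self, coeff):
        super(GradRegularizer, self).__init__()
        self.coeff = coeff

    def forward(self, is_train, req, in_data, out_data, aux):
        # Pass the input through unchanged.
        self.assign(out_data[0], req[0], in_data[0])

    def backward(self, req, out_grad, in_data, out_data, in_grad, aux):
        # Start from the incoming gradient and add the custom regularization term.
        # Placeholder: an L2-style term; substitute the gradient of your regularizer here.
        extra = self.coeff * in_data[0]
        self.assign(in_grad[0], req[0], out_grad[0] + extra)

@mx.operator.register("grad_regularizer")
class GradRegularizerProp(mx.operator.CustomOpProp):
    def __init__(self, coeff=0.001):
        # need_top_grad=True because we combine out_grad with the extra term.
        super(GradRegularizerProp, self).__init__(need_top_grad=True)
        self.coeff = float(coeff)  # kwargs arrive as strings

    def list_arguments(self):
        return ['data']

    def list_outputs(self):
        return ['output']

    def infer_shape(self, in_shape):
        # Output has the same shape as the input; no auxiliary states.
        return in_shape, [in_shape[0]], []

    def create_operator(self, ctx, shapes, dtypes):
        return GradRegularizer(self.coeff)
```

You would then insert it after the layer whose features you want to regularize, for example:

```python
data = mx.sym.Variable('data')
net = mx.sym.Custom(data=data, coeff=0.001, op_type='grad_regularizer')
```

Because the forward pass is the identity, the operator does not change the network's outputs; it only injects the extra gradient during backpropagation.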


Thanks, everyone!
I completed the task in several ways. Have a good time, and let's keep moving forward together!