How to make a custom operator that only modifies the backward pass's in_grad value


#1

While reading the paper "LDMNet: Low Dimensional Manifold Regularized Neural Networks"
(https://arxiv.org/abs/1711.06246?context=cs),
I came across a regularization term that is computed from the output features of an intermediate ReLU layer. I understand this is not like weight decay, so I need to modify the gradient in the backward pass, but how can I do this? Can I write a custom operator that treats the regularization term as an additional loss, alongside the other loss functions in the group? And can the custom operator's forward function simply pass the input through unchanged, since all I need is the extra gradient? I would really appreciate any help with this!