I am Chinese, living in Beijing. The link below gives more background on my question: https://discuss.gluon.ai/t/topic/7499
I need to implement a function that adds an extra term to the gradient of a ReLU layer's output during back-propagation. In other words, I want to modify the network's normal back-propagation by injecting something extra that affects the usual parameter updates.
I have already tried the multi-mask idea (writing two loss functions based on the same ReLU layer), but the outcome was not good. So now I just want to write a custom operator as a middle layer, using the Module API rather than the Gluon API. This layer sits on top of the ReLU layer and connects to the FullyConnected layer, just like below.
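Something like this structure (a rough sketch; the layer names and sizes are only placeholders I made up):

```python
import mxnet as mx

# placeholder network: ... -> ReLU -> [custom layer] -> FullyConnected -> ...
data = mx.sym.Variable('data')
fc1  = mx.sym.FullyConnected(data=data, num_hidden=128, name='fc1')
relu = mx.sym.Activation(data=fc1, act_type='relu', name='relu1')
# 'grad_addon' is the custom op I register in the skeleton further down
mid  = mx.sym.Custom(data=relu, op_type='grad_addon', name='mid')
fc2  = mx.sym.FullyConnected(data=mid, num_hidden=10, name='fc2')
net  = mx.sym.SoftmaxOutput(data=fc2, name='softmax')
```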
The custom layer should not change anything in the forward pass; it only adds an extra gradient term in the backward pass, like this:
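Here is my rough skeleton so far, using mx.operator.CustomOp (the registered name grad_addon and the extra term 0.01 * input are placeholders I made up; I don't know yet what the real extra term should be):

```python
import mxnet as mx

class GradAddOn(mx.operator.CustomOp):
    def forward(self, is_train, req, in_data, out_data, aux):
        # forward is a pure identity: pass the ReLU output through unchanged
        self.assign(out_data[0], req[0], in_data[0])

    def backward(self, req, out_grad, in_data, out_data, in_grad, aux):
        # take the gradient flowing back from the FullyConnected layer and
        # add my extra term before it reaches the ReLU layer.
        # `extra` is only a placeholder; the real term would probably depend
        # on the label, which I don't yet know how to feed in.
        extra = 0.01 * in_data[0]
        self.assign(in_grad[0], req[0], out_grad[0] + extra)

@mx.operator.register('grad_addon')
class GradAddOnProp(mx.operator.CustomOpProp):
    def __init__(self):
        # need_top_grad=True: this op needs the gradient from the layer above
        super(GradAddOnProp, self).__init__(need_top_grad=True)

    def list_arguments(self):
        return ['data']

    def list_outputs(self):
        return ['output']

    def infer_shape(self, in_shape):
        # output shape equals input shape, since the forward pass is identity
        return [in_shape[0]], [in_shape[0]], []

    def create_operator(self, ctx, shapes, dtypes):
        return GradAddOn()
```

With need_top_grad=True the op receives the gradient coming back from fc2, so the extra term gets added before the gradient continues to the ReLU layer and everything below it.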
Obviously, I do not know how to implement it completely, for example how to handle the label, among other things. Can you give me some advice?
Let's keep in touch; looking forward to your reply!