Internals of a Gluon custom layer's backward pass

It looks like Gluon custom layers don't need a backward function. (ref)
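For example, a layer like this (a minimal sketch; the `Square` block is a made-up name) defines only `forward`, yet gradients still flow through it:

```python
from mxnet import nd, autograd
from mxnet.gluon import nn

class Square(nn.Block):
    """A custom layer that defines only forward -- no backward method."""
    def forward(self, x):
        return x * x

layer = Square()
x = nd.array([1.0, 2.0, 3.0])
x.attach_grad()
with autograd.record():
    y = layer(x)
y.backward()
print(x.grad)  # [2. 4. 6.] -- where does this gradient come from?
```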

Then where is the default implementation? How does Gluon calculate the backward gradients?

With autograd. Check out these examples: https://mxnet.incubator.apache.org/api/python/autograd/autograd.html

Just as MXNet knows the derivative of, for example, x * x + 1 (the first example in the link), it knows the derivatives of the calculations you do in your forward passes.
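Here is a minimal sketch along the lines of that first example: mark the input with `attach_grad()`, record the computation, and call `backward()`:

```python
from mxnet import nd, autograd

x = nd.array([1.0, 2.0, 3.0])
x.attach_grad()              # tell MXNet to allocate space for x's gradient
with autograd.record():      # record the computation graph
    y = x * x + 1
y.backward()                 # reverse-mode pass through the recorded graph
print(x.grad)                # dy/dx = 2x -> [2. 4. 6.]
```

The same recording happens inside a custom layer's forward pass, which is why no hand-written backward is needed.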

Thank you, adrian!

Then where can I find the implementation of, or an explanation of the theory behind, this automatic gradient calculation?

See the following links for some resources:


https://d2l.ai/chapter_crashcourse/autograd.html
https://gluon.mxnet.io/chapter01_crashcourse/autograd.html
http://beta.mxnet.io/guide/packages/autograd/autograd.html
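
If you want an intuition for what those chapters cover before reading them, here is a toy reverse-mode autodiff sketch in plain Python. This is not MXNet's actual implementation, just the core idea: each operation records its inputs and their local derivatives, and `backward` applies the chain rule back through that record.

```python
class Var:
    """A toy scalar variable that tracks how it was computed."""
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents  # list of (parent_var, local_gradient) pairs
        self.grad = 0.0

    def __mul__(self, other):
        other = other if isinstance(other, Var) else Var(other)
        # d(a*b)/da = b, d(a*b)/db = a
        return Var(self.value * other.value,
                   [(self, other.value), (other, self.value)])

    def __add__(self, other):
        other = other if isinstance(other, Var) else Var(other)
        # d(a+b)/da = 1, d(a+b)/db = 1
        return Var(self.value + other.value, [(self, 1.0), (other, 1.0)])

    def backward(self, upstream=1.0):
        # Chain rule: accumulate upstream * local gradient into each parent.
        self.grad += upstream
        for parent, local in self.parents:
            parent.backward(upstream * local)

x = Var(3.0)
y = x * x + 1      # y = x^2 + 1, built from recorded mul and add ops
y.backward()
print(x.grad)      # dy/dx = 2x = 6.0
```

MXNet's autograd does the recording on NDArrays and runs the reverse pass in C++, but the chain-rule bookkeeping is the same idea.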

Hope that helps