Custom Layer with own CUDA code


#1

What is the best way to add a new operator to Gluon that uses custom CUDA code for the forward and backward passes?
Are there any examples available? I only found examples for custom layers in Python Gluon, but not for how to call native code.


#2

You should be able to implement the operator in the C++ backend and register it. Here is a good tutorial on that: https://github.com/reminisce/mxnet/blob/900a6997ded8f7124fc6323aa66408cb70e8253f/docs/how_to/add_op_in_backend.md Then you can wrap your backend operator in a layer using Block or HybridBlock, similar to what is shown here: https://github.com/Xilinx/mxnet/blob/master/docs/tutorials/gluon/customop.md#use-customop-together-with-block If you want additional examples, you can look at the Conv2D layer in Gluon. For CUDA kernels, there are specific examples in src/operator/tensor/dot-inl.cuh.
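For orientation: the backend tutorial linked above walks through a simple quadratic operator (y = a*x^2 + b*x + c). Below is a plain-Python sketch of the forward/backward math such an operator has to supply; the function names are illustrative and not part of the MXNet API. In the real C++ operator, these loops become the registered compute kernel (and, for the GPU path, the CUDA kernel launched over the elements).

```python
# Illustrative only: plain-Python versions of the forward and backward
# computations of an elementwise quadratic operator y = a*x^2 + b*x + c.
# In a backend implementation these per-element loops are what the
# CPU kernel (or CUDA kernel) parallelizes.

def quadratic_forward(x, a, b, c):
    """Elementwise y_i = a * x_i^2 + b * x_i + c."""
    return [a * xi * xi + b * xi + c for xi in x]

def quadratic_backward(dy, x, a, b):
    """Chain rule: dL/dx_i = (2*a*x_i + b) * dL/dy_i."""
    return [(2 * a * xi + b) * dyi for xi, dyi in zip(x, dy)]

y = quadratic_forward([1.0, 2.0], a=1.0, b=2.0, c=3.0)          # [6.0, 11.0]
dx = quadratic_backward([1.0, 1.0], [1.0, 2.0], a=1.0, b=2.0)   # [4.0, 6.0]
```

The backward function is the piece people most often get wrong when registering a new operator, so it is worth checking it numerically against a finite-difference estimate before wiring the operator into a Block.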


#3

Maybe you want to check out this example of creating custom operators: https://github.com/apache/incubator-mxnet/blob/master/example/numpy-ops/ndarray_softmax.py
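That example implements softmax as a Python custom operator with its own forward and backward. As a self-contained sketch of the underlying math (plain Python, no MXNet; this is not the file's exact code, and the function names are mine):

```python
import math

# Plain-Python sketch of the softmax forward pass and its gradient,
# i.e. the two computations a custom softmax operator must provide.
# Names are illustrative, not part of the MXNet API.

def softmax_forward(x):
    """p_i = exp(x_i - max(x)) / sum_j exp(x_j - max(x)); max-shift for stability."""
    m = max(x)
    e = [math.exp(xi - m) for xi in x]
    s = sum(e)
    return [ei / s for ei in e]

def softmax_backward(dy, p):
    """dL/dx_i = p_i * (dL/dp_i - sum_j dL/dp_j * p_j)."""
    dot = sum(dyi * pi for dyi, pi in zip(dy, p))
    return [pi * (dyi - dot) for pi, dyi in zip(p, dy)]

p = softmax_forward([1.0, 1.0])        # [0.5, 0.5]
dx = softmax_backward([1.0, 0.0], p)   # [0.25, -0.25]
```

Once the forward and backward are correct in pure Python or NumPy like this, porting them to a backend operator with CUDA kernels is mostly a mechanical step.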