Use TensorRT for MXNet model


How can I use TensorRT to improve the inference speed of an MXNet model? Thanks.

Support TensorRT in MXNet

The easiest way to move an MXNet model to TensorRT would be through ONNX. Basically you’d export your model as ONNX and then import the ONNX model into TensorRT. However, exporting from MXNet to ONNX is WIP and the proposed API can be found here.
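Once export support lands, the two-step path described above could look roughly like the sketch below. Both sides are assumptions: the export call assumes the proposed `mx.contrib.onnx.export_model` API, and the TensorRT side uses its Python `OnnxParser` bindings, whose names and signatures vary by TensorRT version. Treat it as illustrative, not a finished recipe:

```python
# Sketch of the ONNX path: saved MXNet model -> ONNX file -> TensorRT engine.
# Heavy imports live inside the functions so the sketch stays importable
# even where mxnet/tensorrt are not installed.

def export_mxnet_to_onnx(sym_path, params_path, input_shape, onnx_path="model.onnx"):
    """Export a saved MXNet model (symbol + params files) to an ONNX file.

    Assumes the proposed/WIP MXNet->ONNX export API.
    """
    import numpy as np
    from mxnet.contrib import onnx as onnx_mxnet  # proposed API, subject to change
    return onnx_mxnet.export_model(
        sym_path, params_path, [input_shape], np.float32, onnx_path
    )

def build_tensorrt_engine(onnx_path):
    """Parse the ONNX file and build a serialized TensorRT engine."""
    import tensorrt as trt  # API details differ across TensorRT versions
    logger = trt.Logger(trt.Logger.WARNING)
    builder = trt.Builder(logger)
    network = builder.create_network(
        1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH)
    )
    parser = trt.OnnxParser(network, logger)
    with open(onnx_path, "rb") as f:
        if not parser.parse(f.read()):
            raise RuntimeError("ONNX parse failed: %s" % parser.get_error(0))
    config = builder.create_builder_config()
    return builder.build_serialized_network(network, config)
```

The serialized engine would then be deserialized with `trt.Runtime` for inference; again, this is hypothetical until the MXNet export API is finalized.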


Hi @safrooze, just as you say, exporting from MXNet to ONNX is WIP. If I want to do it now, is there some other way?


You can consider using this library by NVIDIA in the meantime. However, I have heard that some people have experienced problems with it.