Use TensorRT for an MXNet model


#1

How can I use TensorRT to improve the inference speed of an MXNet model? Thanks!


Support TensorRT in MXNet
#2

The easiest way to move an MXNet model to TensorRT would be through ONNX: you export your model to ONNX, then import the ONNX model into TensorRT. However, exporting from MXNet to ONNX is still a work in progress (WIP), and the proposed API can be found here.


#3

Hi @safrooze, just as you say, exporting from MXNet to ONNX is WIP. If I want to do it now, are there other ways?
Thanks


#4

In the meantime, you could consider this library from NVIDIA: https://github.com/NVIDIA/mxnet_to_onnx. However, I have heard that some people have run into problems with it.