SoftmaxActivation isn't supported by ONNX - workaround

Hi,
I have a pretrained model that uses the SoftmaxActivation operation to give probability estimates of how close the current output is to the desired output.

When exporting this pretrained model to ONNX, it throws an error that the SoftmaxActivation op isn't supported by ONNX (using MXNet 1.2.2; also tried versions 1.2.3, 1.3, 1.4, and 1.5).

My question is, how can I work around this?
Note: I’m using Symbol API
Note 2: My goal is to import the ONNX model into TensorFlow

Possible workaround #1:
I thought about using softmax with axis=1 (which works on nd.array) to achieve the same result as SoftmaxActivation; I'll also test it on the .json directly soon. Would that work? (I'll try it on a dummy network until I receive an answer, and update here with any new results.)
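To sanity-check the idea behind workaround #1: SoftmaxActivation in channel mode normalizes over the channel axis, which is exactly what a plain softmax with axis=1 computes on NCHW-style data. A minimal numpy sketch of that equivalence (assuming channel mode; the batch here is 2-D for brevity):

```python
import numpy as np

def softmax(x, axis):
    # Numerically stable softmax along the given axis.
    shifted = x - x.max(axis=axis, keepdims=True)
    e = np.exp(shifted)
    return e / e.sum(axis=axis, keepdims=True)

# A batch of 2 samples with 3 "channels" (class scores) each.
x = np.array([[1.0, 2.0, 3.0],
              [0.5, 0.5, 0.5]])

probs = softmax(x, axis=1)

# Each row sums to 1, which is what channel-mode SoftmaxActivation produces.
print(probs.sum(axis=1))  # -> [1. 1.]
```

So as long as the original layer used the default/channel behavior you care about, swapping in softmax with axis=1 should produce the same probabilities.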

Possible workaround #2:
I also thought about using some other layer, such as a Dropout layer, instead (it exports to ONNX successfully), then importing it into TensorFlow and changing it there from Dropout to a softmax activation. (Need to check whether that's possible.)

Solved using softmax and axis=1

Thanks for sharing your solution @AnaRhisT94!

Glad to help, @ThomasDelteil!


If you want to add operator export support, you can also register your own operators in your code base. Just follow the same pattern as in this file:

So you would implement a function and register it with the @mx_op.register("SoftmaxActivation") decorator.

You can probably re-implement the existing softmax export function almost directly, only accounting for argument differences (e.g., extracting channel instead of axis). That is, this function:


Yeah, I didn't want to do that since softmax with axis=1 works.
Thank you!

Hello, I have the same problem. I am trying to export a pretrained MXNet model to ONNX, and it throws:
AttributeError: No conversion function registered for op type SoftmaxActivation yet.

Then I tried two ways to solve it:
1. Using softmax with axis=1
2. Using SoftmaxOutput with multi_output: 1
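One way to apply the softmax-with-axis=1 workaround to an already-trained model is to edit the exported symbol JSON directly, replacing each SoftmaxActivation node before running the ONNX export. A minimal sketch, assuming the standard symbol-JSON layout; the snippet below is an inline stand-in for json.load()-ing your model's -symbol.json file, and the node names are illustrative:

```python
import json

# Hypothetical symbol-JSON fragment; in practice, load your -symbol.json file.
symbol = json.loads("""
{
  "nodes": [
    {"op": "null", "name": "data", "inputs": []},
    {"op": "SoftmaxActivation", "name": "face_rpn_cls_prob_stride32",
     "attrs": {"mode": "channel"}, "inputs": [[0, 0, 0]]}
  ]
}
""")

# Rewrite every SoftmaxActivation node as a plain softmax over axis 1.
for node in symbol["nodes"]:
    if node["op"] == "SoftmaxActivation":
        node["op"] = "softmax"
        node["attrs"] = {"axis": "1"}  # attribute values are strings in symbol JSON

print([n["op"] for n in symbol["nodes"]])  # -> ['null', 'softmax']
```

After rewriting, json.dump() the result back to a file and load it as usual; softmax has a registered ONNX converter, so the export should proceed. Note that SoftmaxOutput is a training-time op that expects both data and label inputs, which is likely why substituting it raises the shape-check errors below.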

However, it raises another error:

File "/data2/yangjunpei/jpyangwork/local/lib/python2.7/site-packages/mxnet/base.py", line 252, in check_call
raise MXNetError(py_str(_LIB.MXGetLastError()))
mxnet.base.MXNetError: Error in operator face_rpn_cls_prob_stride32: [11:49:24] src/nnvm/legacy_op_util.cc:194: Check failed: prop.inputs.size() == iattr->size() (2 vs. 1) op=SoftmaxOutput, inputs.size=2, iattr.size=1, arg.size=2

I do not know how to solve it. Can you give me some suggestions? Thanks.

@AnaRhisT94 softmax with axis=1 raises the error: Error in operator face_rpn_cls_prob_stride32: [17:34:07] src/operator/softmax_output.cc:86: Check failed: in_shape->size() == 2U (1 vs. 2) : Input:[data, label]
What happened, and how can I solve it? Thank you.