SoftmaxActivation isn't supported by ONNX - workaround

#1

Hi,
I have a pretrained model that uses the SoftmaxActivation operation to produce probability estimates of how close the current output is to the desired output.

When exporting this pretrained model to ONNX, the export fails with an error saying the SoftmaxActivation op isn't supported by ONNX (using MXNet 1.2.2; also tried versions 1.2.3, 1.3, 1.4, and 1.5).

My question is, how can I work around this?
Note: I'm using the Symbol API.
Note 2: My goal is to import the model from ONNX into TensorFlow.

Possible workaround #1:
I thought about using softmax with axis=1 to achieve the same result as SoftmaxActivation (it works on an nd.array; I'll test on the .json directly soon). Would that work? I'll try it on a dummy network until I receive an answer and will update here with any new results. A sketch of what I mean is below.
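A minimal sketch of the idea on a dummy network, assuming MXNet 1.x where export_model accepts an in-memory Symbol and params dict. The layer names, shapes, and file name are illustrative only, not from the real model:

```python
import numpy as np
import mxnet as mx
from mxnet.contrib import onnx as onnx_mxnet

# Hypothetical toy network; names and shapes are illustrative only.
data = mx.sym.Variable('data')
fc = mx.sym.FullyConnected(data, num_hidden=2, name='fc')
# Instead of mx.sym.SoftmaxActivation(fc, mode='channel'), which the
# exporter rejects, normalize over the channel dimension with softmax:
prob = mx.sym.softmax(fc, axis=1, name='prob')

# Dummy parameters just to make the export self-contained; a real model
# would use its trained arg/aux params.
mod = mx.mod.Module(prob, data_names=['data'], label_names=None)
mod.bind(data_shapes=[('data', (1, 4))])
mod.init_params()
arg_params, aux_params = mod.get_params()
params = dict(arg_params)
params.update(aux_params)

# Export should now succeed, since softmax has an ONNX mapping.
onnx_mxnet.export_model(prob, params, [(1, 4)], np.float32, 'model.onnx')
```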

Possible workaround #2:
Alternatively, I thought about exporting some other layer in its place, e.g. a Dropout layer (which exports to ONNX successfully), importing the model into TensorFlow, and there swapping the Dropout back to a softmax activation. (I need to check whether that's possible; a sketch of doing the swap at the ONNX level instead is below.)
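If the swap is done on the ONNX file itself rather than inside TensorFlow, it could look something like this sketch. The node name 'placeholder_softmax' is hypothetical, and this assumes the stand-in Dropout node was exported with a single output:

```python
import onnx
from onnx import helper

model = onnx.load('model.onnx')
for node in model.graph.node:
    # 'placeholder_softmax' is a made-up name for the stand-in Dropout
    # layer; match on whatever name your export actually produces.
    if node.op_type == 'Dropout' and node.name == 'placeholder_softmax':
        node.op_type = 'Softmax'
        del node.attribute[:]  # drop Dropout's ratio attribute
        node.attribute.extend([helper.make_attribute('axis', 1)])
onnx.save(model, 'model_softmax.onnx')
```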

(Related topic: Replacing SoftmaxActivation with SoftmaxOutput)
#2

Solved by using softmax with axis=1.
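For a pretrained model, the same fix can be applied directly to the saved symbol file. A minimal sketch; the file names are illustrative, and the "attrs" key assumes the MXNet 1.x JSON layout (very old versions used "attr"):

```python
import json

# Rewrite SoftmaxActivation nodes in a saved symbol graph so they become
# softmax with axis=1, which the ONNX exporter does support.
with open('model-symbol.json') as f:
    graph = json.load(f)

for node in graph['nodes']:
    if node['op'] == 'SoftmaxActivation':
        node['op'] = 'softmax'
        node['attrs'] = {'axis': '1'}  # MXNet stores attr values as strings

with open('model-fixed-symbol.json', 'w') as f:
    json.dump(graph, f, indent=2)
```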

#3

Thanks for sharing your solution @AnaRhisT94 👍

#4

Glad to help 🙂 @ThomasDelteil

#5

If you want to add operator export support, you can also register your own operators in your code base, following the same pattern as in MXNet's existing ONNX op translation file.

So you would implement a function with the @mx_op.register("SoftmaxActivation") decorator to register it.

You can probably re-implement the existing softmax export function almost directly, only accounting for the argument differences (e.g., extracting the channel mode instead of axis).
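A minimal sketch of such a converter, assuming the MXNet 1.x mx2onnx internals (MXNetGraph.register and get_inputs; the module paths below are from ~1.5 and may differ in other versions):

```python
from onnx import helper
from mxnet.contrib.onnx.mx2onnx._export_onnx import MXNetGraph as mx_op
from mxnet.contrib.onnx.mx2onnx._op_translations import get_inputs

@mx_op.register("SoftmaxActivation")
def convert_softmax_activation(node, **kwargs):
    """Map MXNet's SoftmaxActivation to ONNX's Softmax.

    SoftmaxActivation takes a `mode` attribute ('instance' or 'channel')
    rather than the `axis` attribute that softmax uses.
    """
    name, input_nodes, attrs = get_inputs(node, kwargs)
    mode = attrs.get("mode", "instance")
    # 'channel' normalizes over axis 1; 'instance' over the last axis,
    # mirroring how the existing softmax converter passes axis through.
    axis = 1 if mode == "channel" else -1
    softmax_node = helper.make_node(
        "Softmax",
        input_nodes,
        [name],
        axis=axis,
        name=name,
    )
    return [softmax_node]
```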

#6

Yeah, I didn't want to do that if softmax with axis=1 works 🙂
Thank you!