MXNet Forum

Failed to convert symbol for mixed precision inference


#1

I am trying to run float16 inference with a symbolic model that was trained in float32. I had success with Gluon models, but not with symbols.

Following the docs, I insert a float16 cast on the data symbol and recompose the network:

import mxnet as mx

sym, arg_params, aux_params = mx.model.load_checkpoint(param_path, 0)
# replace the float32 input with one cast to float16 and recompose
data = mx.sym.Variable(name="data")
data = mx.sym.Cast(data=data, dtype='float16')
sym = sym(data=data)
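
The loaded parameters also need to be float16 so they match the graph. A sketch of what I do, using NDArray.astype (this step is not in the docs snippet above):

arg_params = {k: v.astype('float16') for k, v in arg_params.items()}
aux_params = {k: v.astype('float16') for k, v in aux_params.items()}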

Then I try to bind the symbol, which results in an error:

self.mod = mx.mod.Module(symbol=sym, context=ctx, label_names=None)
# inference only, so no label shapes are needed
self.mod.bind(for_training=False, data_shapes=[('data', (1, 3, 512, 512))],
              label_shapes=self.mod._label_shapes)

Error in operator relu4_3_anchors: [23:58:11] include/mxnet/operator.h:228: Check failed: in_type->at(i) == mshadow::default_type_flag || in_type->at(i) == -1 Unsupported data type 2

Any idea how to convert the symbol properly?


#2

Do you have to use the symbol API, or could you use a SymbolBlock in Gluon with the saved symbolic model?
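
Something along these lines (an untested sketch; the file names model-symbol.json and model-0000.params are placeholders for your checkpoint):

import mxnet as mx

ctx = mx.gpu(0)
# import the saved symbol and parameters as a Gluon block
net = mx.gluon.SymbolBlock.imports('model-symbol.json', ['data'],
                                   'model-0000.params', ctx=ctx)
# cast parameters and compute to float16
net.cast('float16')
out = net(mx.nd.zeros((1, 3, 512, 512), ctx=ctx, dtype='float16'))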


#3

I guess I could use a Gluon SymbolBlock. My current hypothesis is that one of the contrib operators in this model does not support float16.
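
One way to test that hypothesis might be to run type inference on the symbol directly; it should fail at the first operator that rejects float16 (a sketch, reusing the checkpoint loading from my first post):

import numpy as np
import mxnet as mx

sym, arg_params, aux_params = mx.model.load_checkpoint(param_path, 0)
try:
    # propagate float16 from the data input through the whole graph
    sym.infer_type(data=np.float16)
except mx.base.MXNetError as e:
    print(e)  # should name the operator that cannot handle float16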


#4

Possible. Did you try SymbolBlock? Do you get the same error?


#5

Yeah, unfortunately. I will stick to float32 for the time being, then.