I have the following code snippet, which works fine when I use the imperative mode of Gluon:
```python
from mxnet.gluon import HybridBlock

class MixedActivation(HybridBlock):
    def __init__(self, quantization_methods, *args, **kwargs):
        super(MixedActivation, self).__init__(*args, **kwargs)
        self.quantization_methods = quantization_methods

    def hybrid_forward(self, F, x, which_quantize_f=0):
        # apply one of the two quantization methods,
        # depending on the value of which_quantize_f
        x = self.quantization_methods[int(which_quantize_f.asnumpy())](F, x)
        return x
```
However, I cannot hybridize the network, since symbols have no `.asnumpy()` method. Is there a way to achieve the same behaviour in both symbolic and imperative mode? Basically, I want to apply one activation function or the other depending on the output of the previous layer. I fear this is fundamentally impossible in symbolic/hybrid mode, since the computational graph would then not be fixed, but I am not sure.
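To make the intent concrete, here is a plain-Python sketch of the dispatch I am after (no MXNet involved; `binarize` and `identity` are just placeholder quantization functions, and in the real code they would operate on NDArrays/symbols): the function applied to `x` is chosen by a value that is only known at runtime.

```python
# Placeholder "quantization" functions standing in for the real methods.
def binarize(x):
    # sign-style quantization: map to +1.0 / -1.0
    return 1.0 if x >= 0 else -1.0

def identity(x):
    # pass-through activation
    return x

quantization_methods = [binarize, identity]

def mixed_activation(x, which_quantize_f):
    # which_quantize_f comes from the previous layer at runtime,
    # so the choice of function is data-dependent.
    return quantization_methods[int(which_quantize_f)](x)

print(mixed_activation(-0.5, 0))  # binarize  -> -1.0
print(mixed_activation(-0.5, 1))  # identity  -> -0.5
```

This data-dependent choice is exactly what breaks when the graph has to be fixed ahead of time.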
Thanks for any replies!