If I want some parameters in a network to remain fixed during training, I can set grad_req='null'. But if I serialize and then deserialize the model, those parameters come back with grad_req='write'.
It is clearly possible to encode grad_req='null' in the serialized model somehow, since the batch norm running mean and variance parameters retain grad_req='null' after (de)serialization. How can I do this with other layers?
```
>>> import mxnet as mx
>>> mx.__version__
'1.3.0'
>>>
>>> data = mx.sym.var('data')
>>> b = mx.sym.var('b')
>>> y = data * b
>>>
>>> net = mx.gluon.nn.SymbolBlock(y, [mx.sym.var('data')])
>>> net.collect_params()['b'].grad_req = 'null'
>>> net.hybridize()
>>> net.initialize()
>>> _ = net(mx.nd.array([1.0]))
>>>
>>> net.export('/tmp/grad_req')
>>> symbol_file = '/tmp/grad_req-symbol.json'
>>> params_file = '/tmp/grad_req-0000.params'
>>> deser_net = mx.gluon.nn.SymbolBlock.imports(symbol_file, ['data'], params_file)
>>>
>>> deser_net.collect_params()['b'].grad_req
'write'
```