How to freeze BatchNorm layers in a SymbolBlock

Dear all:
I am encountering the following issue: I want to freeze all the BatchNorm layers in a gluon.SymbolBlock loaded via ‘symbol.json’ and ‘.params’ files. That is, during training I want the layers to use the global (running) mean and variance instead of the per-batch statistics, and I do not want the global mean and variance to be updated. How can I accomplish this?
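For reference, the network is loaded roughly like this (the file and input names below are placeholders):

import mxnet as mx

# placeholder file names for the exported graph and weights
net = mx.gluon.SymbolBlock.imports('model-symbol.json', ['data'], 'model-0000.params')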

Any advice would be appreciated, thank you!

I’m not familiar with SymbolBlock, but if you want to use the global mean and var, you can set use_global_stats to True.
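For example, when you construct the layer yourself (a minimal sketch):

from mxnet.gluon import nn

# use_global_stats=True makes the layer use (and not update) the running mean/var
bn = nn.BatchNorm(use_global_stats=True)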

Thank you, I know what you mean, but in a SymbolBlock all of the computation has been baked into a static graph, so I cannot simply set use_global_stats on the layers.
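You can see the flag is baked into the exported graph itself, e.g. (a sketch assuming a hypothetical ‘model-symbol.json’; depending on the MXNet version the node attribute dict may be named ‘attrs’ or ‘attr’):

import json
import mxnet as mx

# the BatchNorm settings live as node attributes in the serialized graph JSON
sym = mx.sym.load('model-symbol.json')
graph = json.loads(sym.tojson())
for node in graph['nodes']:
    if node['op'] == 'BatchNorm':
        print(node['name'], node.get('attrs', {}))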
Anyway, thank you for replying!

https://discuss.gluon.ai/t/topic/5610/5

The answer from @szha:

In [1]: import mxnet as mx

In [2]: net = mx.gluon.model_zoo.vision.mobilenet0_25(pretrained=True)

In [3]: net.features
Out[3]:
HybridSequential(
  (0): Conv2D(3 -> 8, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1), bias=False)
  (1): BatchNorm(fix_gamma=False, use_global_stats=False, eps=1e-05, momentum=0.9, axis=1, in_channels=8)
  (2): Activation(relu)
...
  (63): Conv2D(1 -> 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), groups=128, bias=False)
  (64): BatchNorm(fix_gamma=False, use_global_stats=False, eps=1e-05, momentum=0.9, axis=1, in_channels=128)
  (65): Activation(relu)
...
  (81): GlobalAvgPool2D(size=(1, 1), stride=(1, 1), padding=(0, 0), ceil_mode=True)
  (82): Flatten
)

In [4]: net.features[64]
Out[4]: BatchNorm(fix_gamma=False, use_global_stats=False, eps=1e-05, momentum=0.9, axis=1, in_channels=128)

In [5]: net.features[64]._kwargs['use_global_stats'] = True

In [6]: net.features[64]
Out[6]: BatchNorm(fix_gamma=False, use_global_stats=True, eps=1e-05, momentum=0.9, axis=1, in_channels=128)

So, edit the layer before you hybridize the net.
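To extend this to every BatchNorm layer instead of indexing them one by one, here is a minimal sketch using Block.apply (note that _kwargs is an internal attribute, so this relies on an implementation detail that may change across MXNet versions):

import mxnet as mx
from mxnet.gluon import nn

net = mx.gluon.model_zoo.vision.mobilenet0_25(pretrained=True)

def freeze_bn(block):
    # flip the flag on every BatchNorm layer in the block tree
    if isinstance(block, nn.BatchNorm):
        block._kwargs['use_global_stats'] = True

net.apply(freeze_bn)   # Block.apply visits self and all children recursively
net.hybridize()        # edit the layers before hybridizing, as noted above

If you also want the learnable gamma/beta of these layers frozen, you could additionally set grad_req to 'null' on the matching parameters, e.g. net.collect_params('.*batchnorm.*').setattr('grad_req', 'null'), though the question above only concerns the running statistics.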