I used a pretrained model and fine-tuned only the last several layers, keeping the earlier layers' parameters unchanged. However, after fine-tuning I found that the earlier layers' BatchNorm moving mean and moving variance had all changed. How can I fix them?
It sounds like you want to freeze previous layers and use them as a feature extractor, in which case, have you taken a look at this tutorial that describes how to do so with the Symbol API?
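In the meantime, here is the likely cause: BatchNorm's moving mean and moving variance are not updated by the optimizer at all — they are updated by the forward pass itself whenever the layer runs in training mode, so freezing the trainable parameters (gamma/beta) does not stop them from drifting. The toy NumPy sketch below illustrates this; the `use_global_stats` flag here mirrors the parameter of the same name on MXNet's BatchNorm (both `mx.sym.BatchNorm` and `gluon.nn.BatchNorm` accept it — check the docs for your version), which makes the layer always use the stored moving statistics and never update them. This is a simplified illustration, not MXNet's actual implementation.

```python
import numpy as np

class ToyBatchNorm:
    """Simplified 1-D BatchNorm that tracks moving statistics.

    Shows why moving_mean/moving_var drift during fine-tuning even
    when the trainable parameters are frozen: the forward pass in
    training mode updates them, with no gradients involved.
    """

    def __init__(self, momentum=0.9, use_global_stats=False):
        self.momentum = momentum
        self.use_global_stats = use_global_stats
        self.moving_mean = 0.0
        self.moving_var = 1.0

    def forward(self, x, training=True):
        if training and not self.use_global_stats:
            batch_mean = x.mean()
            batch_var = x.var()
            # The forward pass itself updates the moving statistics.
            self.moving_mean = (self.momentum * self.moving_mean
                                + (1 - self.momentum) * batch_mean)
            self.moving_var = (self.momentum * self.moving_var
                               + (1 - self.momentum) * batch_var)
            mean, var = batch_mean, batch_var
        else:
            # With use_global_stats, the stored statistics are used
            # and never touched, even in training mode.
            mean, var = self.moving_mean, self.moving_var
        return (x - mean) / np.sqrt(var + 1e-5)

x = np.array([1.0, 2.0, 3.0])

bn = ToyBatchNorm()
bn.forward(x, training=True)
print(bn.moving_mean)       # has drifted away from 0.0

frozen = ToyBatchNorm(use_global_stats=True)
frozen.forward(x, training=True)
print(frozen.moving_mean)   # still exactly 0.0
```

So on top of freezing the weights, you would also set `use_global_stats=True` on the BatchNorm layers you want to keep fixed (or run those layers in inference mode) so their statistics stay untouched during fine-tuning.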
If you can include some code, we can try to debug what’s going on in your case.
Also take a look at these tutorials, which describe fine-tuning without freezing the previous layers: