Finetune problem with gluon on custom net


#1

Hi!
I'm trying to change the number of outputs of my pretrained model from 10 to 20. First I load the parameters:

net.load_params(args.model_name)
The parameters of the conv20_cls layer look like:
conv20_cls -> (params of conv20_cls: conv_57_bias and conv_57_weights)

After that I replace the output layer:

with net.name_scope():
    conv20_cls = gluon.nn.HybridSequential(prefix='output_')
    conv20_cls.add(gluon.nn.Conv2D(20, kernel_size=1, padding=0, strides=1, use_bias=True))

    net.conv20_cls = conv20_cls
    net.hybridize()
    net.conv20_cls.initialize(mx.init.Xavier(magnitude=2.24), ctx=ctx)

But after the training (fine-tuning) of this model I can't load the parameters back into my model, because net.collect_params() reports new parameter names for the conv20_cls layer.

The parameters of conv20_cls now look like:
conv20_cls -> (params of conv20_cls: conv_67_bias and conv_67_weights)

How can I solve this problem?


#2

Hi, can you post the code you use to save and load the finetuned model, along with the error you get when loading it? Also, try using net.save_parameters and net.load_parameters when saving and loading the finetuned model and see if that resolves your issue.


#3

You probably need to save with net.save_parameters and load back with net.load_parameters.


#4

I load params in the following way:

net.load_params(args.model_name)

And save like that:

net.save_params(args.model_name)

And when I try to load fine tune params, I get error:

AssertionError: Parameter 'conv67_weight' loaded from file 'finetune_model' is not present in ParameterDict, choices are:…

I suppose the way I reassign the old classifier is not entirely correct:

net.conv20_cls = conv20_cls


#5

Hey, did you try net.save_parameters and net.load_parameters?