How to update the learning rate during training with Symbol programming

With Gluon, I know that trainer.set_learning_rate(trainer.learning_rate*0.9) can be used to update the learning rate after each batch during training. However, if I program with Symbol, how can I do that?
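For reference, a minimal sketch of the Gluon pattern I mean (the network and Trainer setup here are just illustrative):

import mxnet as mx
from mxnet import gluon

net = gluon.nn.Dense(1)
net.initialize()
trainer = gluon.Trainer(net.collect_params(), 'adam', {'learning_rate': 0.01})

# Gluon exposes the learning rate directly on the Trainer,
# so it can be decayed after each batch:
trainer.set_learning_rate(trainer.learning_rate * 0.9)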

For example, using the following code, how do I update the learning rate in each iteration of the batch loop?

mod = mx.mod.Module(symbol=net, ...)
mod.bind(data_shapes=...)
mod.init_params(initializer=mx.init.Xavier(magnitude=2.), force_init=False)
lr_sch = mx.lr_scheduler.FactorScheduler(step=800, factor=0.9)
mod.init_optimizer(optimizer='adam', optimizer_params=(('learning_rate', 0.01), ('lr_scheduler', lr_sch)))

for epoch in range(0, 10):
    for batch_idx, (data, label) in enumerate(train_iter):

        nb = mx.io.DataBatch(data=[data], label=[label], ...)

        mod.forward(nb, is_train=True)

        value = custom_metric(nb.label[0], mod.get_outputs()[0])

        mod.backward()
        mod.update()

In your code:

lr_sch = mx.lr_scheduler.FactorScheduler(step=800, factor=0.9)

multiplies the learning rate by 0.9 every 800 iterations. If you want the learning rate to be updated on every iteration, set step=1 instead.
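A quick standalone sketch of that behavior (the optimizer calls the scheduler with its global update count; printed values are approximate):

import mxnet as mx

# FactorScheduler multiplies the learning rate by `factor`
# every `step` updates; step=1 decays it on every iteration.
lr_sch = mx.lr_scheduler.FactorScheduler(step=1, factor=0.9)
lr_sch.base_lr = 0.01

for num_update in range(1, 5):
    print(lr_sch(num_update))  # ~0.01, 0.009, 0.0081, 0.00729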


Thanks a lot for your reply. Actually, what I need during training is the following update:

if condition:
    lr = lr * 0.9

Here, condition is evaluated at runtime, so the decay cannot be expressed as a fixed schedule in advance.

You could do the following in the training loop:

if condition:
    mod._optimizer.lr = mod._optimizer.lr * 0.9

And leave lr_scheduler at its default (None) instead of passing mx.lr_scheduler.FactorScheduler(step=800, factor=0.9): when a scheduler is set, it overrides the optimizer's lr attribute on every update, so the manual assignment would have no effect.
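Putting it together with the loop from the question, a sketch (mod, train_iter, and custom_metric are the objects from the earlier snippets; condition stands for whatever runtime check you need):

# No lr_scheduler is passed, so the optimizer falls back to its
# own `lr` attribute and the manual update below takes effect.
mod.init_optimizer(optimizer='adam',
                   optimizer_params=(('learning_rate', 0.01),))

for epoch in range(0, 10):
    for batch_idx, (data, label) in enumerate(train_iter):
        nb = mx.io.DataBatch(data=[data], label=[label])
        mod.forward(nb, is_train=True)
        mod.backward()
        mod.update()

        if condition:
            # `_optimizer` is a private Module attribute, so this
            # may change between MXNet versions.
            mod._optimizer.lr = mod._optimizer.lr * 0.9

Note that nothing resets the rate afterwards, so the decay compounds each time the condition fires.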


Thanks a lot for your suggestions.