Relation between Trainer.step and backward


#1

Hi,
Today I read an example of a GAN which has 2 Trainers, TrainerG and TrainerD.
If a program has 2 or more backward and trainer.step calls, how does each trainer know which loss it corresponds to?
e.g.

for x, y in data:
    with autograd.record():
        loss1 = loss_fn1(x, y)
    autograd.backward(loss1)
    trainer1.step(batch_size)

for y, z in data:
    with autograd.record():
        loss2 = loss_fn2(y, z)
    autograd.backward(loss2)
    trainer2.step(batch_size)

Do the two trainers get confused about which gradients they should apply?
Why do they work properly?


#2

When you create a Trainer object, you initialize it with the list of parameters that the trainer is responsible for optimizing. When you call Trainer.step(), it simply iterates over those parameters and uses each parameter's gradient to perform an optimization step. Calling backward writes gradients onto every parameter in the computation graph, but each trainer only ever reads the gradients of the parameters it was given, so the two trainers never interfere with each other.
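
Here is a minimal sketch of that separation for a GAN-style setup. The networks netG and netD, the loss, and the single generator step are simplified placeholders I made up for illustration, not the code from the example in the question:

from mxnet import autograd, gluon, nd

# Toy stand-ins for the generator and discriminator.
netG = gluon.nn.Dense(4)
netD = gluon.nn.Dense(1)
netG.initialize()
netD.initialize()

# Each Trainer is given only the parameters it is responsible for.
trainerG = gluon.Trainer(netG.collect_params(), 'adam', {'learning_rate': 1e-3})
trainerD = gluon.Trainer(netD.collect_params(), 'adam', {'learning_rate': 1e-3})

loss_fn = gluon.loss.SigmoidBinaryCrossEntropyLoss()
batch_size = 8
noise = nd.random.normal(shape=(batch_size, 2))
real_label = nd.ones((batch_size,))

# Generator update: backward fills in gradients for every parameter in
# the recorded graph (both netG's and netD's), but trainerG.step() only
# reads and updates the parameters it was constructed with, i.e. netG's.
with autograd.record():
    lossG = loss_fn(netD(netG(noise)), real_label)
lossG.backward()
trainerG.step(batch_size)

A real training loop would run a matching discriminator step with trainerD, which likewise touches only netD's parameters.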