I wonder how to determine the learning rate and beta1 parameters of Adam when training a GAN model?
Thanks in advance!
The default beta1 in MXNet (0.9) is a pretty good one. Setting the learning rate to 0.001 is also a pretty common thing to do.
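To make concrete what these two knobs actually control, here's a minimal NumPy sketch of the standard Adam update rule (not MXNet code, just the textbook algorithm): beta1 is the decay rate of the exponential moving average of the gradient (momentum), and the learning rate scales the resulting step.

```python
import numpy as np

def adam_step(x, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update. beta1 decays the gradient moving average (momentum);
    beta2 decays the squared-gradient average; lr scales the final step."""
    m = beta1 * m + (1 - beta1) * grad          # first-moment estimate
    v = beta2 * v + (1 - beta2) * grad ** 2     # second-moment estimate
    m_hat = m / (1 - beta1 ** t)                # bias correction
    v_hat = v / (1 - beta2 ** t)
    x = x - lr * m_hat / (np.sqrt(v_hat) + eps)
    return x, m, v

# Toy run: minimize f(x) = x^2 with the defaults above.
x, m, v = 5.0, 0.0, 0.0
for t in range(1, 201):
    grad = 2 * x
    x, m, v = adam_step(x, grad, m, v, t)
```

Because Adam normalizes by the gradient magnitude, each step is roughly of size lr, which is why 0.001 is a sane general-purpose default.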
Ideally though, you’d do a hyperparameter sweep over a range of configurations to figure out which settings perform best. You can do that pretty easily using the automatic model tuning feature of Amazon SageMaker, which works with MXNet models.
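The sweep idea can be sketched in plain Python without any SageMaker dependency. Here `train_gan` is a hypothetical stand-in for your actual training run, which would return some validation score; the loop just picks the best (learning_rate, beta1) combination from a small grid.

```python
from itertools import product

def train_gan(lr, beta1):
    """Hypothetical stand-in for a real training run.
    A real version would train the GAN and return a validation score;
    this placeholder just peaks at the defaults mentioned above."""
    return -(lr - 0.001) ** 2 - (beta1 - 0.9) ** 2

grid = {
    "learning_rate": [1e-4, 5e-4, 1e-3],
    "beta1": [0.5, 0.9],
}

# Evaluate every configuration and keep the best-scoring one.
best_cfg = max(
    product(grid["learning_rate"], grid["beta1"]),
    key=lambda cfg: train_gan(*cfg),
)
```

SageMaker's automatic model tuning does essentially this, but with smarter (Bayesian) search over the ranges and parallel training jobs.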
Here are some more details about model tuning on SageMaker:
https://docs.aws.amazon.com/sagemaker/latest/dg/automatic-model-tuning.html
And here’s an example on the MNIST dataset:
OK, thanks for your advice, I’ll try this tool.