During hyper-parameter tuning, it is often useful to ensure that the model produces the same result for the same set of hyper-parameters. I usually set the random seed for this:
This works fine in most cases, but when I apply dropout, the model starts producing different results:
```python
class LRModel(gluon.Block):
    def __init__(self, **kwargs):
        super().__init__(**kwargs)
        self.dropout = gluon.nn.Dropout(0.2)
        self.out = gluon.nn.Dense(10)

    def forward(self, x):
        x = self.dropout(x)
        x = self.out(x)
        return x
```
Is there a way of applying seeded randomness to dropout? I have a similar issue with 2D convolution as well. Thanks!
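To make the goal concrete, here is a minimal NumPy sketch (a stand-in, not MXNet's actual dropout implementation) of the behavior I am after: with a fixed seed, the dropout mask is identical on every run.

```python
import numpy as np

def seeded_dropout(x, p, seed):
    """Inverted dropout with an explicit, reproducible seed (illustrative only)."""
    rng = np.random.RandomState(seed)
    # Keep each unit with probability 1 - p, scale survivors by 1 / (1 - p).
    mask = (rng.uniform(size=x.shape) >= p) / (1.0 - p)
    return x * mask

x = np.ones((2, 4))
out1 = seeded_dropout(x, 0.2, seed=7)
out2 = seeded_dropout(x, 0.2, seed=7)
assert np.array_equal(out1, out2)  # same seed -> same mask, same output
```

This is what I would like `gluon.nn.Dropout` to do when the global seed is fixed.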