MXNet Symbol - Change a parameter during testing


#1

Is there any example of changing a parameter during testing (and not training)? I'm specifically thinking of AlphaDropout - the noise should be added during training but not during testing… Just wanted to see if there was an example.


#2

@piiswrong Any ideas here or examples I can look at?


#3

@eric-haibin-lin We’re still not sure how to do this


#4

Could you be more specific regarding what parameter you want to change? Did you add a custom operator for the activation layer?


#5

No, I did it in Gluon:

if mx.autograd.is_training():
    noise = F.random.uniform()  # sample noise only in training mode
else:
    noise = 1                   # no-op at test time
output = output * noise

I’d like to do this only during training, not during testing… How can I do that with MXNet Symbol? In Gluon I just use mx.autograd.is_training() to check whether I should forward through an operation.
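For illustration, the gating logic can be sketched in plain NumPy (the `apply_noise` helper is hypothetical; in Gluon the `training` flag would come from `mx.autograd.is_training()` rather than being passed in explicitly):

```python
import numpy as np

def apply_noise(output, training, rng=None):
    """Multiply by uniform noise during training; pass through at test time."""
    if training:
        rng = rng or np.random.default_rng()
        noise = rng.uniform(size=output.shape)  # analogous to F.random.uniform()
        return output * noise
    return output  # test time: noise is effectively 1

x = np.ones((2, 3))
train_out = apply_noise(x, training=True)   # scaled by random noise in [0, 1)
test_out = apply_noise(x, training=False)   # unchanged
```

The key point is that the branch is resolved imperatively at call time, which is exactly what a static Symbol graph cannot do directly.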

@eric-haibin-lin is that clearer?


#6

I couldn’t really find this in the documentation either, @eric-haibin-lin


#7

It’s a bit tricky to express that using Symbol. Is there any reason why Gluon is not used?


#8

Legacy codebase :frowning: - is there anything I can look at to mimic the behavior?


#9

You can create a new module and bind it with the shared_module argument.
For example, let’s say the parameters/variables in net2 are all present in net1:

def net1(...):
    weight = mx.sym.var('w')
    score = mx.sym.FullyConnected(data, weight)
    return mx.sym.SoftmaxOutput(score, label)

def net2(...):
    weight = mx.sym.var('w')
    score = mx.sym.FullyConnected(data, weight)
    noise = mx.sym.random_uniform(...)
    new_score = noise + score
    return mx.sym.SoftmaxOutput(new_score, label)

Then net2 can share the parameters of net1 with the following code:

module1 = mx.module.Module(net1(...), ...)
module2 = mx.module.Module(net2(...), ...)
module2.bind(..., shared_module=module1)

#10

How would that turn off the noise at test time?


#11

Would this work for you:

train_test_gate = mx.sym.Variable('train_test_gate') # an array with (2,) shape
output = output * mx.sym.random.uniform() * train_test_gate[0] + output * train_test_gate[1]

Then you’d want to include train_test_gate in the inputs of your DataIter and set it to mx.nd.array([1, 0]) for the training dataset and mx.nd.array([0, 1]) for the test dataset.
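The arithmetic of this two-element gate can be checked with plain NumPy (a sketch of the logic only, not MXNet code):

```python
import numpy as np

rng = np.random.default_rng(0)
output = np.ones((2, 3))
noise = rng.uniform(size=output.shape)

train_gate = np.array([1.0, 0.0])  # training: keep the noisy term
test_gate = np.array([0.0, 1.0])   # testing: keep the clean term

def gated(output, noise, gate):
    # output * noise survives when gate[0] == 1; plain output when gate[1] == 1
    return output * noise * gate[0] + output * gate[1]

train_out = gated(output, noise, train_gate)  # equals output * noise
test_out = gated(output, noise, test_gate)    # equals output
```

Since the gate is fed in as data, the same compiled graph behaves differently at train and test time without any control flow in the graph itself.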


#12

You can also do this:

is_training = mx.sym.Variable('is_training')
noise = mx.sym.random.uniform()
output = output * mx.sym.where(is_training, noise, mx.sym.ones_like(noise))

and set is_training to 1 (or 0) in your training (or testing) dataset.
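The `where`-based gating reduces to the following, sketched here with NumPy’s equivalent `np.where` (not MXNet code; the scalar condition broadcasts over the noise tensor):

```python
import numpy as np

rng = np.random.default_rng(0)
output = np.ones((2, 3))
noise = rng.uniform(size=output.shape)

def gate_noise(output, noise, is_training):
    # Select the noise tensor when training, a tensor of ones otherwise.
    factor = np.where(is_training, noise, np.ones_like(noise))
    return output * factor

train_out = gate_noise(output, noise, is_training=True)   # output * noise
test_out = gate_noise(output, noise, is_training=False)   # output unchanged
```

Multiplying by ones at test time makes the operation a no-op without removing it from the graph.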

I should mention that the topic of control-flow blocks (e.g. if-conditions, for-loops, while-loops) in MXNet has been discussed in multiple threads before (like here and here), and the current recommendation is to use the imperative API if possible.