Autograd error when creating a custom block

I am trying out a custom block in Gluon and wrote some simple dummy code to test it. However, I run into the following error:

Cannot differentiate node because it is not in a computational graph. You need to set is_recording to true or use autograd.record() to save computational graphs for backward. If you want to differentiate the same graph twice, you need to pass retain_graph=True to backward.

I am calling autograd.record(), but I think I am missing something very basic here. Could someone point out what’s going wrong?

My dummy test code is below:

import mxnet as mx
from mxnet import nd, gluon, autograd
from mxnet.gluon import nn

ctx = mx.cpu()  # context used for parameter initialization


def myloss(x, t):
    return nd.norm(x - t)


class CustomBlock(nn.Block):

    def __init__(self, in_dim, **kwargs):
        super(CustomBlock, self).__init__(**kwargs)
        with self.name_scope():
            self.wh_weight = self.params.get(
                'wh_weight', shape=(in_dim, in_dim))

    def forward(self, xt):
        with xt.context as ctx:
            result = nd.dot(xt, self.wh_weight.data())
        return result


umodel = CustomBlock(2)
umodel.collect_params().initialize(ctx=ctx)
umodel_trainer = gluon.Trainer(umodel.collect_params(), 'sgd',
                               {'learning_rate': 0.001, 'momentum': 0, 'wd': 0})

with autograd.record():
    data = umodel(nd.array([1, 2]))
    target = nd.array([-1, 1])
    L = myloss(data, target)
L.backward()

It’s because nd.norm currently doesn’t have a gradient implemented.
We’ll fix it. For now you can use:

def myloss(x, t):
    return nd.sqrt(nd.sum((x-t)**2))
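
To sanity-check the replacement on its own, here is a minimal standalone sketch (the example values and the attach_grad setup are just for illustration, not part of your model): nd.sqrt, nd.sum and the squaring all have gradients implemented, so backward goes through.

from mxnet import nd, autograd


def myloss(x, t):
    # L2 norm written with ops whose gradients are implemented
    return nd.sqrt(nd.sum((x - t) ** 2))


x = nd.array([1.0, 2.0])
t = nd.array([-1.0, 1.0])
x.attach_grad()          # track gradients with respect to x

with autograd.record():
    L = myloss(x, t)
L.backward()             # d/dx ||x - t|| = (x - t) / ||x - t||
print(x.grad)            # roughly [0.894, 0.447]

The same myloss drops straight into your training loop in place of the nd.norm version; nothing else in the script needs to change.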