Conv1D layer with custom weights and controllable gradients

Hi,
I’m a bit new to Gluon and MXNet, so I apologize if my question is too trivial. I’m trying to implement a Conv1D operation on a speech waveform.

I have the following:

from mxnet.gluon import nn

N = 512   # number of filters, also used as the kernel size
hp = 16   # hop size (stride) between frames
re = nn.Conv1D(in_channels=1, channels=N, kernel_size=N, padding=N//2, strides=hp, groups=1, use_bias=False)

Now I have a matrix br of size (N, 1, N) and I wish to initialize the weights of re with this br matrix. I’d also like control over whether this layer is trainable or fixed, essentially toggling gradient computation for its parameters on and off. I have been struggling to find how to do this in the documentation. Could anyone help?

Thanks.

Hi @svj1991,

You’ll find set_data useful for setting the kernel weights of the convolution, and grad_req = 'null' useful for keeping the parameter fixed. I’ve written up an example below showing how to set the kernel parameters and then fix them, while the bias of the convolution is randomly initialized and does update as part of training.

import mxnet as mx

N = 4
conv = mx.gluon.nn.Conv1D(in_channels=1, channels=N, kernel_size=N)
# 'null' means no gradient is computed for this parameter, so it stays fixed.
conv.weight.grad_req = 'null'

conv.initialize()
# Set the kernel to custom values; the shape must match (channels, in_channels, kernel_size).
custom_weights = mx.nd.random.uniform(shape=(N, 1, N))
conv.weight.set_data(custom_weights)

print('\n\n ### BEFORE')
print(conv.weight.data())
print(conv.bias.data())

trainer = mx.gluon.Trainer(conv.collect_params(), optimizer='sgd')

# One dummy training step: only the bias should change afterwards.
samples = mx.nd.random.uniform(shape=(10, 1, 15))
with mx.autograd.record():
    output = conv(samples)
output.backward()
trainer.step(samples.shape[0])

print('\n\n ### AFTER')
print(conv.weight.data())
print(conv.bias.data())
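
And in case it’s useful: to make the kernel trainable again later, you can switch the parameter’s grad_req back to 'write' (the default), after which gradients are computed for it again on the next backward pass. A minimal sketch:

conv.weight.grad_req = 'write'  # gradients are computed for the kernel again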

Thank you very much, thomelane. This was exactly what I was looking for.

Just a quick secondary question in case you might know:
is there an implementation of arctan2 in MXNet? It seems like there isn’t one.

There’s no arctan2 in MXNet, but you can implement it yourself using mx.nd.arctan.
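
For example, here’s a minimal sketch of an arctan2 helper built from mx.nd.arctan and mx.nd.where (the function name and the epsilon guard are my own choices, not an official op). It computes arctan(y/x) and then applies the usual quadrant correction for x < 0, matching numpy.arctan2 conventions:

import math
import mxnet as mx

def arctan2(y, x):
    # Guard against division by zero when x == 0: substitute a tiny positive
    # epsilon so that arctan(y / x) returns approximately +/- pi/2 there.
    safe_x = mx.nd.where(x == 0, 1e-12 * mx.nd.ones_like(x), x)
    base = mx.nd.arctan(y / safe_x)
    # In the left half-plane (x < 0), arctan(y/x) is off by pi:
    # add pi when y >= 0 and subtract pi when y < 0.
    correction = mx.nd.where(
        x < 0,
        mx.nd.where(y < 0, -math.pi * mx.nd.ones_like(x), math.pi * mx.nd.ones_like(x)),
        mx.nd.zeros_like(x),
    )
    return base + correction

y = mx.nd.array([1.0, 1.0, -1.0, -1.0])
x = mx.nd.array([1.0, -1.0, 1.0, -1.0])
print(arctan2(y, x))  # approximately [pi/4, 3*pi/4, -pi/4, -3*pi/4]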