How to perform multiplication between the output of a fully connected layer and an NDArray?

#1

I am using the resnet implementation in https://github.com/khetan2/MBEM/blob/master/resnet.py
along with the data and image rec iterators already predefined in the above implementation, I have a custom ndArray iterator which returns a batch_size*10 array each time. Now during each batch when the final layer output is obtained (as a symbol) I want to multiply the ndArray obtained from my custom iterator and the output of the last layer so that I can create a custom loss function further. But since one is an array and the other is a symbol the error of either using both as symbols or arrays persists. Can someone help me out on how to go about doing this operation? …or how to access the data from my array iterator as a symbol.?

My iterator, which returns the array:
var1 = mx.io.NDArrayIter(labels_var, np.zeros(n), batch_size=500, shuffle=False)
var2 = mx.io.NDArrayIter(np.zeros(n1), np.zeros(n), batch_size=batch_size, shuffle=False)
global var_iter
var_iter = MultiIter([var1, var2])

and the operation that I want to perform:
for batch in var_iter:
    variancet = batch.data[0]
    ar = mx.sym.sum(mx.sym.broadcast_mul(softmax0, variancet), 1)
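For reference, the intended reduction itself (elementwise multiply, then sum over the class axis) can be checked in plain NumPy; the shapes below are assumptions matching the batch_size*10 description, not taken from the linked code:

```python
import numpy as np

# Hypothetical shapes matching the post: batch_size x 10 class scores
batch_size, n_classes = 4, 10
softmax_out = np.random.rand(batch_size, n_classes)
softmax_out /= softmax_out.sum(axis=1, keepdims=True)  # each row sums to 1
variance = np.ones((batch_size, n_classes))            # stand-in for batch.data[0]

# Elementwise multiply, then sum over the class axis (axis=1),
# mirroring mx.sym.sum(mx.sym.broadcast_mul(softmax0, variancet), 1)
ar = np.sum(softmax_out * variance, axis=1)            # shape: (batch_size,)
```

With a variance array of all ones, each entry of `ar` is just the row sum of the softmax, i.e. 1.0.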

How do I pass a vector (NDArray) via an iterator without including it as input data for the neural network?
#2

Hi @asish98,

First, I would recommend not using the Symbol API but Gluon instead, which is a lot easier to work with and comes with a model zoo.

For example, you can get a pretrained ResNet like this:

net = mx.gluon.model_zoo.vision.resnet50_v2(pretrained=True)

To get back to your question: once you have a symbolic graph, you need to bind it to some data shapes and then send data through it. You cannot multiply a tensor (NDArray) with a symbolic graph (Symbol).

Look at https://github.com/khetan2/MBEM/blob/master/resnet.py#L120 to see how to create a module from a symbol. Then you can call .forward(data=...) on it and get the output, which will be an NDArray that you can multiply with something else.

#3

Actually, this is not how symbolic code is used. We define a symbol using all the operations we want, bind it to the data (NDArray) using .bind or .simple_bind, and then call .forward.
Here’s a tutorial:

#4
var_data = mx.sym.Variable(name= 'var_data')
var_data = var_iter.provide_data
c = mx.sym.sum(mx.sym.broadcast_mul(softmax0,var_data),1)
ex = c.bind(ctx=mx.cpu(), args={'var_data' : var_data,' softmax0' : softmax0})
ex.forward()

did this and yet getting the error
Argument rhs must be Symbol instances, but got [DataDesc[0,(500, 10L),<type 'numpy.float32'>,NCHW]]
