How to get the product of a Jacobian and a vector

I am trying to use autograd to calculate the product of a Jacobian matrix and a vector, but I could not make it work efficiently. Any suggestions would be highly appreciated.
Assume x is an M-dimensional vector. I have N functions y1, y2, ..., yN such that y1 = y1(x), y2 = y2(x), ..., yN = yN(x). The derivatives of the y's with respect to x then form an N x M Jacobian matrix; let's call it J.
I am interested in calculating the product of J and an M-dimensional vector v, i.e. Jv. Since autograd lets us compute gradients, I thought this should be possible. What I find, however, is that although the gradients are computed, backward() immediately adds up dy1/dx1, dy2/dx1, ..., dyN/dx1 before I have a chance to form Jv.
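
To make that concrete, here is a minimal sketch of what I mean (assuming, as I understand MXNet's behaviour, that backward() on a non-scalar output implicitly uses an all-ones head gradient): x.grad ends up holding the column sums of J rather than J itself.

from mxnet import nd, autograd

# minimal sketch: backward() on a vector output behaves like an all-ones
# head gradient, so x.grad holds the column sums of the Jacobian
x = nd.arange(3)
W = nd.arange(6).reshape((2, -1)) - 3.5   # Jacobian of the linear map y = Wx

x.attach_grad()
with autograd.record():
    y = nd.dot(x, W.T)
y.backward()                              # same as y.backward(nd.ones_like(y))
print(x.grad)                             # equals W.sum(axis=0), not a row of J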

The only way I can see is to loop over y1, y2, ..., yN, which is quite inefficient (in my opinion); a rough sketch of that loop is below. I am looking for a more efficient approach, so any feedback is welcome.
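
Here is roughly what that loop would look like (just a sketch under my assumptions: one recorded forward pass per output, and a one-hot head gradient to extract each row of J):

from mxnet import nd, autograd

# sketch of the loop: extract J row by row with one-hot head gradients,
# then multiply the assembled Jacobian by v -- N forward/backward passes
x = nd.arange(3)
W = nd.arange(6).reshape((2, -1)) - 3.5
v = nd.arange(3) + 100

x.attach_grad()
rows = []
for i in range(W.shape[0]):
    with autograd.record():
        y = nd.dot(x, W.T)
    e_i = nd.zeros(W.shape[0])
    e_i[i] = 1.0
    y.backward(e_i)              # x.grad now holds e_i^T J, i.e. row i of J
    rows.append(x.grad.copy())

J = nd.stack(*rows)              # N x M Jacobian
print(nd.dot(J, v))              # the desired Jv, but at the cost of N passes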

P.S. The Python code below shows a simple example with M = 3 and N = 2 linear functions. Since the functions are linear, the Jacobian is simply the weight matrix, so the desired output is the product of the weight matrix and v.

from mxnet import nd, autograd

# input dimension M
dim_input = 3
# number of scalar functions
num_function = 2

x = nd.arange(dim_input)
print("input x:", x)

# linear weights for every function
weights = nd.arange(num_function*dim_input).reshape(shape=(num_function, -1)) - 3.5
print("weights:", weights)

v = nd.arange(dim_input) + 100
print("vector v:", v)

x.attach_grad()
with autograd.record():
    y = nd.dot(x, weights.T)
# backward() on the vector y sums the contributions of all outputs into x.grad
y.backward()
grad = x.grad
print("function output: ", y)
print("Gradient -- cannot get Jacobian: ", grad)

print("Target result:", nd.dot(v.T, weights.T))