Reshape a tensor when both the data and the shape are inputs to an operator

Currently, the reshape operator in MXNet requires the shape to be known beforehand and passed as an attribute of the operator. The reshape_like operator, on the other hand, takes two symbols as inputs and reshapes the first input to match the shape of the second.
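For reference, a minimal sketch of the two existing operators as I understand them (the symbol names are just illustrative):

```python
import mxnet as mx

data = mx.sym.Variable('data')

# reshape: the target shape must be known ahead of time and is
# passed as an attribute of the operator.
fixed = mx.sym.reshape(data, shape=(2, 3))

# reshape_like: reshape `data` to match the shape of a second symbol.
template = mx.sym.Variable('template')
like = mx.sym.reshape_like(data, template)
```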

But what if I have two inputs, where the second input is a shape tuple and the first input needs to be reshaped according to that shape tuple? How would I achieve this?

Hi @anirudhacharya,

Given that the symbolic graph allocates memory for the data up front, I don’t see how this is possible with Symbols if you have a dynamically derived shape tuple. Shape inference cannot be implemented in this situation, since the shape cannot be derived from the neighboring nodes. You would have access to the shape on the forward pass if you implemented a mx.operator.CustomOp (see this tutorial), because there you’re dealing with NDArrays, but you won’t be able to implement the infer_shape used for upfront memory allocation.
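For illustration, a rough sketch of what the forward pass of such a CustomOp could look like (the class name and wiring are hypothetical, and the matching CustomOpProp is omitted because, as noted above, its infer_shape cannot report the true output shape):

```python
import mxnet as mx

class DynamicReshape(mx.operator.CustomOp):
    def forward(self, is_train, req, in_data, out_data, aux):
        data, shape_tensor = in_data
        # The shape tuple only becomes usable here, after a copy to the CPU.
        target_shape = tuple(int(s) for s in shape_tensor.asnumpy())
        self.assign(out_data[0], req[0], data.reshape(target_shape))

    def backward(self, req, out_grad, in_data, out_data, in_grad, aux):
        # The gradient of a reshape is just the incoming gradient
        # reshaped back to the original input shape.
        self.assign(in_grad[0], req[0], out_grad[0].reshape(in_data[0].shape))
```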

With Gluon it’s simple to create a Block for this, but the shape tuple will have to be transferred from the device to the CPU so that it can be used for the reshape call, which might impact performance. You also won’t be able to hybridize this Block.
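Something along these lines, as a sketch (the Block name is made up; the asnumpy() call is the device-to-CPU copy mentioned above, and it is also what prevents hybridization):

```python
import mxnet as mx
from mxnet.gluon import Block

class DynamicReshapeBlock(Block):
    def forward(self, data, shape_tensor):
        # Pull the shape tuple back to the CPU so it can drive the reshape.
        target_shape = tuple(int(s) for s in shape_tensor.asnumpy())
        return data.reshape(target_shape)

net = DynamicReshapeBlock()
x = mx.nd.arange(6)
shape = mx.nd.array([2, 3])
y = net(x, shape)   # y.shape == (2, 3)
```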


As a continuation of this conversation:

PyTorch supports this feature, and with MXNet 1.2, when importing ONNX models exported from PyTorch, you can encounter the following error:

Invalid Parameter format for shape expect Shape(tuple) but value='', in operator Reshape(name="", shape="")

It has been fixed with a workaround on the master branch. The real fix will be the implementation of this feature: Dynamic shape - MXNet - Apache Software Foundation.

As a workaround, you might want to consider feeding in not the shape tuple itself, but a tensor of zeros with the corresponding shape. Then you can either use reshape_like or, if you need broadcasting, a broadcast_add with the zeros.
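A minimal sketch of that workaround in the symbolic API (the variable names are illustrative):

```python
import mxnet as mx

data = mx.sym.Variable('data')
# Instead of feeding a shape tuple, feed a tensor of zeros that already
# has the target shape.
zeros = mx.sym.Variable('zeros_with_target_shape')

# Reshape by borrowing the shape of the zeros tensor...
reshaped = mx.sym.reshape_like(data, zeros)

# ...or, if broadcasting is what you actually need, add the zeros:
broadcasted = mx.sym.broadcast_add(data, zeros)
```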