Timedistributed style layer in mx.sym?


#1

I’m looking for the functionality of Keras’ TimeDistributed layer in MXNet’s symbol workflow. For example: run data of shape (batch_size, seq_length, channels, height, width) through a stack of 2D convolutional layers (the same layers, with weights shared across timesteps), then pool the results over seq_length at the end.

Is anyone aware of a way to do this?

Thanks :slight_smile:


#2

MXNet does not have a time-distributed layer like Keras, but you can use a Dense (FullyConnected) layer with flatten=False. The Dense layer then behaves like Keras’ TimeDistributed(Dense) wrapper: it is applied only to the last axis, leaving the leading axes (including seq_length) intact.


#3

Thanks for the response. How does this work in the sense of a wrapper? Could you give a brief example of its usage, say where a symbol is wrapped by this dense layer?

Thanks


#4

You could do something like the following:

import mxnet as mx

data = mx.sym.Variable('data')  # e.g. shape (batch_size, seq_length, features)
fc1 = mx.sym.FullyConnected(data=data, flatten=False, num_hidden=20, name='fc1')

assuming that the placeholder for your input data is directly followed by the Dense layer. In the Dense layer you have to set flatten=False so it operates on the last axis only and the seq_length axis is preserved.