Arbitrary size layer specification


#1

In the __init__ function of a Block/HybridBlock, how do we allow the number of layers to be an arbitrary, user-specified value?

If I do:

self.fcs = [nn.Dense(layer, activation='relu') for layer in self.layer_sizes]

And then in the forward/hybrid_forward:

def forward(self, x):
    for layer in self.fcs:
        x = layer(x)
    return x

#2

A plain Python list of Blocks won't be registered automatically, so their parameters won't be collected or initialized.
You need to call register_child on each of them.


#3

Or you can use a Sequential (or HybridSequential) container, which registers every block added to it automatically.