I am defining a network that has many convolutional layers all taking the same input. gluon.nn.HybridSequential blocks make it very easy to build deep networks; however, I cannot see an elegant way to define a wide network with Gluon. Here is an example: in WideNet I need to define each ConvPool block one by one in __init__, and then apply them one by one in hybrid_forward.
import mxnet as mx
from mxnet import gluon

class ConvPool(gluon.nn.HybridBlock):
    def __init__(self, channels, kernel_size, padding=0):
        super().__init__()
        self.conv = gluon.nn.Conv1D(channels, kernel_size, strides=1, padding=padding)
        self.maxpool = gluon.nn.GlobalMaxPool1D()

    def hybrid_forward(self, F, x, *args, **kwargs):
        c = self.conv(x)
        return self.maxpool(c)
class WideNet(gluon.nn.HybridBlock):
    def __init__(self, filters=[3, 4, 5], num_filter=50):
        super().__init__()
        self.conv1 = ConvPool(num_filter, kernel_size=filters[0])
        self.conv2 = ConvPool(num_filter, kernel_size=filters[1])
        self.conv3 = ConvPool(num_filter, kernel_size=filters[2])

    def hybrid_forward(self, F, x, *args, **kwargs):
        c1 = self.conv1(x)
        c2 = self.conv2(x)
        c3 = self.conv3(x)
        return F.concat(c1, c2, c3, dim=1)
Instead, I would like to define the block so that it can take a filters list of any length, without manually changing the block, for example WideNet(filters=[3, 4, 5, 6, 7, 8, 9]).
So far that is the best I can find, but it's not ideal.