Dear all,
I would like to share the same weights (and only the weights) across different layers. For example, assume I have two layers (say a convolution and a Dense layer) that depend on the same weight variable (assuming again that the dimensions are chosen appropriately):
```python
from mxnet.gluon import HybridBlock


class Layer1(HybridBlock):
    def __init__(self, some_weight, some_params, **kwargs):
        super().__init__(**kwargs)
        with self.name_scope():
            self.weight = some_weight  # this is supposed to be a Parameter

    # the registered Parameter is passed to hybrid_forward by name
    def hybrid_forward(self, F, x, weight):
        out = ...  # do some things to x with weight
        return out


class Layer2(HybridBlock):
    def __init__(self, some_weight, some_other_params, **kwargs):
        super().__init__(**kwargs)
        with self.name_scope():
            self.weight = some_weight  # this is supposed to be a Parameter

    def hybrid_forward(self, F, x, weight):
        out = ...  # do some OTHER things to x with the SAME weight
        return out
```
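To make the intent concrete, this is roughly how I imagine wiring the two layers together. The name and shape below are just placeholders, and I am not sure this is the right way to register or initialize the shared Parameter, which is really the core of my question:

```python
import mxnet as mx

# Placeholder name/shape; in reality the shape has to fit both layers.
shared_weight = mx.gluon.Parameter('shared_weight', shape=(16, 16))
shared_weight.initialize(init=mx.init.Xavier())

# Both layers receive the very same Parameter instance, so an update
# made through one layer should also be visible to the other.
layer1 = Layer1(shared_weight, some_params=None)
layer2 = Layer2(shared_weight, some_other_params=None)
```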
I am aware that one can, for example, create a single convolution layer and reuse it in different parts of the network inside the hybrid_forward function (see the sketch below), but unfortunately my use case does not fall into that category.
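For completeness, this is the kind of reuse I mean (a minimal sketch with an arbitrary convolution):

```python
from mxnet.gluon import nn, HybridBlock


class ReusedConv(HybridBlock):
    def __init__(self, **kwargs):
        super().__init__(**kwargs)
        with self.name_scope():
            # a single convolution instance ...
            self.conv = nn.Conv2D(channels=8, kernel_size=3, padding=1)

    def hybrid_forward(self, F, x):
        # ... applied at two different points, so the exact same
        # operation (and its weights) is reused.
        y = self.conv(x)
        return self.conv(F.relu(y))
```

This shares the entire layer (identical operation and weights), whereas what I need is two different operations (e.g. a convolution and a Dense layer) that merely share the same underlying weight Parameter.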
Thank you for your time.