Custom parameter is dumped during a forward call

I want to switch from Mathematica to Python/MXNet for a neural network project using Munsell color notation. I am very experienced in C++, somewhat experienced in Mathematica, but have next to no experience in Python. My MS thesis was about using neural nets to find and replace missing values in databases, and I also worked on autonomous robot navigation using neural nets, but my career was really in 3D rendering. Now that I am retired, I want to return to neural networks.

I’m following the MXNet tutorials and I’m having difficulty with the “Custom Layers (Beginners)” tutorial. I traced the issue to either the declaration or the usage of the “scales” custom parameter.

 import mxnet as mx
 from mxnet import gluon, nd
 from mxnet.gluon.nn import Dense

 class NormHybridLayer( gluon.HybridBlock ):
    def __init__( self, hidden_units, scales ):
       super( NormHybridLayer, self ).__init__()
       with self.name_scope():
          self.weights = self.params.get( 'weights',
                                          shape = ( hidden_units, 0 ),
                                          allow_deferred_init = True )
          self.scales = self.params.get( 'scales',
                                         shape = scales.shape,
                                         init = mx.init.Constant( scales.asnumpy() ),
                                         differentiable = False )
    def hybrid_forward( self, F, x, weights, scales ):
       normalized_data = F.broadcast_div( F.broadcast_sub( x, F.min( x ) ), ( F.broadcast_sub( F.max( x ), F.min( x ) ) ) )
       weighted_data = F.FullyConnected( normalized_data, weights, num_hidden = self.weights.shape[ 0 ], no_bias = True )
       scaled_data = F.broadcast_mul( scales, weighted_data )
       return scaled_data
 net2 = gluon.nn.HybridSequential()
 with net2.name_scope():
    net2.add( Dense( 5 ) )
    net2.add( NormHybridLayer( hidden_units = 5, scales = nd.array( [ 2 ] ) ) )
    net2.add( Dense( 1 ) ) 
 net2.initialize( mx.init.Xavier( magnitude = 2.24 ) )
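
For context, here is a rough sketch in plain NumPy of what the layer’s hybrid_forward computes — my own paraphrase for readers, not MXNet code: min-max normalization of the batch, a bias-free fully connected transform, then an elementwise scale.

```python
import numpy as np

def norm_layer_sketch(x, weights, scales):
    # Min-max normalize over the whole batch, as the broadcast_sub/broadcast_div calls do
    normalized = (x - x.min()) / (x.max() - x.min())
    # Bias-free fully connected layer: F.FullyConnected computes x @ W.T
    weighted = normalized @ weights.T
    # Elementwise multiply by the fixed (non-differentiable) 'scales' parameter
    return scales * weighted

x = np.array([[1.0, 2.0], [3.0, 4.0]])   # batch of 2 samples, 2 features
w = np.ones((5, 2))                      # (hidden_units, input_dim)
print(norm_layer_sketch(x, w, np.array([2.0])).shape)   # (2, 5)
```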

I followed the tutorial and when
output = net2( input )

is called, I get an error ending in
TypeError: Object of type ndarray is not JSON serializable

I traced the error to the ‘var’ function at line 2651. From what I see there, I concluded that the “init” of one of the custom parameters is wrong somehow and a dump is called, which results in this TypeError.

I figured out that the faulty parameter is “scales” by commenting it out and running the code, which then ran without error.

With my lack of experience in Python and MXNet, I can’t figure out what is wrong, and I would appreciate any help in figuring this out. For now, I’m going to set this tutorial aside and move on to the next one…

I solved my issue and am posting it here for future new users trying to follow the tutorial.

There is an error in the tutorial source example, in the NormalizationHybridLayer class declaration. The line that reads

   init = mx.init.Constant( scales.asnumpy() ),

should actually read

   init = mx.init.Constant( scales.asnumpy().tolist() ),

The fix is given by @thomelane in the "How to create symbol parameters?" MXNet forum thread.
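
Why this works, as far as I can tell: MXNet stores the initializer’s configuration (including the constant value) in the symbol’s attributes as a JSON string, and Python’s json encoder accepts a plain list but rejects a NumPy ndarray. A minimal reproduction outside MXNet, using only numpy and the standard json module:

```python
import json
import numpy as np

scales = np.array([2.0])

# Handing the raw ndarray to the JSON encoder fails with the same TypeError
try:
    json.dumps(scales)
except TypeError as e:
    print(e)

# Converting to a plain Python list first makes the value serializable
print(json.dumps(scales.tolist()))   # [2.0]
```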

It would be nice if the tutorial was fixed.