Calculate number of trainable parameters in network

Hi,

Is there any way to get the exact number of trainable parameters used in any given network? Preferably calculated from the symbol.json file.

Thanks,
D

Hi @dfferstl,

Check out the summary method of Block. And if you’re interested in using a symbol.json file that describes the architecture, load it into a SymbolBlock and call summary on that (see the sketch after the output below).

import mxnet as mx

# Build and initialize a ResNet-18 from the Gluon model zoo
net = mx.gluon.model_zoo.vision.resnet18_v2()
net.initialize()

# summary() needs a sample input to infer shapes: a batch of 10 RGB 100x100 images
net.summary(mx.nd.random.uniform(shape=(10, 3, 100, 100)))

This produces the following output, reporting 11687848 trainable params and 7948 non-trainable params:

--------------------------------------------------------------------------------
        Layer (type)                                Output Shape         Param #
================================================================================
               Input                           (10, 3, 100, 100)               0
         BatchNorm-1                           (10, 3, 100, 100)              12
            Conv2D-2                            (10, 64, 50, 50)            9408
         BatchNorm-3                            (10, 64, 50, 50)             256
        Activation-4                            (10, 64, 50, 50)               0
         MaxPool2D-5                            (10, 64, 25, 25)               0
         BatchNorm-6                            (10, 64, 25, 25)             256
            Conv2D-7                            (10, 64, 25, 25)           36864
         BatchNorm-8                            (10, 64, 25, 25)             256
            Conv2D-9                            (10, 64, 25, 25)           36864
     BasicBlockV2-10                            (10, 64, 25, 25)               0
        BatchNorm-11                            (10, 64, 25, 25)             256
           Conv2D-12                            (10, 64, 25, 25)           36864
        BatchNorm-13                            (10, 64, 25, 25)             256
           Conv2D-14                            (10, 64, 25, 25)           36864
     BasicBlockV2-15                            (10, 64, 25, 25)               0
        BatchNorm-16                            (10, 64, 25, 25)             256
           Conv2D-17                           (10, 128, 13, 13)            8192
           Conv2D-18                           (10, 128, 13, 13)           73728
        BatchNorm-19                           (10, 128, 13, 13)             512
           Conv2D-20                           (10, 128, 13, 13)          147456
     BasicBlockV2-21                           (10, 128, 13, 13)               0
        BatchNorm-22                           (10, 128, 13, 13)             512
           Conv2D-23                           (10, 128, 13, 13)          147456
        BatchNorm-24                           (10, 128, 13, 13)             512
           Conv2D-25                           (10, 128, 13, 13)          147456
     BasicBlockV2-26                           (10, 128, 13, 13)               0
        BatchNorm-27                           (10, 128, 13, 13)             512
           Conv2D-28                             (10, 256, 7, 7)           32768
           Conv2D-29                             (10, 256, 7, 7)          294912
        BatchNorm-30                             (10, 256, 7, 7)            1024
           Conv2D-31                             (10, 256, 7, 7)          589824
     BasicBlockV2-32                             (10, 256, 7, 7)               0
        BatchNorm-33                             (10, 256, 7, 7)            1024
           Conv2D-34                             (10, 256, 7, 7)          589824
        BatchNorm-35                             (10, 256, 7, 7)            1024
           Conv2D-36                             (10, 256, 7, 7)          589824
     BasicBlockV2-37                             (10, 256, 7, 7)               0
        BatchNorm-38                             (10, 256, 7, 7)            1024
           Conv2D-39                             (10, 512, 4, 4)          131072
           Conv2D-40                             (10, 512, 4, 4)         1179648
        BatchNorm-41                             (10, 512, 4, 4)            2048
           Conv2D-42                             (10, 512, 4, 4)         2359296
     BasicBlockV2-43                             (10, 512, 4, 4)               0
        BatchNorm-44                             (10, 512, 4, 4)            2048
           Conv2D-45                             (10, 512, 4, 4)         2359296
        BatchNorm-46                             (10, 512, 4, 4)            2048
           Conv2D-47                             (10, 512, 4, 4)         2359296
     BasicBlockV2-48                             (10, 512, 4, 4)               0
        BatchNorm-49                             (10, 512, 4, 4)            2048
       Activation-50                             (10, 512, 4, 4)               0
  GlobalAvgPool2D-51                             (10, 512, 1, 1)               0
          Flatten-52                                   (10, 512)               0
            Dense-53                                  (10, 1000)          513000
         ResNetV2-54                                  (10, 1000)               0
================================================================================
Parameters in forward computation graph, duplicate included
   Total params: 11695796
   Trainable params: 11687848
   Non-trainable params: 7948
Shared params in forward computation graph: 0
Unique parameters in model: 11695796
--------------------------------------------------------------------------------
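
If you’re starting from an exported symbol.json (and optionally a .params file) rather than a model zoo network, a sketch along these lines should work; the file names net-symbol.json / net-0000.params and the input name data are placeholders for your own export, not something from this thread:

import mxnet as mx

# Load the exported architecture (and weights) into a SymbolBlock, then call
# summary() on it just like any other Block. File names below are placeholders.
net = mx.gluon.nn.SymbolBlock.imports(
    "net-symbol.json",               # architecture description
    ["data"],                        # input name(s) used in the symbol
    param_file="net-0000.params")    # weights, so all parameter shapes are known
# Adjust the input shape to whatever your network expects
net.summary(mx.nd.random.uniform(shape=(10, 3, 100, 100)))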

Great, that works. Thanks for the quick response.
Is there also a way to get this data as a dict? net.summary has no return value.

I can’t see an obvious way, but inside the summary function there’s a variable called summary that’s an OrderedDict (see here). It should be quite easy to change this function to return the summary dictionary, and that would be useful for others too. You should definitely make a pull request if you’re willing and able! Just give me a shout if you need any pointers.
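
In the meantime, a workaround is to compute the counts yourself from collect_params(). This is a rough sketch, not part of summary’s API; classifying parameters as non-trainable when grad_req == 'null' is my assumption of how to mirror summary’s split:

import numpy as np

def param_counts(net):
    # Count unique parameters in the network and split them by grad_req.
    counts = {"total": 0, "trainable": 0, "non_trainable": 0}
    for param in net.collect_params().values():
        n = int(np.prod(param.shape))
        counts["total"] += n
        # Parameters with grad_req == 'null' (e.g. BatchNorm running mean/var)
        # are never updated by the trainer, so treat them as non-trainable.
        if param.grad_req == "null":
            counts["non_trainable"] += n
        else:
            counts["trainable"] += n
    return counts

print(param_counts(net))  # should match the unique totals reported by net.summary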
