Environment: Python 3.6.4 / mxnet-cu91 / GTX 1070
Hi everyone, I'm trying to run the code from the "Deep CNNs (AlexNet)" chapter of the Gluon tutorials (https://gluon.mxnet.io/chapter04_convolutional-neural-networks/deep-cnns-alexnet.html).
It's an AlexNet model trained on the CIFAR-10 dataset. I've tried different batch sizes (64/128/512), but GPU utilization always sits at around 20% and the card runs at only about 30% TDP (GPU memory usage is fairly high, 5886 MB of 8 GB, with batch size 512). It's depressing.
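For context, here is a quick back-of-the-envelope count of batches per epoch for the batch sizes I tried (assuming the standard 50,000-image CIFAR-10 training split), in case the per-batch overhead matters:

```python
import math

# CIFAR-10's standard training split contains 50,000 images.
train_size = 50_000

# Batch sizes I experimented with.
for batch_size in (64, 128, 512):
    batches_per_epoch = math.ceil(train_size / batch_size)
    print(f"batch_size={batch_size}: {batches_per_epoch} batches per epoch")
# batch_size=64: 782 batches per epoch
# batch_size=128: 391 batches per epoch
# batch_size=512: 98 batches per epoch
```

Even at 512 the GPU stays mostly idle, which makes me suspect the bottleneck is per-batch data loading or launch overhead rather than the compute itself.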
Am I doing something wrong? Could anyone help? Thanks.