MXNet digest: June 2018


To keep up to date with what’s new in the world of MXNet, check out these channels:

Twitter: @ApacheMXNet
Medium: apache-mxnet
YouTube: apachemxnet
Reddit: r/mxnet

Here’s a digest of the tutorials, blogs, videos and announcements about MXNet during June 2018.

Announcing Keras-MXNet v2.2
Use MXNet as the backend for Keras. What happens next will shock you. (Spoiler alert: faster training time, better memory management and all the other goodness you expect with MXNet as your engine.)
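Switching backends is a one-line configuration change. A minimal sketch, assuming keras-mxnet is installed (pip install keras-mxnet) and using the standard KERAS_BACKEND environment variable; editing the backend entry in ~/.keras/keras.json works too:

    import os
    os.environ['KERAS_BACKEND'] = 'mxnet'   # must be set before Keras is first imported

    import keras
    print(keras.backend.backend())          # expected to report 'mxnet'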

The importance of hyperparameter tuning for scaling deep learning training to multiple GPUs
If you move from training on a single GPU to training on multiple GPUs, you need to retune hyperparameters such as the batch size and learning rate.
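As a rough illustration of the kind of adjustment involved, here is a sketch of one common heuristic (the linear scaling rule); the article's exact recommendations may differ, and the numbers below are placeholders:

    import mxnet as mx

    num_gpus = 4                      # assumption: four GPUs on the machine
    per_gpu_batch_size = 128          # batch size that worked well on one GPU
    base_lr = 0.1                     # learning rate that worked well on one GPU

    batch_size = per_gpu_batch_size * num_gpus   # global batch size grows with GPU count
    lr = base_lr * num_gpus                      # scale the learning rate along with it

    ctx = [mx.gpu(i) for i in range(num_gpus)]
    # trainer = mx.gluon.Trainer(net.collect_params(), 'sgd',
    #                            {'learning_rate': lr, 'momentum': 0.9})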

Saving and Loading Gluon Models
A step-by-step guide to saving and loading models and parameters.
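The core of it, as a minimal sketch (using the save_parameters/load_parameters names from MXNet 1.3+; the 1.2-era equivalents are save_params/load_params):

    import mxnet as mx
    from mxnet import nd
    from mxnet.gluon import nn

    # Build, initialize and run a small network once so all parameter shapes are known.
    net = nn.HybridSequential()
    with net.name_scope():
        net.add(nn.Dense(64, activation='relu'), nn.Dense(10))
    net.initialize()
    net(nd.ones((1, 100)))

    # Save only the parameters, then load them into a fresh copy of the same architecture.
    net.save_parameters('net.params')
    net2 = nn.HybridSequential()
    with net2.name_scope():
        net2.add(nn.Dense(64, activation='relu'), nn.Dense(10))
    net2.load_parameters('net.params', ctx=mx.cpu())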

Scala API for Deep Learning Inference Now Available with MXNet v1.2
Fixes to Scala memory issues, improvements to the API and an upgrade to the MXNet 1.2 engine.

Computer Vision using Scala/MXNet
A series of tutorials on computer vision using the Scala inference API.

Train using Keras-MXNet and inference using MXNet Scala API
Get the best of both worlds: train in Python, then deploy for inference on the JVM using Scala.
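On the Python side, the key step is exporting the trained Keras model in MXNet's native symbol/params format. A rough sketch, assuming the save_mxnet_model helper shipped with keras-mxnet 2.2; the toy model, layer sizes and file prefix are placeholders:

    import keras
    from keras.models import Sequential
    from keras.layers import Dense

    model = Sequential()
    model.add(Dense(64, activation='relu', input_dim=784))
    model.add(Dense(10, activation='softmax'))
    model.compile(loss='categorical_crossentropy', optimizer='sgd', metrics=['accuracy'])
    # ... model.fit(x_train, y_train, ...) on your data ...

    # keras-mxnet extension: writes my_model-symbol.json and my_model-0000.params,
    # which the MXNet Scala inference API can then load on the JVM side.
    keras.models.save_mxnet_model(model=model, prefix='my_model')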

Page Segmentation with Gluon
If you want to do OCR on handwritten text, first you have to segment the page to find the sections.

Profiling MXNet Models
A how-to guide to understanding exactly what’s going on inside your models.
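For a flavour of it, here is a minimal sketch of MXNet's built-in profiler (the guide goes much deeper; the output filename and workload below are placeholders):

    import mxnet as mx
    from mxnet import nd

    mx.profiler.set_config(profile_all=True, aggregate_stats=True,
                           filename='profile_output.json')
    mx.profiler.set_state('run')

    # The workload you want to inspect.
    a = nd.random.uniform(shape=(1000, 1000))
    b = nd.dot(a, a)
    nd.waitall()                 # wait for asynchronous work to finish before stopping

    mx.profiler.set_state('stop')
    print(mx.profiler.dumps())   # aggregate stats; per-operator traces land in the JSON file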

Inference using ONNX Model Zoo
Grab a pretrained model from the ONNX model zoo (which may have been trained and saved in any framework that supports the ONNX interchange format), load it into MXNet, and run inference.
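A minimal sketch of that flow, assuming a model file already downloaded from the zoo; the filename, the 1x3x224x224 input shape and the random input are placeholder assumptions, and real preprocessing depends on the model you pick:

    import mxnet as mx
    import numpy as np
    from mxnet.contrib import onnx as onnx_mxnet

    # Convert the ONNX graph into an MXNet symbol plus parameters.
    sym, arg_params, aux_params = onnx_mxnet.import_model('squeezenet.onnx')

    # ONNX graphs carry their own input names, so look the name up rather than assuming 'data'.
    data_name = [n for n in sym.list_inputs()
                 if n not in arg_params and n not in aux_params][0]

    mod = mx.mod.Module(symbol=sym, data_names=[data_name], label_names=None,
                        context=mx.cpu())
    mod.bind(for_training=False, data_shapes=[(data_name, (1, 3, 224, 224))])
    mod.set_params(arg_params, aux_params)

    mod.forward(mx.io.DataBatch([mx.nd.array(np.random.rand(1, 3, 224, 224))]))
    print(mod.get_outputs()[0].shape)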

Getting Started with SageMaker
A step-by-step guide to training MXNet models using SageMaker.
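The rough shape of it, using the SageMaker Python SDK of that era; the script name, IAM role, S3 paths, instance types and hyperparameters below are placeholders:

    from sagemaker.mxnet import MXNet

    estimator = MXNet(entry_point='train.py',            # your MXNet training script
                      role='SageMakerExecutionRole',      # placeholder IAM role
                      train_instance_count=1,
                      train_instance_type='ml.p3.2xlarge',
                      framework_version='1.2.1',
                      hyperparameters={'epochs': 10, 'learning-rate': 0.1})

    estimator.fit('s3://my-bucket/training-data')         # placeholder S3 location

    # Optionally stand up a hosted endpoint for inference.
    predictor = estimator.deploy(initial_instance_count=1,
                                 instance_type='ml.m5.xlarge')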

Learning Rates
A three-part series about the state of the art in learning rates, learning rate schedules, and finding optimal learning rates for faster convergence.
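To make the topic concrete, here is a minimal sketch of one simple approach (step decay) using MXNet's built-in schedulers; the series itself covers warm-up, cyclical schedules, LR finders and more, and the numbers below are placeholders:

    import mxnet as mx

    # Halve the learning rate every 1000 updates, starting from 0.1.
    schedule = mx.lr_scheduler.FactorScheduler(step=1000, factor=0.5)
    schedule.base_lr = 0.1

    # Attach the schedule to an optimizer (which a Gluon Trainer or Module can then use).
    optimizer = mx.optimizer.SGD(learning_rate=0.1, momentum=0.9, lr_scheduler=schedule)

    # Inspect what the schedule produces as training progresses.
    for update in (1, 1001, 2001, 5001):
        print(update, schedule(update))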

