Implementing ResNet with MXNet Gluon and Comet.ml for image classification

Cecelia Shao · Dec 6

In this tutorial, we will illustrate how to build an image recognition model using a convolutional neural network (CNN) implemented in MXNet Gluon, and integrate Comet.ml for experiment tracking and monitoring. We will be using the MXNet ResNet model architecture and training that model on the CIFAR-10 dataset for our image classification use case. MXNet is a deep learning framework that offers optimizations for multi-GPU training and accelerates model development with predefined layers and automatic differentiation.

We also define the Comet experiment at the top with the API key you obtained in Step 2 above, along with your project and workspace name.

```python
from comet_ml import Experiment

import argparse
import time
import logging

import numpy as np
import mxnet as mx
from mxnet import gluon, nd
from mxnet import autograd as ag
from mxnet.gluon import nn
from mxnet.gluon.data.vision import transforms

from gluoncv.model_zoo import get_model
from gluoncv.utils import makedirs, TrainingHistory
from gluoncv.data import transforms as gcv_transforms

from sklearn.metrics import confusion_matrix
import itertools
import matplotlib.pyplot as plt
plt.switch_backend('agg')

experiment = Experiment(api_key="<YOUR API KEY>",
                        project_name="mxnet-comet-tutorial",
                        workspace="<YOUR WORKSPACE>")
```

As noted in Step 1, we recommend using at least 1 GPU to train this model. A good learning rate can help cut down on the time it takes to train the model.

```python
lr_decay = opt.lr_decay
lr_decay_epoch = [int(i) for i in opt.lr_decay_epoch.split(',')] + [np.inf]
```

For our model, we're using the cifar_resnet20_v1 model architecture available from the MXNet gluoncv model zoo. Click on this experiment URL to see your model training results.

Monitoring results inside the Comet UI

As an example, we've logged the results in a public Comet project. You can actually observe the training and validation accuracy plots update in real-time as results come in. We also want to make sure we're
actually using our GPU, so we can go to the System Metrics tab to check memory usage and utilization. The script we ran and the output we saw after running it can be found on the Code and Output tabs, respectively. You'll see some noticeable bumps in accuracy at epochs 80 and 160 because we set our learning rate decay to occur at those epochs.

Simply look at the higher values in the confusion matrix to identify where the model can be improved (perhaps by collecting more data around these specific classes). Our first model performed very well, with a high training accuracy of around 0.9941.

Having the two experiments in the same project will allow us to begin conducting meta-analysis on our model iterations with higher-level visualizations and queries. Our second experiment has significantly worse results, with a training accuracy of 0.852 and a validation accuracy of 0.8184.

To summarize the tutorial highlights, we:

- Trained an MXNet Gluon resnet_v1 model on the CIFAR-10 dataset to learn how to classify images into 10 classes using this script
- Explored a different iteration of the model training experiment where we increased the batch size, and compared our two versions in Comet.ml
- Used Comet.ml to automatically capture our model's results (training accuracy and validation accuracy), training code, and other artifacts

You can see a quick start guide for Comet.ml here and learn more about MXNet here.

Bonus Notes: Try using a different pre-trained model from the MXNet Gluon Model Zoo. More details
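The step learning-rate decay described above (bumps at epochs 80 and 160) can be sketched in plain Python. This is a minimal illustration, not the training script itself: the values `0.1` for `lr_decay`, `"80,160"` for the decay epochs, and the initial learning rate of `0.1` are hypothetical stand-ins for the script's `opt.lr_decay` and `opt.lr_decay_epoch` flags.

```python
# Sketch of a step learning-rate schedule, assuming hypothetical values
# in place of the script's opt.lr_decay / opt.lr_decay_epoch flags.
lr_decay = 0.1  # multiply the learning rate by this factor at each decay epoch
lr_decay_epoch = [int(i) for i in "80,160".split(',')] + [float('inf')]

lr = 0.1                  # initial learning rate (assumed)
lr_decay_count = 0
schedule = []
for epoch in range(200):
    # Step the learning rate down when a decay epoch is reached;
    # the trailing inf sentinel means we never run off the end of the list.
    if epoch == lr_decay_epoch[lr_decay_count]:
        lr *= lr_decay
        lr_decay_count += 1
    schedule.append(lr)
```

The schedule stays flat between decay points and drops by a factor of 10 at epochs 80 and 160, which is exactly where the accuracy curves in the Comet charts show their jumps.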
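To make the confusion-matrix reading above concrete, here is a minimal pure-Python sketch of how such a matrix is built and read. The training script itself uses `sklearn.metrics.confusion_matrix` on CIFAR-10's 10 classes; the 3-class labels below are made up for illustration, and `build_confusion_matrix` is a hypothetical helper, not part of the script.

```python
def build_confusion_matrix(y_true, y_pred, num_classes):
    """Count how often true class t was predicted as class p."""
    m = [[0] * num_classes for _ in range(num_classes)]
    for t, p in zip(y_true, y_pred):
        m[t][p] += 1
    return m

# Made-up labels for a 3-class toy example (CIFAR-10 would use 10 classes)
y_true = [0, 0, 1, 1, 2, 2]
y_pred = [0, 1, 1, 1, 2, 0]
cm = build_confusion_matrix(y_true, y_pred, 3)

# Diagonal cells are correct predictions; large off-diagonal cells flag
# class pairs the model confuses, e.g. cm[0][1] counts class-0 samples
# misread as class 1 — those are the classes worth collecting more data for.
```

Scanning for the largest off-diagonal entries is the "look at the higher values" step: each one names a specific (true, predicted) class pair where the model needs improvement.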
