How to create your own Deep Learning Project in Azure

Using Azure Storage, Databricks with Keras and Azure ML Service

René Bremer, Apr 15

1. Introduction

Deep learning has a lot of practical applications for companies, such as image recognition, video indexing and speech-to-text transcription.

However, it can be daunting for companies to start with deep learning projects.

Common issues are the sensitivity of the data used and the complexity of deep learning, which can be seen as the superlative of machine learning.

2. Objective

In this tutorial, a sample deep learning project is created that is able to recognize classes of pictures using the CIFAR-10 dataset (plane, frog, ship).

In this project, the following steps are executed:

- Azure Storage is used to securely store the pictures
- Azure Databricks is used to train the model using Keras and TensorFlow
- Azure ML Service is used to version the model and deploy it as an HTTP endpoint

This can be depicted in the following architecture overview:

2. High level overview

In the remainder of this blog, the following steps will be executed:

3. Prerequisites
4. Create Deep Learning project
5. Deploy Deep Learning model
6. Conclusion

It is a standalone tutorial in which the focus is to set up your own deep learning project in Azure, to “get your hands dirty” and so to get more familiar with the subject.

The focus is less on the internal workings of deep learning, the latest algorithms or Computer Vision APIs.

In case you are more interested in devops for AI, refer to my previous blogs, here, and with a focus on security, here.

3. Prerequisites

The following resources need to be created in the same resource group and in the same location:

- Azure Storage Account
- Azure Databricks Workspace
- Azure Machine Learning Service

4. Create Deep Learning project

The following steps will be executed in this part:

4a. Add pictures to storage account
4b. Create deep learning cluster
4c. Mount storage account
4d. Train deep learning model on single node
4e. Terminate deep learning cluster

4a. Add pictures to storage account

In this tutorial, the CIFAR-10 dataset is used to train the deep learning model.

In the first model, a subset of 2000 pictures will be used.

Subsequently, a second model will be created that uses the full CIFAR-10 dataset of 60000 pictures.

To illustrate how to use Azure Storage in combination with Azure Databricks, the subset of 2000 pictures will be stored in the storage account.

Therefore, go to the following URL to download a subset of 2000 pictures as a zip file:

https://github.com/rebremer/devopsai_databricks/raw/master/sampledata/2000pics_cifar10.zip

Go to the Azure portal and select your storage account. Then select Blobs and create a new container named “2000picscifar10”. Subsequently, upload the zip file you downloaded earlier into your container. If you prefer to script this step instead of clicking through the portal, see the sketch below.
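As a reference, a minimal sketch of this step with the azure-storage-blob Python package could look as follows. The package, the connection string placeholder and the file path are assumptions and not part of the tutorial itself.

# Minimal sketch of creating the container and uploading the zip file
# with the azure-storage-blob package (pip install azure-storage-blob).
# The connection string can be copied from the storage account in the portal.
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string("<<your_connection_string>>")
container = service.create_container("2000picscifar10")  # same name as in the portal step
with open("2000pics_cifar10.zip", "rb") as data:
    container.upload_blob(name="2000pics_cifar10.zip", data=data)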

4a1. Add zip file to new container

Subsequently, go to Access Keys and copy the key of your storage account.

4a3. Copy access keys

4b. Create deep learning cluster

Go to your Azure Databricks workspace and go to Clusters.

Since the model will be trained on the driver node without using Spark jobs, there is no need to create (and pay for) worker nodes.

Therefore, create a new GPU cluster with the following settings:

4b1. Create GPU cluster without worker nodes
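For reference, the settings in the screenshot roughly correspond to the following Databricks clusters API payload. This is a sketch only; the runtime version and VM size are assumptions that depend on your region and quota.

# Sketch of a driver-only GPU cluster definition for the Databricks clusters API.
cluster_spec = {
    "cluster_name": "dl-gpu-cluster",
    "spark_version": "5.3.x-gpu-ml-scala2.11",   # a GPU ML runtime (assumption)
    "node_type_id": "Standard_NC6",              # GPU VM size (assumption)
    "num_workers": 0,                            # no worker nodes, train on the driver
    "spark_conf": {"spark.master": "local[*]"},  # needed for a cluster without workers
}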

4c. Mount storage account

Go to your Azure Databricks workspace, right-click and then select Import.

In the radio button, select to import the following notebook using a URL:

https://raw.githubusercontent.com/rebremer/devopsai_databricks/master/project/modelling/0_mountStorage.py

See also the picture below:

4c1. Import notebook to mount storage

Open your notebook and change the following settings:

storageaccount="<<your_storage_account>>"
account_key="<<your_key_in_step4a>>"
containername="2000picscifar10" # change if you used another name

Notice that keys shall never be stored in a notebook in a production situation. Instead, a secret scope shall be used; see also my blog on how to embed security in data science projects. Then attach the notebook to the cluster you created and press SHIFT+ENTER to run it cell by cell.

4c2. Attach notebook to cluster

In this notebook the following steps will be executed:

- Mount the storage account to the Azure Databricks workspace
- Unzip the pictures in the storage account
- List and show the pictures

As a reference, a minimal sketch of the mount step is shown below.
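The sketch assumes the notebook mounts the container with dbutils.fs.mount, using the settings from above; the mount point is an illustrative choice, not necessarily the one used in 0_mountStorage.py.

# Minimal sketch of mounting the blob container in Databricks.
dbutils.fs.mount(
    source="wasbs://" + containername + "@" + storageaccount + ".blob.core.windows.net",
    mount_point="/mnt/" + containername,  # mount point is an assumption
    extra_configs={"fs.azure.account.key." + storageaccount + ".blob.core.windows.net": account_key})
display(dbutils.fs.ls("/mnt/" + containername))  # list the uploaded files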

4d. Train deep learning model on single node

Go to your Azure Databricks workspace again, right-click and then select Import.

In the radio button, select to import the following notebook using a URL:

https://raw.githubusercontent.com/rebremer/devopsai_databricks/master/project/modelling/1_Cifar10KerasNotebookExplore.py

In this notebook the following steps will be executed:

- Import and process data from the storage account
- Build a model on the 2000 pictures in the storage account
- Build a model on the full CIFAR-10 dataset of 60000 pictures
- Save the model to disk (dbfs)

As a reference, a minimal sketch of what the training step looks like is shown below.

When you run the notebook successfully, you can see an overview of the predictions (red are wrong predictions).

4d1. Overview of predictions

4e. Terminate deep learning cluster

Running GPU clusters can be costly.

Since we do not need the GPU cluster for the remainder of this tutorial, we can stop it.

Therefore, go to your cluster and select terminate.

4e1. Terminate cluster

5. Deploy Deep Learning model

The following steps will be executed in this part.

Notice that the Azure portal is used whenever possible.

5a. Create new cluster in Databricks
5b. Add libraries to cluster
5c. Register model and log metrics
5d. Create an HTTP endpoint of model
5e. Test HTTP endpoint of model

5a. Create new cluster in Databricks

In this step, a new cluster is created that will be used to deploy our model.

Since the model is already trained, we don’t need GPUs anymore.

Select a cluster with the following settings:

5a1. Non-GPU cluster

5b. Add libraries to cluster

To deploy our model, we need a couple of libraries. Go to your Shared folder, right-click in the folder and select “Create Library”.

5b1. Add libraries

Subsequently, select PyPI and add the following libraries to the Shared folder:

- azureml-sdk[databricks]
- keras
- tensorflow

5c. Register model and log metrics

In this step, the model and its metrics will be added to your Azure ML service workspace. Import the following notebook in your Azure Databricks workspace:

https://raw.githubusercontent.com/rebremer/devopsai_databricks/master/project/modelling/2a_Cifar10KerasNotebookLogModel.py

Open your notebook and change the following settings of your Azure ML service workspace (the subscription id can be found in the Overview tab of your Azure ML service workspace instance):

workspace="<<your_name_of_azure_ml_service_workspace>>"
resource_grp="<<your_resource_group_amlservice>>"
subscription_id="<<your_subscriptionid_amlservice>>"

Then run the notebook, again using SHIFT+ENTER to go through it cell by cell.

In this notebook the following steps will be executed:

- Log metrics of the model that was trained on 2000 pictures
- Log metrics of the model that was trained on all 60000 pictures
- Register the best model to be deployed in step 5d

To view the metrics, go to the portal, select your Azure Machine Learning workspace and then Experiment, see below.

5c1. Log metrics of model with 2000 pictures and all CIFAR-10 pictures

As a reference, a minimal sketch of metric logging and model registration with the Azure ML SDK is shown below.
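The sketch uses the settings variables from above; the experiment name, metric values and model path are illustrative assumptions, not the exact ones used in the imported notebook.

# Minimal sketch of logging metrics and registering a model with azureml-sdk.
from azureml.core import Experiment, Workspace
from azureml.core.model import Model

ws = Workspace.get(name=workspace,
                   subscription_id=subscription_id,
                   resource_group=resource_grp)

experiment = Experiment(workspace=ws, name="cifar10-experiment")
run = experiment.start_logging()
run.log("accuracy_2000pics", 0.55)   # illustrative values, not real results
run.log("accuracy_60000pics", 0.78)
run.complete()

model = Model.register(workspace=ws,
                       model_path="/dbfs/cifar10_model.h5",  # path from the 4d sketch
                       model_name="cifar10model")            # hypothetical name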

5d. Create an HTTP endpoint of model

Import the following notebook in your workspace:

https://raw.githubusercontent.com/rebremer/devopsai_databricks/master/project/modelling/2b_Cifar10KerasNotebookDeployModel.py

Again, change the parameters in the same way as in step 5c.

In this notebook, an endpoint is created of the model that was trained on all pictures; a minimal sketch of such a deployment is shown below.
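The sketch deploys the registered model to Azure Container Instances with the Azure ML SDK. The entry script score.py and conda file myenv.yml are assumed helper files (the actual notebook ships its own); the service name is also an assumption.

# Minimal sketch of deploying the registered model as an ACI webservice.
from azureml.core import Workspace
from azureml.core.model import InferenceConfig, Model
from azureml.core.webservice import AciWebservice

ws = Workspace.get(name=workspace,
                   subscription_id=subscription_id,
                   resource_group=resource_grp)
model = Model(ws, name="cifar10model")  # model registered in step 5c

# score.py loads the model and handles requests; myenv.yml lists keras/tensorflow.
inference_config = InferenceConfig(entry_script="score.py",
                                   runtime="python",
                                   conda_file="myenv.yml")
aci_config = AciWebservice.deploy_configuration(cpu_cores=1, memory_gb=1)

service = Model.deploy(ws, "cifar10-service", [model], inference_config, aci_config)
service.wait_for_deployment(show_output=True)
print(service.scoring_uri)  # the HTTP endpoint, also visible in the portal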

When you go to the portal and select the Azure Container Instance, you will find the IP address of the endpoint.

5d1. Azure Container Instance with IP address

5e. Test HTTP endpoint of model

To test the endpoint, the following steps will be done:

- Get a random png from the internet in one of the CIFAR-10 categories
- Convert the png to base64 encoding using this website
- Send the base64 payload with a tool like Postman to create predictions

Alternatively, the request can be scripted; see the sketch after this list.
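For reference, the same test can be done from Python instead of Postman. The sketch below assumes a JSON body with a single "data" field holding the base64 string; check the scoring script of the deployed service for the exact payload format it expects.

# Minimal sketch of testing the endpoint from Python.
import base64
import json
import requests

with open("ship.png", "rb") as f:  # hypothetical local test picture
    payload = base64.b64encode(f.read()).decode("utf-8")

response = requests.post(
    "http://<<ip_address_from_step_5d>>/score",  # scoring URI of the ACI endpoint
    data=json.dumps({"data": payload}),
    headers={"Content-Type": "application/json"},
)
print(response.json())  # predicted CIFAR-10 class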

The following picture shows a picture of a ship, converted to base64 and sent with Postman to create a prediction using the endpoint.

5e1. Picture of ship converted to base64 using https://onlinepngtools.com/convert-png-to-base64

5e2. Prediction of ship using HTTP endpoint

6. Conclusion

In this tutorial, a deep learning project was created in which the following services were used:

- Azure Storage account to securely store the pictures
- Azure Databricks with TensorFlow and Keras to build the model
- Azure ML Service to keep track of the model and create an HTTP endpoint

Deep learning has a lot of practical applications for enterprises.

However, it can be daunting for enterprises to start with deep learning projects.

Creating your own sample project and “getting your hands dirty” is a great way to learn and to get more familiar with the subject.

6. Sample project overview

Finally, big thanks to my colleague Ilona Stuhler, who was so kind to provide me with crucial insights into the subject and to donate her project to me.
