AutoML stands for Automated Machine Learning. In 2018, Google launched Cloud AutoML, which quickly gained interest and has become one of the most significant tools in machine learning and artificial intelligence. In this article, you will learn about AutoML, a no-code solution for building machine learning models, with the help of Google Cloud AutoML.
AutoML is part of Vertex AI on the Google Cloud Platform. Vertex AI is Google Cloud's end-to-end solution for building machine learning pipelines in the cloud; we will cover Vertex AI in detail in a future article. AutoML relies mainly on two techniques: transfer learning and neural architecture search. You simply provide the data, and AutoML builds an optimal custom model for your use case.
In this article, we will discuss the benefits, usage, and practical implementation of AutoML with Python code on the Google Cloud Platform.
Building a machine learning model is a time-consuming process that requires considerable expertise: proficiency in a programming language, a good grasp of mathematics and statistics, and an understanding of machine learning algorithms. In the past, only people with these technical skills could work in data science and build models; for non-technical people, building a machine learning model was extremely difficult. Even for technical people, the path did not end with building the model: maintenance, deployment, and autoscaling require additional effort, man-hours, and a slightly different set of skills. To overcome these challenges, global search giant Google developed AutoML, which was made publicly available as Cloud AutoML in 2018.
AutoML supports both unstructured and structured data, categorized into four types: image, tabular, video, and text. With these four data types, you can perform certain activities supported by AutoML.

With an image dataset, you can perform the following tasks in AutoML:
- Image classification (single-label or multi-label)
- Object detection

With a tabular dataset, you can perform the following tasks:
- Classification
- Regression
- Time-series forecasting

With a video dataset, you can perform the following activities:
- Video classification
- Action recognition
- Object tracking

AutoML text data supports the following tasks:
- Text classification (single-label or multi-label)
- Entity extraction
- Sentiment analysis
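Each of these data types maps to a dataset class in the google-cloud-aiplatform Python client. The snippet below is a minimal sketch of those classes; the display names and Cloud Storage paths are placeholders, and we assume the client library is already installed and initialized.
#each AutoML data type has a corresponding dataset class
from google.cloud import aiplatform

#image dataset: the import schema determines the task (here, single-label classification)
image_ds = aiplatform.ImageDataset.create(
    display_name='my-image-dataset',          #placeholder name
    gcs_source='gs://my-bucket/images.csv',   #placeholder path
    import_schema_uri=aiplatform.schema.dataset.ioformat.image.single_label_classification)

#tabular dataset: no import schema needed, the task is chosen at training time
tabular_ds = aiplatform.TabularDataset.create(
    display_name='my-tabular-dataset',
    gcs_source='gs://my-bucket/data.csv')

#video and text datasets follow the same pattern with their own schemas, e.g.
#aiplatform.schema.dataset.ioformat.video.classification and
#aiplatform.schema.dataset.ioformat.text.single_label_classification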
To use AutoML, you need an account on the Google Cloud Platform. Account set-up is a very simple process: go to the
URL https://console.cloud.google.com/, sign in with your Gmail id and password, and an account gets created on GCP. Click on the search bar and search for Vertex AI; on the left side you will see all the components of Vertex AI. Click on Workbench.
Workbench provides a JupyterLab environment where you can create a notebook instance on the cloud backed by a virtual machine. Select the "USER-MANAGED NOTEBOOKS" tab and click on "NEW NOTEBOOK", choose Python 3, and leave the default settings as they are. It will take two to three minutes, and a JupyterLab instance will be created for you. You can also create a TensorFlow or PyTorch instance, with or without a GPU. Click on "OPEN JUPYTERLAB", then click on Python 3 (ipykernel) in the Notebook section. Your Jupyter notebook is ready; you can now write code just as in a local Python Jupyter notebook.
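If you prefer the command line to the console UI, a user-managed notebook instance can also be created with the gcloud CLI. This is a minimal sketch; the instance name, zone, and machine type below are placeholder choices.
#create a user-managed notebook instance from the command line
#(instance name, zone and machine type are placeholders)
! gcloud notebooks instances create automl-demo-notebook \
    --vm-image-project=deeplearning-platform-release \
    --vm-image-family=common-cpu \
    --machine-type=n1-standard-4 \
    --location=us-west1-a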
We will create a tabular classification model for the demo using the AutoML client library in Python.
First, you need to install the following two packages:
!pip install --upgrade google-cloud-aiplatform
!pip install --upgrade google-cloud-storage
Once these two packages are installed successfully, restart the kernel. You can restart the kernel in two ways: from the user interface, by selecting the "Kernel" tab in the top bar and clicking "Restart Kernel", or programmatically, as shown below.
#restart the kernel programmatically
import os

if not os.getenv("IS_TESTING"):
    import IPython

    app = IPython.Application.instance()
    app.kernel.do_shutdown(True)
Set your project id, bucket name, and region. If you don't know your project id, run the code below to look it up with the gcloud command.
#get your google cloud project id using the gcloud command
import os

PROJECT_ID = ''
if not os.getenv("IS_TESTING"):
    proj_output = !gcloud config list --format 'value(core.project)' 2>/dev/null
    PROJECT_ID = proj_output[0]
print("Project ID: ", PROJECT_ID)
#set project id, bucket name and region
PROJECT_ID = '@YOUR PROJECT ID'      #use the project id printed by the code above
BUCKET_NAME = 'gs://' + PROJECT_ID   #bucket names must be globally unique; you can set your own
REGION = 'us-west1'                  #change the region if yours is different
Why do we need a bucket name? In AutoML, you can provide the data in three ways:
- Upload a CSV file from your local computer
- Select a CSV file from Cloud Storage
- Select a table or view from BigQuery
In this example, we are loading the dataset from Cloud Storage; for that, we need to create a bucket to which we will upload our CSV file.
Create a bucket in Cloud Storage and set the dataset path in Google Cloud Storage.
#using the gsutil command we can create a bucket in cloud storage
! gsutil mb -l $REGION $BUCKET_NAME
#check that the bucket was created
! gsutil ls -al $BUCKET_NAME
#dataset path in gcs
IMPORT_FILE = 'data.csv'
gcs_path = f"{BUCKET_NAME}/{IMPORT_FILE}"
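Note that the CSV file must actually exist at this path before we can use it. If data.csv sits on the notebook's local disk, copy it into the bucket first (assuming a local file named data.csv):
#copy the local CSV file into the new bucket
! gsutil cp data.csv $BUCKET_NAME/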
Now we need to create a dataset in AutoML, after which we will train the model on it.
#import necessary libraries
import os
from google.cloud import aiplatform

#initialize the AI platform with our project and region
aiplatform.init(project=PROJECT_ID, location=REGION)

#create a tabular dataset in AutoML from the CSV file in Cloud Storage
ds = aiplatform.TabularDataset.create(
    display_name='data_tabular',   #set your own name
    gcs_source=gcs_path)
#create a training job in AutoML to run the model
job = aiplatform.AutoMLTabularTrainingJob(
    display_name='data_tabular_job',   #set your own name
    optimization_prediction_type='classification',
    column_transformations=[
        {'categorical': {'column_name': 'City'}},   #example columns from the demo dataset
        {'numeric': {'column_name': 'Age'}},
        {'numeric': {'column_name': 'Salary'}}])
#run the training job
#this will take time, depending on your dataset
model = job.run(
    dataset=ds,
    target_column='Adopted',   #the column the model learns to predict
    training_fraction_split=0.8,
    validation_fraction_split=0.1,   #the three fractions must sum to 1.0
    test_fraction_split=0.1,
    model_display_name='data_tabular_model',   #give your own name
    disable_early_stopping=False)
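Before deploying, it is worth inspecting how the trained model performed on the test split. A minimal sketch, assuming the training job above completed successfully:
#list the evaluations AutoML computed for the model
model_evaluations = model.list_model_evaluations()
for evaluation in model_evaluations:
    print(evaluation)   #for classification, look for metrics such as auRoc and logLoss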
Once training is done and you are satisfied with the metrics, we deploy the model using an endpoint. An endpoint is one of the components of Vertex AI where you deploy your model to make online predictions.
#deploying the model
endpoint = model.deploy(machine_type = 'n1-standard-4')
This will take a few minutes. While creating an endpoint instance, choose your machine type wisely, as it determines the cost: a smaller machine type costs less, while a larger one costs more. For more clarity on pricing, please check out the link below.
https://cloud.google.com/products/calculator#id=9c1e6e38-ba1e-4b40-b1e4-52c86bb9ab29
#making an online prediction with the deployed model
#AutoML tabular endpoints expect feature values passed as strings
pred = endpoint.predict(instances=[
    {'City': 'Madrid',
     'Age': '52',
     'Salary': '70000'}])
print(pred)
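Finally, when you are done experimenting, clean up the resources so they do not keep incurring charges. A minimal clean-up sketch (note that the delete calls are irreversible):
#undeploy the model and delete the endpoint, model, dataset and bucket
endpoint.undeploy_all()
endpoint.delete()
model.delete()
ds.delete()
! gsutil -m rm -r $BUCKET_NAME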
Google Cloud AutoML is a powerful tool that anyone can use to build machine learning models without writing code. AutoML has a very interactive user interface from which you can build and deploy models without extensive knowledge of algorithms or coding. The key takeaways from this article are:
- AutoML is a component of Vertex AI, Google Cloud's end-to-end machine learning platform.
- AutoML relies on transfer learning and neural architecture search: you provide the data, and it builds an optimal custom model for your use case.
- AutoML supports four data types: image, tabular, video, and text, each with its own set of supported tasks.
- With the google-cloud-aiplatform client library, you can create a dataset, train a model, deploy it to an endpoint, and make online predictions entirely from a notebook.
Q1. Will AutoML take the job of a Data Scientist?
A. No, AutoML will not take the job of a Data Scientist. AutoML has a lot of potential and automates machine learning, but if we want to build a custom model with total control over the code, we still need a Data Scientist's expertise.

Q2. What is the difference between pre-built APIs and AutoML?
A. Pre-built APIs use a pre-built ML model, while AutoML trains a custom ML model on your own data.

Q3. Can anyone use AutoML?
A. Yes, anyone can use AutoML and build a machine learning model on Google Cloud.

Q4. How much does it cost to use AutoML?
A. It depends on the use case and the cloud services you are going to use.

Q5. What is Vertex AI, and how is it related to AutoML?
A. Vertex AI is the ML suite of Google Cloud, which provides an end-to-end solution for building, deploying, and creating machine learning and artificial intelligence pipelines on the cloud. AutoML is one of the components of Vertex AI.