Google AutoML

Automated Machine Learning (AutoML)

Automated machine learning (AutoML) is the process of automating the steps involved in building a machine learning model. It generally covers the automated path from a raw data set to a deployable machine learning model, including data preprocessing, model building, model training and testing, and model evaluation. This approach greatly shortens the full development life cycle of a machine learning model without requiring much expertise in the field. All you need is an understanding of how to use the services offered by these cloud-based tools and a basic overview of how AI/ML models work.

Machine learning (ML) has achieved considerable successes in recent years and an ever-growing number of disciplines rely on it. However, this success crucially relies on human machine learning experts to perform the following tasks:

  • Preprocess and clean the data.
  • Select and construct appropriate features.
  • Select an appropriate model family.
  • Optimize model hyperparameters.
  • Postprocess machine learning models.
  • Critically analyze the results obtained.

With AutoML techniques, we can save a lot of time on the steps mentioned above.

Some of the popular AutoML tools are: AutoKeras, H2O AutoML, Amazon Lex, AWS SageMaker, Google AutoML, IBM Watson, Microsoft Azure, and many more. We will be covering how to use Google AutoML and its services.

How To Use Google AutoML: We will be using a Jupyter Notebook to work with Google AutoML. Below, we show how you can use a Jupyter Notebook, or plain Python, to call the Google AutoML services.
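
To follow the steps below from a notebook, the client libraries need to be installed and credentials configured first. The snippet below is a minimal setup sketch, not code from the Kaggle notebook; the key path, project ID, region, and bucket name are placeholders to replace with your own values.

```python
# Install the client libraries once in the notebook:
#   !pip install google-cloud-automl google-cloud-storage

import os

# Authenticate with a service-account key that has AutoML and Storage permissions.
os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "/path/to/service-account-key.json"

PROJECT_ID = "my-gcp-project"              # the project created in step 1
REGION = "us-central1"                     # AutoML Natural Language runs in us-central1
BUCKET_NAME = f"{PROJECT_ID}-automl-nlp"   # the bucket created in step 5
```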

  1. Create or select a GCP (Google Cloud Platform) project: create your account on Google Cloud. You will be provided with some free credits; use them for your first AutoML project on Google.
  2. Enable billing: when you are done enabling billing for the project, you will be provided with some free credits.
  3. Enable the APIs for AutoML and Google Cloud Storage: select the project you created in step 1 and enable both APIs for it.
  4. Get the data: we will be using tweet text data from https://www.figure-eight.com/data-for-everyone/ (for example, https://twitter.com/AnyOtherAnnaK/status/629195955506708480) to identify real or fake tweets.
  5. Create a storage bucket in the Storage section of the Google Cloud console; it will be used for storing and retrieving data in GCS (Google Cloud Storage).
  6. Import the required libraries for Google AutoML.
  7. GCS upload/download utility functions: these make uploading and downloading files between the kernel and Google Cloud Storage easier, which is needed for AutoML. We took help from the Google AutoML documentation for this; see the sketch after this list.
  8. Export your data to CSV and upload it to the GCS storage bucket (also covered in the sketch after this list).
  9. Create a client instance for the AutoML Natural Language service (steps 9 to 11 are sketched further below).
  10. Create a data set.
  11. Train a model on the created data set.
  12. Once the model is trained, make predictions with it; a prediction sketch appears at the end of this post.
  13. You can refer to my Kaggle Notebook for a practical demo: https://www.kaggle.com/kamalnaithani/automl-tweets-identification
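
For steps 6 to 8, the utility functions below are a sketch modeled on the Google Cloud Storage client library samples rather than code copied from the Kaggle notebook; the bucket name, local file name, and the text/target column names are assumptions for the tweets data.

```python
from google.cloud import storage
import pandas as pd

PROJECT_ID = "my-gcp-project"              # placeholder, as in the setup sketch above
BUCKET_NAME = f"{PROJECT_ID}-automl-nlp"   # the bucket created in step 5


def upload_blob(bucket_name, source_file_name, destination_blob_name):
    """Upload a local file from the kernel to the GCS bucket."""
    client = storage.Client(project=PROJECT_ID)
    bucket = client.bucket(bucket_name)
    blob = bucket.blob(destination_blob_name)
    blob.upload_from_filename(source_file_name)
    print(f"Uploaded {source_file_name} to gs://{bucket_name}/{destination_blob_name}")


def download_blob(bucket_name, source_blob_name, destination_file_name):
    """Download a file from the GCS bucket back to the kernel."""
    client = storage.Client(project=PROJECT_ID)
    bucket = client.bucket(bucket_name)
    blob = bucket.blob(source_blob_name)
    blob.download_to_filename(destination_file_name)
    print(f"Downloaded gs://{bucket_name}/{source_blob_name} to {destination_file_name}")


# Step 8: export the tweets to a two-column CSV (text,label) and push it to the bucket.
tweets = pd.read_csv("train.csv")  # placeholder local file with 'text' and 'target' columns
tweets[["text", "target"]].to_csv("tweets_automl.csv", index=False, header=False)
upload_blob(BUCKET_NAME, "tweets_automl.csv", "data/tweets_automl.csv")
```

AutoML Natural Language imports the CSV as rows of text plus label (optionally preceded by a TRAIN/VALIDATION/TEST split column), so no header row is written here.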
[Figure: storage bucket]
[Figure: model evaluation]
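
For steps 9 to 11, a sketch with the current v1 google-cloud-automl client looks roughly like the following; the display names and the gs:// path are placeholders, and the Kaggle notebook may use an older client version with slightly different calls.

```python
from google.cloud import automl

PROJECT_ID = "my-gcp-project"  # placeholder
project_location = f"projects/{PROJECT_ID}/locations/us-central1"

# Step 9: client instance for the AutoML Natural Language service.
client = automl.AutoMlClient()

# Step 10: create a text-classification data set and import the CSV from GCS.
dataset = automl.Dataset(
    display_name="tweets_real_or_fake",
    text_classification_dataset_metadata=automl.TextClassificationDatasetMetadata(
        classification_type=automl.ClassificationType.MULTICLASS
    ),
)
created_dataset = client.create_dataset(parent=project_location, dataset=dataset).result()
dataset_id = created_dataset.name.split("/")[-1]

input_config = automl.InputConfig(
    gcs_source=automl.GcsSource(
        input_uris=[f"gs://{PROJECT_ID}-automl-nlp/data/tweets_automl.csv"]
    )
)
client.import_data(
    name=client.dataset_path(PROJECT_ID, "us-central1", dataset_id),
    input_config=input_config,
).result()  # blocks until the import finishes

# Step 11: start training a model on the imported data set (this can take hours).
model = automl.Model(
    display_name="tweets_classifier",
    dataset_id=dataset_id,
    text_classification_model_metadata=automl.TextClassificationModelMetadata(),
)
operation = client.create_model(parent=project_location, model=model)
print("Training operation:", operation.operation.name)
```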

For the full code and details of how to use Python with the Google AutoML services, follow my Notebook: https://www.kaggle.com/kamalnaithani/automl-tweets-identification.
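
For step 12, once training has finished and the model has been deployed (from the AutoML console or with AutoMlClient.deploy_model), an online prediction can be sketched as below; the model ID and the example tweet are placeholders.

```python
from google.cloud import automl

PROJECT_ID = "my-gcp-project"          # placeholder
MODEL_ID = "TCN1234567890123456789"    # placeholder; shown in the AutoML console after training

prediction_client = automl.PredictionServiceClient()
model_full_id = automl.AutoMlClient.model_path(PROJECT_ID, "us-central1", MODEL_ID)

text_snippet = automl.TextSnippet(
    content="Forest fire near La Ronge Sask. Canada",  # example tweet text
    mime_type="text/plain",
)
payload = automl.ExamplePayload(text_snippet=text_snippet)

response = prediction_client.predict(name=model_full_id, payload=payload)
for result in response.payload:
    print("Predicted label:", result.display_name)
    print("Confidence score:", result.classification.score)
```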

Please provide your feedback

Regards

Team Kite4sky
