How is TinyML Used for Embedding Smaller Systems?

Rania | Last Updated: 01 Dec, 2023
7 min read

Introduction

There are many emerging trends in the tech world, and Machine Learning is one of them. Machine Learning is a subset of Artificial Intelligence in which a computer learns from data and analyzes its patterns to predict an outcome. Machine Learning models are usually trained on large volumes of data, and these complex models can take hours or even days to process in cloud data centers. The resulting model files are often large as well. Since data keeps flowing in, most companies want to build ever-larger machine learning models, which in turn means even more data. This is where the problem arises: how do we deploy these machine learning models on smaller embedded systems such as microcontrollers, mobile phones, smartwatches, and so on?


The answer lies in an emerging trend that is barely mentioned anywhere but has the power to change the world. It is one of the fastest-growing machine learning technologies: TinyML. TinyML means machine learning done on tiny computers. The whole point of TinyML is to let small devices benefit from machine learning.

Let's take an example to understand this better. GPT-3 has 175 billion parameters and was trained on 570 GB of text. On the other hand, Google Assistant can detect speech with a model that is only 14 KB. A model that small can fit on something as small as a microcontroller; GPT-3 cannot.

Learning Objectives

  1. Identify the main objective and benefits of TinyML.
  2. Apply the concept of embedded machine learning in different fields.
  3. Identify the basic tools required to do your TinyML project.

This article was published as a part of the Data Science Blogathon.

What is TinyML?

TinyML, or Tiny Machine Learning, is a game-changer in technology, introducing the power of machine learning to compact devices. Think of it as giving a brain to everyday objects, enabling them to make intelligent decisions locally. This innovation is particularly impactful for devices like sensors and wearables, allowing them to function efficiently without constant reliance on the internet. With TinyML, your devices become smarter, analyzing data on the spot, all while prioritizing privacy. It’s a shift towards a future where our gadgets seamlessly integrate intelligence, making our lives more convenient and personalized.

What is the Need for TinyML?

The main objective of TinyML is to bring machine learning to the domain of embedded systems. Microcontrollers play a big part in making this possible. From basic devices like calculators to high-end products like healthcare equipment, they are used everywhere in our day-to-day lives.


Microcontrollers are tiny, low-cost integrated circuits designed to perform specific tasks in embedded systems. The benefits these chips offer are:

  • Because microcontrollers are so small, they consume very little power, and their simple, dedicated design makes them quite reliable.
  • Microcontrollers make edge computing possible: the computation is performed directly on the device instead of being sent off to cloud servers (a minimal sketch of on-device inference follows this list).
  • Edge computing enables fast data processing and delivers results in real time.
  • The main advantage of edge computing is low latency. Since results are immediate, this helps in situations where low latency is critical. For example, self-driving cars process data continuously to avoid accidents; here, latency can be the difference between life and death.
  • Running deep neural networks and other machine learning algorithms on microcontrollers is also known as embedded machine learning.
  • Embedded machine learning also preserves privacy: because data is processed on the device and never sent to cloud servers, there is far less risk of exposing the user's information.
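To make the idea of edge computing concrete, here is a minimal Python sketch of on-device inference using the lightweight tflite_runtime interpreter (for example, on a Raspberry Pi class device). The model file name, the read_sensor() helper, and the alert threshold are placeholders for illustration only, not part of any specific product mentioned above.

```python
# A minimal sketch of edge inference: all computation happens on the device,
# and no sensor data ever leaves it. Assumes a pre-converted model file
# "anomaly_detector.tflite" and a hypothetical read_sensor() helper.
import numpy as np
import tflite_runtime.interpreter as tflite  # lightweight runtime for edge devices


def read_sensor():
    # Placeholder for a real sensor read (accelerometer, microphone, etc.).
    return np.random.rand(1, 128).astype(np.float32)


# Load the model once at start-up and allocate its tensors.
interpreter = tflite.Interpreter(model_path="anomaly_detector.tflite")
interpreter.allocate_tensors()
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

for _ in range(10):                            # in practice this would loop forever
    sample = read_sensor()
    interpreter.set_tensor(input_details[0]["index"], sample)
    interpreter.invoke()                       # inference runs locally, in real time
    score = interpreter.get_tensor(output_details[0]["index"])[0][0]
    if score > 0.8:                            # threshold chosen purely for illustration
        print("Possible fault detected; handle it locally")
```

Because the loop never touches the network, results arrive with minimal latency and the raw sensor data stays on the device.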

How is TinyML Embedded into Smaller Systems?

TinyML models are embedded into smaller systems through model deployment. This involves converting the trained machine learning model into a format that the target device's hardware can interpret and execute, as sketched below.
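As a rough illustration of that conversion step, the sketch below turns a trained Keras model into the compact .tflite format with TensorFlow's TFLiteConverter. The tiny two-layer model is only a stand-in for whatever model you have actually trained.

```python
# Minimal sketch: convert a trained Keras model into the TensorFlow Lite
# flatbuffer format that embedded runtimes can load and execute.
import tensorflow as tf

# Stand-in for a real trained model.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(10,)),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])

# Convert the in-memory model to a compact .tflite flatbuffer.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]  # default size/latency optimizations
tflite_model = converter.convert()

with open("model.tflite", "wb") as f:
    f.write(tflite_model)

print(f"Converted model size: {len(tflite_model) / 1024:.1f} KB")
```

The resulting flatbuffer is what actually ships to the device, whether that device is a phone, a Raspberry Pi, or a microcontroller.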

Benefits of Embedding TinyML

  • Reduced Latency: Local processing enables real-time decision-making, eliminating the round trip to the cloud.
  • Improved Privacy: Data remains on the device, reducing the risk of data breaches and enhancing privacy.
  • Enhanced Efficiency: TinyML models are optimized for the device’s hardware, leading to efficient resource utilization and extended battery life.

Industrial Applications and Solutions of TinyML

TinyML brings together embedded systems and machine learning algorithms. This fusion has made many of our devices and appliances a lot smarter: small devices can now do tasks that were previously only possible on our computers. The most common examples of TinyML are virtual assistants such as Siri, Google Assistant, and Alexa, which execute many of the instructions given to them locally on the device with the help of a machine learning model. Some other uses are as follows:

Industrial Applications

TinyML has a wide range of industrial applications because embedded systems can detect faults in machines in real time. This is useful for predicting when maintenance is required, and detecting defects in advance saves millions of dollars in maintenance costs.

Ping, an Australian startup, has introduced embedded machine learning to monitor wind turbines and alert authorities in the event of a malfunction or danger.

Agricultural Applications

The best-known tool for embedded machine learning in agriculture is TensorFlow Lite (we will talk more about this tool in a later section). With its help, farmers can photograph their sick plants and detect the diseases affecting them. This works even without an internet connection, which benefits farmers in remote areas and safeguards their agricultural interests. There is also an app called Nuru by PlantVillage, an open-source project run by Penn State University, which provides similar services.

Healthcare Applications

The use of TinyML in the healthcare sector could lead to great inventions that help in the early detection of diseases requiring immediate attention. One of them is pneumonia. According to global data, 2.5 million people died from pneumonia in 2019, and almost a third of the victims were children under five. Detecting the disease requires a chest X-ray, blood oxygen levels, and a Complete Blood Count (CBC), which is time-consuming and sometimes inaccurate.

Arijit Das, a 15-year-old TinyML enthusiast, created a TinyML model using the Edge Impulse platform that detects pneumonia from chest X-rays in under a minute. This is remarkable, given that manual inspection can take anywhere from one to four days.

A doctor in the Middle East uses embedded systems to detect benign and premalignant oral tongue lesions by automating the screening process, making treatment faster and accessible to more patients.


There are numerous other inventions, such as MaRTiny, a solution for sensing extreme heat caused by global warming. The technology also has applications in retail, traffic sensing, factories, wildlife conservation, and more. There is a lot in TinyML to explore and learn.

Getting Started with TinyML

After reading about the wonders possible with TinyML, you might be interested in getting started with your project. Here are some resources required to build your ML models on the tiniest microcontrollers.

  1. Let's discuss the hardware requirements first. There are many great starter boards to choose from, such as the Arduino Nano 33 BLE Sense, the SparkFun Edge board, and the Adafruit AI board.
  2. These boards offer benefits like low cost, ease of programming, low power consumption, and so on. You'll also need a micro USB cable to connect the board to your desktop.
  3. The next step is figuring out which software tool is compatible with your hardware. The most commonly recommended framework is TensorFlow Lite for Microcontrollers (a sketch of preparing a model for it follows this list); the wider TensorFlow Lite ecosystem supports several languages, such as Python, Java, and C++.
  4. TensorFlow Lite provides tools for on-device machine learning on embedded systems, with support for multiple platforms and high performance.
  5. You can do object detection, image classification, text classification, and more with it. With TensorFlow Lite, you can also use a Raspberry Pi, which offers some extra capability for your applications beyond plain microcontrollers.
  6. The Arduino IDE is the most common development environment for embedded machine learning. It is compatible with many boards, and TensorFlow Lite Micro is available as an official library in its library manager.
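To give a feel for the microcontroller workflow, here is a small, hedged sketch of one common final step: turning an already converted model.tflite file into a C array that can be compiled straight into an Arduino / TensorFlow Lite Micro sketch (the same result is often produced with the command-line tool `xxd -i`). The file and variable names below are placeholders.

```python
# Sketch: embed a converted .tflite model into firmware by emitting it as a
# C array (microcontrollers have no file system to load model files from).
# "model.tflite" and "model_data.h" are placeholder names.

def tflite_to_c_array(tflite_path: str, header_path: str, var_name: str = "g_model") -> None:
    with open(tflite_path, "rb") as f:
        data = f.read()

    # Format the raw bytes as a C unsigned char array, 12 bytes per line.
    lines = []
    for i in range(0, len(data), 12):
        chunk = ", ".join(f"0x{b:02x}" for b in data[i:i + 12])
        lines.append(f"  {chunk},")

    with open(header_path, "w") as f:
        f.write(f"// Auto-generated from {tflite_path}\n")
        f.write(f"const unsigned char {var_name}[] = {{\n")
        f.write("\n".join(lines))
        f.write("\n};\n")
        f.write(f"const unsigned int {var_name}_len = {len(data)};\n")


if __name__ == "__main__":
    tflite_to_c_array("model.tflite", "model_data.h")
```

On the device side, the generated header is included in the firmware and the TensorFlow Lite Micro interpreter is pointed at the g_model array.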

These are just the basics, and you can learn in-depth about them from different courses available on the internet.

Frequently Asked Questions

Q1. What is TinyML used for?

A. TinyML refers to the field of deploying machine learning models on low-power and resource-constrained devices such as microcontrollers and IoT devices. It enables performing machine learning inference locally on these devices, eliminating the need for constant connectivity to the cloud. TinyML finds applications in areas like edge computing, smart devices, wearable technology, industrial IoT, and embedded systems where real-time, low-latency, and energy-efficient inference is required.

Q2. What is the difference between ML and TinyML?

A. Machine Learning (ML) is a wide-ranging field that involves creating algorithms and models to analyze data and make predictions. It covers a broad range of techniques used in various domains. On the other hand, TinyML is a specific branch of ML that focuses on deploying and running ML models on small devices like microcontrollers and IoT devices. TinyML addresses the unique challenges of performing ML on devices with limited resources and power constraints.

Q3. What language is TinyML?

A. TinyML is not tied to a specific language; it's about using machine learning on small devices. It can be done in C or C++, adapting to the device's needs.

Q4. How big is a TinyML model?

A. TinyML models are really small, usually just a few kilobytes. This makes them work well on devices with limited resources, like IoT gadgets and wearables.

Conclusion

TinyML has the potential to bring about real change in the coming years. The global tech market advisory firm ABI Research projects that the TinyML market will grow from 15.2 million device shipments in 2020 to 2.5 billion in 2030. According to Harvard associate professor Vijay Janapa Reddi, TinyML has the potential to be big, really soon.

Here are some key takeaways from this article:

  • Due to its low cost, low power consumption, and low latency, TinyML can be deployed on many devices.
  • TinyML applications range from industry and agriculture to healthcare and smart factories, and they have a lot of potential in the future.
  • Researchers describe the future of TinyML as bright; due to its wide range of uses, it will also contribute economically.

The media shown in this article is not owned by Analytics Vidhya and is used at the Author’s discretion.

I am Rania, a second-year student enrolled in CSE-AIML. I have a keen interest in Machine Learning and intend to keep upskilling myself in this field. Apart from that, I have always been interested in reading and writing, and researching the topics I am interested in is something I always look forward to.
