Neural Networks on All Battery-Powered Devices Could Become a Reality

Pranav Dar | Last Updated: 16 Feb, 2018
2 min read

Most neural network models that we know of are huge and computationally heavy, which means they consume a lot of energy and are impractical for handheld devices. Almost all smartphone applications that rely on neural networks (such as speech and facial recognition) simply upload their data to cloud servers; the data is processed there and the result is sent back to the app.

(Image source: Medium)

MIT researchers have come up with a special chip that increases the speed of neural network computations by three to seven times over its predecessors. More impressively, the researchers claim it reduces power consumption by 94-95 percent. This development would make it far easier and more practical to run neural networks directly in smartphone apps. The chip could even be embedded in household appliances such as refrigerators and blenders.

The development of this chip was led by Avishek Biswas, an MIT graduate student in electrical engineering and computer science. In an interview with MIT, Mr. Biswas said:

“Since these machine-learning algorithms need so many computations, this transferring back and forth of data is the dominant portion of the energy consumption. But the computation these algorithms do can be simplified to one specific operation, called the dot product. Our approach was, can we implement this dot-product functionality inside the memory so that you don’t need to transfer this data back and forth?”
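To see why the dot product dominates, consider a single fully connected layer: each output neuron is simply the dot product of the input activations with that neuron's weight vector. The sketch below is purely illustrative; the layer sizes and variable names are our own and have nothing to do with the MIT chip's actual configuration.

```python
import numpy as np

# Toy fully connected layer: 256 inputs -> 128 outputs (sizes are arbitrary).
inputs = np.random.rand(256)          # activations arriving at the layer
weights = np.random.rand(128, 256)    # one 256-element weight vector per output neuron

# Each output neuron is the dot product of the input with its weight vector.
outputs = np.array([np.dot(w, inputs) for w in weights])

# The same computation expressed as a single matrix-vector product; it is this
# repeated dot-product workload that in-memory computation aims to avoid
# shuttling between memory and processor.
outputs_fast = weights @ inputs
assert np.allclose(outputs, outputs_fast)
```

Performing these dot products where the weights are stored, rather than moving the data back and forth, is the idea Mr. Biswas describes above.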

Mr. Biswas unveiled the chip this past week at the International Solid-State Circuits Conference. You can read about the technology behind the neural networks on MIT's site here.

 

Our take on this

This is a very important breakthrough because it means that smartphones and other portable gadgets could, in the future, perform deep learning tasks (like advanced speech and facial recognition) directly on the device, instead of relying on rudimentary algorithms or sending data to the cloud and waiting for the results. We would no longer have to worry about our data being sent to third-party servers or about the bandwidth that traffic consumes. Once this technology goes commercial, expect a lot of companies to leverage it across their electronic devices.

 
