Built-in AI chips for use in mobile and smart home devices

In today’s era, roughly 90% of artificial intelligence runs on smartphones; we could even say the smartphone is the world’s most popular AI device. Yet smartphones that use machine learning today depend on cloud servers, which limits how and where information is processed, and machine learning on smartphones is now shifting gears from cloud-based to on-device. The reliance on cloud servers means machine learning for smartphones requires massive amounts of storage and computing power, and the deep learning models that live in the cloud, such as CNNs and RNNs, have to be served through dedicated software toolkits. More importantly, a cloud-server-based AI approach will not work for mobile applications that require low latency. To perform sophisticated tasks, one approach companies are turning to is distributed machine learning, which spreads the work across many machines instead of a single server.
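As a rough illustration of what “on-device” means in practice, here is a minimal sketch of local inference using TensorFlow Lite, one common runtime for this kind of workload. The model file name and the randomly generated input frame are hypothetical placeholders, not details from any specific product.

```python
# Minimal sketch of on-device inference with TensorFlow Lite.
# "mobilenet_v2.tflite" is a placeholder for any model already
# converted to the .tflite format.
import numpy as np
import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path="mobilenet_v2.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Fake input standing in for a preprocessed camera frame.
frame = np.random.rand(*input_details[0]["shape"]).astype(np.float32)

# Inference runs entirely on the device: no request leaves the phone.
interpreter.set_tensor(input_details[0]["index"], frame)
interpreter.invoke()
prediction = interpreter.get_tensor(output_details[0]["index"])
print(prediction.shape)
```

With dedicated ML hardware of the kind described below, a runtime like this can offload the heavy tensor operations to the accelerator instead of the CPU.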

ARM has designed its first dedicated ML chip, meant for use in mobile and smart home devices. Into this chip ARM has built hardware specifically for machine learning, which helps the device run AI algorithms on its own. ARM has also joined hands with Qualcomm to develop these chips by early 2019.

The most important advantage of an on-device ML chip is that 24x7 connectivity is no longer required: algorithms can run locally on the device even when it is not connected to the internet. The other advantages are 1) increased processing speed, 2) lower power consumption, and 3) greater efficiency.
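To put a number on the speed point, a sketch like the one below (continuing from the interpreter and input frame set up in the earlier sketch) times a purely local forward pass. The figures it prints depend entirely on the device and model; the point is simply that no network round trip is involved.

```python
import time

def time_local_inference(interpreter, frame, runs=50):
    """Average wall-clock seconds for one on-device forward pass."""
    input_index = interpreter.get_input_details()[0]["index"]
    start = time.perf_counter()
    for _ in range(runs):
        interpreter.set_tensor(input_index, frame)
        interpreter.invoke()  # runs entirely on the local hardware
    return (time.perf_counter() - start) / runs

# 'interpreter' and 'frame' come from the earlier sketch; both are placeholders.
print(f"avg on-device latency: {time_local_inference(interpreter, frame):.4f} s")
```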

The challenges we may face while developing this chip are 1) limited access to training data and 2) network congestion; both must be solved to bring the capabilities of on-device machine learning to consumers.

Thus, the on-device machine learning chips we should see by 2019 offer enormous new capabilities that are going to revolutionize the consumer market, particularly for smartphones and smart home devices.

 
