
How to Keep Smartphones Cool When They’re Running Machine Learning Models

Researchers from the University of Austin and Carnegie Mellon have proposed a new way to run computationally expensive machine learning models on mobile devices such as smartphones, and on lower-powered edge devices, without triggering thermal throttling – a common protective mechanism in professional and consumer devices that lowers the temperature of the host device by slowing down its performance until acceptable operating temperatures are restored.

The new approach could help more complex ML models to run inference and various other types of tasks without threatening the stability of, for instance, the host smartphone.

The central idea is to use dynamic networks, in which the weights of a model are shared between a ‘low pressure’ and a ‘full intensity’ version of the locally installed machine learning model.
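To illustrate the weight-sharing idea, here is a minimal sketch (not the authors' code) of a dynamic network in the style of a slimmable network, where the ‘low pressure’ forward pass reuses slices of the same weight tensors as the ‘full intensity’ pass, so no second set of parameters is stored on the device. The layer sizes, class name and width multipliers are illustrative assumptions.

```python
# Sketch only: weight sharing between a full and a reduced-width forward pass.
import torch
import torch.nn as nn
import torch.nn.functional as F

class DynamicLinearNet(nn.Module):
    def __init__(self, in_dim=128, hidden=256, out_dim=10):
        super().__init__()
        self.fc1 = nn.Linear(in_dim, hidden)
        self.fc2 = nn.Linear(hidden, out_dim)

    def forward(self, x, width_mult=1.0):
        # Decide how many hidden units to use for this pass.
        h = int(self.fc1.out_features * width_mult)
        # The light path slices the *same* weight/bias tensors the full
        # path uses, rather than loading a separate, smaller model.
        w1, b1 = self.fc1.weight[:h], self.fc1.bias[:h]
        w2 = self.fc2.weight[:, :h]
        x = F.relu(F.linear(x, w1, b1))
        return F.linear(x, w2, self.fc2.bias)

net = DynamicLinearNet()
x = torch.randn(1, 128)
full_out = net(x, width_mult=1.0)   # 'full intensity' pass
lite_out = net(x, width_mult=0.5)   # 'low pressure' pass with half the hidden units
```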

In cases where running the locally installed machine learning model would cause the device’s temperature to rise critically, the system would dynamically switch to the less demanding variant until the temperature stabilizes, and then switch back to the full-fledged version.
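A switching policy of this kind could look roughly like the sketch below, which reuses the DynamicLinearNet sketch above. The temperature thresholds, function names and the read_temperature() placeholder are assumptions for illustration; on a real device the reading would come from the platform’s thermal facilities and the thresholds would be tuned per device.

```python
# Sketch only: temperature-driven switching between model variants.

HIGH_TEMP_C = 42.0   # switch down above this (illustrative threshold)
SAFE_TEMP_C = 38.0   # switch back up below this (illustrative threshold)

def read_temperature() -> float:
    # Placeholder: on a real device this would query a platform-specific
    # thermal sensor; here we return a fixed value so the sketch runs.
    return 36.0

def serve(net, batches):
    width = 1.0  # start with the 'full intensity' model
    for batch in batches:
        temp = read_temperature()
        if width == 1.0 and temp >= HIGH_TEMP_C:
            width = 0.5   # drop to the 'low pressure' variant
        elif width < 1.0 and temp <= SAFE_TEMP_C:
            width = 1.0   # temperature has stabilized; restore the full model
        yield net(batch, width_mult=width)
```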

The complete article can be read here.
