Technology

Deep Neural Network to run on Smartphones Soon

Zapp2Photo / Shutterstock.com
Researchers reduced the size of a deep neural network, allowing it to be installed on smartphones and other smaller gadgets.

Although a deep neural network may sound like a vague “techy” thing, we already use it in our daily lives.

It’s why Gmail’s spam filter is so effective. Spotify uses the same kind of artificial intelligence to pick the songs in your Discover Weekly playlist.

A deep neural network finds the mathematical transformation, linear or non-linear, that turns an input into the desired output. In other words, this kind of AI is good at recognizing and classifying data.
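To make that concrete, here is a minimal sketch of what a deep neural network computes: each layer applies a linear transform followed by a non-linear activation, and the final scores decide the classification. The tiny hand-picked weights below are purely illustrative, not a trained model.

```python
# Minimal sketch of a deep neural network's forward pass (illustrative only).

def relu(vec):
    # non-linear activation: keep positives, zero out negatives
    return [max(0.0, x) for x in vec]

def dense(x, weights, bias):
    # linear transform: matrix-vector product plus a bias term
    return [sum(w * xi for w, xi in zip(row, x)) + b
            for row, b in zip(weights, bias)]

def forward(x):
    # hidden layer (2 inputs -> 3 units), toy weights
    h = relu(dense(x, [[1.0, -1.0], [0.5, 0.5], [-1.0, 1.0]],
                   [0.0, 0.1, 0.0]))
    # output layer (3 units -> 2 class scores)
    return dense(h, [[1.0, 0.0, -1.0], [-1.0, 0.0, 1.0]], [0.0, 0.0])

scores = forward([2.0, 1.0])
label = scores.index(max(scores))  # "classification": pick the top score
```

Real networks stack many such layers with millions of learned weights, which is exactly where the memory and compute cost comes from.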

There’s just one thing.

A DNN usually requires a lot of computing power and memory to run. While that may not be a problem for high-end laptops, an average smartphone could never handle the task.

But that’s about to change. A team of researchers at Northeastern University has demonstrated a way to run deep neural networks on a smartphone or a similar device.

In a statement about the project, Yanzhi Wang, an assistant professor of electrical and computer engineering at Northeastern, said:

“It is difficult for people to achieve the real-time execution of neural networks on a smartphone or these kinds of mobile devices. But we can make the most deep learning applications work in real-time.”

Wang and his colleagues have described how they accomplished the feat.

Getting Deep Neural Network to Run on Mobile Devices

Currently, mobile devices need to be connected to the internet to access a deep neural network. The phone collects data and sends it to a remote server for processing.

That explains why Siri only responds when you’re connected to the internet. Wang and his colleagues have a solution.

In addition to generating code to run the neural network model efficiently, the Northeastern team has devised a way to reduce the model’s size.

That way, the network can execute tasks 56 times faster than past demonstrations while maintaining optimum accuracy. This could result in the implementation of DNN in off-the-shelf devices that may not be connected to the internet.
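The article does not detail the team’s compression pipeline, but two common ideas behind shrinking a network for mobile hardware are pruning (dropping near-zero weights) and quantization (storing each weight as a small integer instead of a float). The sketch below illustrates those generic techniques only; it is not the Northeastern team’s actual method.

```python
# Generic model-compression sketch: pruning + 8-bit quantization.

def prune(weights, threshold=0.05):
    # zero out weights too small to matter; sparse storage can then skip them
    return [0.0 if abs(w) < threshold else w for w in weights]

def quantize(weights, bits=8):
    # map floats onto integers in [-127, 127]; store one scale per layer
    scale = max(abs(w) for w in weights) / (2 ** (bits - 1) - 1)
    return [round(w / scale) for w in weights], scale

def dequantize(q, scale):
    # recover approximate float weights at inference time
    return [qi * scale for qi in q]

layer = [0.81, -0.02, 0.33, 0.001, -0.64]
pruned = prune(layer)          # the two tiny weights become 0.0
q, scale = quantize(pruned)    # each weight now fits in one byte
approx = dequantize(q, scale)  # close to the original values
```

An 8-bit weight takes a quarter of the space of a 32-bit float, and integer arithmetic is cheaper on phone processors, which is why these ideas matter for on-device inference.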

Expectedly, the technology’s potential application extends beyond having an offline Siri.

Wang noted:

“There are so many things that need intelligence. Medical devices, wearable devices, sensors, smart cameras. All of these, they need something enhancing recognition, segmentation, tracking, surveillance, and so many things, but currently, they’re limited.”

The researchers also point out that deep neural networks raise privacy concerns.

Currently, devices collect personal information and send it to the cloud for processing. The Northeastern team’s method, by contrast, would enable users to process their data locally.

“Previously, people believed that deep learning needed dedicated chips, or could only be run on servers over the cloud,” Wang says. “This kind of assumption of knowledge limits the application of deep learning. We cannot always rely on the cloud. We need to make local, smart decisions.”

Sumbo Bello

Sumbo Bello is a creative writer who enjoys creating data-driven content for news sites. In his spare time, he plays basketball and listens to Coldplay.
