Running artificial intelligence applications on local devices comes with several advantages.
For one, you won’t need a network connection to run voice assistant software such as Alexa. Also, manufacturers can create privacy-friendly electronics that store and process data locally.
There’s just one problem.
Current AI applications are not power-efficient enough to process data locally on smart devices. As a result, applications such as speech recognition and gesture recognition rely on a cloud connection to work.
But that could change soon, thanks to a recent study from Centrum Wiskunde & Informatica (CWI).
The researchers have made a mathematical breakthrough that can make AI applications a thousand times more power-efficient. What’s more, they’ve made the underlying mathematical algorithms open source.
The researchers described their work in a yet-to-be-peer-reviewed paper published on arXiv.org.
A Mathematical Breakthrough for Energy-Efficient AI Applications
Spiking neural networks have been around for a while.
However, they can be challenging to handle from a mathematical perspective. This, in turn, makes it difficult to put such a neural network into practice.
That limitation didn’t stop the researchers, though. In their recent mathematical breakthrough, the team developed a learning algorithm for spiking neural networks.
The algorithm offers two significant advantages over current models:
- The neurons in the network communicate less frequently
- Individual neurons execute fewer calculations
Thanks to these factors, the team was able to develop a more energy-efficient AI application.
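To make the sparse-communication idea concrete, here is a minimal, illustrative sketch of a leaky integrate-and-fire neuron, a classic building block of spiking networks. This is not the CWI team’s algorithm; the function name, parameters, and input values below are hypothetical and chosen only to show how a spiking neuron stays silent most of the time and only "communicates" when its membrane potential crosses a threshold.

```python
import numpy as np

def simulate_lif_neuron(input_current, dt=1.0, tau=20.0, v_thresh=1.0, v_reset=0.0):
    """Simulate a single leaky integrate-and-fire neuron (illustrative only).

    The neuron integrates its input over time and emits a spike (1) only
    when the membrane potential crosses the threshold; otherwise it stays
    silent (0). This event-driven behavior is why spiking networks
    communicate far less often than standard artificial neurons, which
    output a value at every step.
    """
    v = v_reset
    spikes = []
    for i_t in input_current:
        # Leaky integration: the potential decays toward rest while
        # accumulating the incoming current.
        v += (-(v - v_reset) / tau) * dt + i_t * dt
        if v >= v_thresh:
            spikes.append(1)   # fire a spike and reset
            v = v_reset
        else:
            spikes.append(0)   # stay silent: no communication this step
    return spikes

# Example: a weak, noisy input produces only occasional spikes, so
# downstream neurons receive (and compute on) far fewer messages.
rng = np.random.default_rng(0)
current = rng.uniform(0.0, 0.15, size=200)
spike_train = simulate_lif_neuron(current)
print(f"{sum(spike_train)} spikes over {len(spike_train)} time steps")
```

In a sketch like this, most time steps produce no output at all, which is the intuition behind both advantages listed above: fewer messages between neurons and fewer calculations per neuron.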
In a statement about the project, principal investigator Sander Bohté explained:
“The combination of these two breakthroughs make AI algorithms a thousand times more energy efficient in comparison with standard neural networks, and a factor hundred more energy efficient than current state-of-the-art neural networks.”
The breakthrough could take AI applications to the next level. For example, it becomes possible to embed more elaborate artificial intelligence in chips, enabling a wider range of applications.
First, though, new types of chips are needed to run spiking neural networks efficiently in the real world. Luckily, various manufacturers are already working on creating such chips.
“All kinds of companies are working hard to make this happen, like our project partner IMEC/Holst Center,” said Bohté.