Science 4 min read

How "Hashing" Cuts Deep Learning Computations, Energy up to 95%

Peggy's Cove in Halifax, Nova Scotia | Site of the KDD 2017 Conference


Rice University researchers applied hashing to artificial intelligence deep learning neural networks and reduced computational workload by 95%.

A new data-indexing technique could bring a breath of fresh air to the field of deep learning neural networks. Here’s how “Hashing” could help lead us to the next big AI breakthrough.

It seems that AI news has been crossing my desk a lot lately, and that’s a pretty good sign for the near future because it means that we may be on the verge of the next great leap in the capability of artificial intelligence software.

Much of the news has been related to data, whether it’s storage or active memory. In the last month alone we’ve seen more information on why data storage is so crucial for AI development, and even how some hardware could help bridge the gap between stored and active memory.

Well, now we have another data-related breakthrough, and while it is literally a rehash of an old idea, it could be a very new solution, one that eliminates more than 95% of the computation in a deep learning neural network.


That’s right, I said 95%. Now that I have your attention, let’s talk about the technique known as hashing, and what it can do for modern AI software.


Hashing is a widely used technique for indexing data. Think of it as a sort of Dewey Decimal system for the library of data that can be found within a computer system, or more specifically, a deep learning neural network.

Reading the whole library and finding the one bit of relevant data? That’s a snap for an AI program, but it requires a lot of computational power. Give it an index, however, and your humble little program will be able to guide itself to the correct bit of data without having to expend the effort in searching through the whole library.
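To make the library analogy concrete, here is a toy sketch (not the researchers' code, and all names in it are illustrative) of why an index beats a scan: the linear search touches every record, while the hash index jumps straight to the right entry.

```python
# Our tiny "library" of records: (title, contents) pairs.
library = [
    ("cats", "feline data"),
    ("dogs", "canine data"),
    ("cars", "vehicle data"),
]

def scan_lookup(key):
    """Linear scan: reads the whole library until it finds a match."""
    for k, v in library:
        if k == key:
            return v
    return None

# Build the hash index once; Python's dict is a hash table under the hood.
index = dict(library)

def indexed_lookup(key):
    """Hash index: one bucket lookup, no scanning."""
    return index.get(key)

print(scan_lookup("cars"))     # 'vehicle data'
print(indexed_lookup("cars"))  # 'vehicle data'
```

Both calls return the same answer; the difference is that the scan's cost grows with the size of the library while the indexed lookup stays effectively constant.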

Researchers Ryan Spring and Anshumali Shrivastava at Rice University have adapted hashing for deep learning neural networks, and the results have been pretty amazing thus far. 95%-reduced-workload amazing, even.

At least, that’s what Spring tells us: “For example, in small-scale tests we found we could reduce computation by as much as 95 percent and still be within 1 percent of the accuracy obtained with standard approaches.”

We’ll know more after the KDD 2017 conference in Halifax, Nova Scotia, this August, but we do have one other little tidbit of information about what the technique can do.

Apparently, the larger the network, the more effective the technique is, pushing the reduction to 98% and beyond!
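For the curious, the Rice approach uses locality-sensitive hashing: an input is hashed into the same bucket as the neurons most likely to fire on it, so only that small subset gets computed. Here is a rough, hypothetical sketch of that idea using signed random projections (a common locality-sensitive hash); the sizes and names are illustrative, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

d, n_neurons, n_bits = 16, 1000, 8
weights = rng.normal(size=(n_neurons, d))  # one weight vector per neuron
planes = rng.normal(size=(n_bits, d))      # random hyperplanes for the hash

def lsh_bucket(v):
    # Sign of the projection onto each hyperplane -> an n_bits-bit bucket id.
    bits = (planes @ v) > 0
    return int("".join("1" if b else "0" for b in bits), 2)

# Build the hash table once: bucket id -> indices of neurons in that bucket.
table = {}
for i, w in enumerate(weights):
    table.setdefault(lsh_bucket(w), []).append(i)

x = rng.normal(size=d)
active = table.get(lsh_bucket(x), [])

# Compute only the neurons in the input's bucket, skipping the rest.
outputs = weights[active] @ x
print(f"computed {len(active)} of {n_neurons} neurons")
```

Because similar vectors tend to land in the same bucket, the skipped neurons are mostly the ones that would have produced small activations anyway, which is how accuracy stays close to the full computation.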

Cutting 95% or more of the computation out of the process could supercharge what AIs are capable of and how much they can process at once, and that's where the big leap I mentioned earlier could happen.

A Giant Leap Forward for AI

We’re expecting a lot from the future of AI software, up to but not including the creation of Skynet.

But we do want our bots to drive for us, and that is proving an incredibly difficult task. We may have seen our digital assistants getting more and more advanced in recent years, but that kind of language recognition is nowhere near as difficult to pull off as safely navigating a giant metal death machine (read: car) through a residential neighborhood without running over someone’s cat.

If the computational needs for image recognition, analysis, and judgment are lowered, then perhaps they can be layered, and AI can start making simultaneous judgments on things like we do when we drive our cars.

But it isn’t just about the self-driving vehicle. Any complex process that you’ve dreamed about delegating to an AI program may soon be at your fingertips; if not because of hashing, then because of so many other simultaneous advancements such as memristor chips, or something coming out of the growing Industrial Internet of Things.

As always, we’ll be keeping our finger on the pulse so we can tell you all about it when the next big AI breakthrough happens.

William McKinney

William is an English teacher, a card-carrying nerd, and he may run for president in 2020. #truefact #voteforedgy
