
Researchers Create New Algorithm to Expedite Processing of Big Data

Researchers have created a new algorithm that could make processing large amounts of data easier than ever.

A new algorithm could be the key to processing large amounts of data at exponentially higher speeds. | Image By asharkyu | Shutterstock


Scientists have reportedly created a new algorithm that enables quick and easy access to Big Data.

In a joint study published in the journal IEEE Transactions on Network and Service Management, researchers from Samara University and the University of Missouri described a new algorithm that could provide faster and more reliable access to Big Data processing centers.

According to the Russian and American computer scientists who worked on the project, their algorithm utilizes a unique routing method to work. This technique allows for fast access to the world’s largest and most powerful data centers, providing a more efficient way of solving high-precision calculations.

“We offer a mechanism that can be in demand by the scientists who conduct experiments on the basis of the Large Hadron Collider at CERN,” Andrey Sukhov, one of the authors of the study and a professor at the Department of Supercomputers and General Informatics of Samara University, said.

“They calculate tasks in laboratories scattered all over the world and make inquiries to the computer centers of CERN. They also need to exchange both textual information and high-resolution streaming video online. The technology that we offer will help them with this.”

In their study, the researchers explained that the algorithm performs well only when four requirements are met. The four are signal bandwidth, data transmission speed in Kbps, cloud service price, and cloud storage capacity.

The algorithm reportedly finds the constrained shortest path that achieves the desired quality and data transmission speed. At its peak, the speed of data transmission can be increased by up to 50 percent.
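The study's exact routing method is not detailed in this article, but the general idea of a constrained shortest path can be sketched in a few lines. The following Python example is a generic illustration, not the researchers' algorithm: it finds the cheapest route through a network whose total delay stays under a limit. The graph, node names, and the cost/delay metrics are all hypothetical.

```python
import heapq

def constrained_shortest_path(graph, src, dst, max_delay):
    """Find the minimum-cost path from src to dst whose total
    delay does not exceed max_delay.

    graph: {node: [(neighbor, cost, delay), ...]}
    Returns (cost, path) or None if no feasible path exists.
    """
    # Priority queue of (cost, delay, node, path): cheapest labels
    # are expanded first, and any extension that would violate the
    # delay bound is pruned immediately.
    queue = [(0, 0, src, [src])]
    # Dominance check: once a node has been expanded with some
    # delay, labels reaching it with equal or higher delay (and,
    # by heap order, equal or higher cost) can be skipped.
    best = {}
    while queue:
        cost, delay, node, path = heapq.heappop(queue)
        if node == dst:
            return cost, path
        if best.get(node, float("inf")) <= delay:
            continue
        best[node] = delay
        for nxt, c, d in graph.get(node, []):
            if delay + d <= max_delay:
                heapq.heappush(queue, (cost + c, delay + d, nxt, path + [nxt]))
    return None

# Toy network: each edge carries (price, latency).
net = {
    "lab":   [("hub_a", 4, 1), ("hub_b", 1, 3)],
    "hub_a": [("dc", 1, 1)],
    "hub_b": [("dc", 1, 1)],
}

# The cheapest route overall (lab -> hub_b -> dc) has delay 4 and
# violates the bound, so the search settles on lab -> hub_a -> dc.
print(constrained_shortest_path(net, "lab", "dc", max_delay=3))
```

Plain Dijkstra optimizes a single metric; adding the delay check turns it into a (simplified) constrained search, which mirrors the article's point that several requirements must hold at once for a route to be acceptable.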

The new algorithm, called "The Neighborhoods Method," allows networks to share their load and pathways with other networks. Its functionality also reportedly stays the same regardless of what kind of Internet connection is being used.

Sukhov and his colleagues said they plan to use the algorithm in their own projects. He claimed that other computer scientists interested in using the algorithm to calculate data about combustion reactions have also approached them.

Where else do you think this new algorithm could find better applications?




Rechelle Ann Fuertes

Rechelle is an SEO content producer, technical writer, researcher, social media manager, and visual artist. She enjoys traveling and spending time anywhere near the sea with family and friends.
