
New Research Shows the Potential Power of Quantum Computers

The long-lauded promises of the quantum computing future have finally been confirmed thanks to new research. | Image By Xenia Design | Shutterstock

Scientists have provided the first unconditional proof that quantum computers can outperform their conventional counterparts.

The IBM AC922 Summit has dethroned the Chinese Sunway and now holds the title of the world’s fastest and smartest supercomputer.

Occupying an area as large as two tennis courts at Oak Ridge National Laboratory in Tennessee, Summit’s peak speed clocks in at 200 petaFLOPS, or 200,000 trillion calculations per second.

Even with this staggering speed, delivered by thousands of computer servers and AI software, Summit still couldn't match what a true, full-scale quantum computer could theoretically achieve with a single small processor consuming a fraction of the power.

However, while the potential of quantum computers is immense compared to that of silicon-based classical computers, it has so far remained largely theoretical.

Now, for the first time, researchers have found proof that quantum computers have a definitive advantage over their classical physics-based counterparts.

Quantum Supremacy: Proof at Last

Three researchers from the Technical University of Munich (TUM), University of Waterloo and IBM say they have now “demonstrated for the first time that quantum computers do indeed offer advantages over conventional computers.”

Robert König, David Gosset, and Sergey Bravyi collaborated to design a quantum circuit that can solve a "difficult" algebraic problem. "Our result shows that quantum information processing really does provide benefits – without having to rely on unproven complexity-theoretic conjectures," says Robert König, professor for the theory of complex quantum systems at TUM.

That doesn't mean conventional computers couldn't solve the same math problem, but they would need far more resources to perform the same task.

In addition, the circuit the team built helps answer the question of why quantum algorithms can outperform classical ones: the advantage comes down to the non-locality principle of quantum mechanics.
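A standard textbook illustration of non-locality (not the researchers' actual construction, which is more involved) is the CHSH game: two separated players who share an entangled state can win more often than any classical strategy allows. The sketch below enumerates every deterministic classical strategy and compares the best one with the known optimal quantum winning probability.

```python
import math
from itertools import product

# CHSH game: referee sends bits x to Alice and y to Bob; they answer
# bits a, b without communicating, and win iff a XOR b == x AND y.

# Best classical strategy: brute-force all deterministic answer tables.
best_classical = 0.0
for a0, a1, b0, b1 in product((0, 1), repeat=4):
    wins = sum(
        ([a0, a1][x] ^ [b0, b1][y]) == (x & y)
        for x in (0, 1) for y in (0, 1)
    )
    best_classical = max(best_classical, wins / 4)

# Optimal quantum strategy (using a shared entangled pair) wins with
# probability cos^2(pi/8), the Tsirelson bound.
best_quantum = math.cos(math.pi / 8) ** 2

print(best_classical)            # 0.75
print(round(best_quantum, 4))    # 0.8536
```

No classical strategy wins more than 75% of the time, while the entanglement-based strategy wins about 85% — a gap of the same flavor (though not the same construction) as the shallow-circuit advantage the team proved.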


In a blog post, IBM cites the example of Shor's algorithm, which is "almost exponentially faster than any known method on a classical computer," and points to the issue of circuit depth.

“… some people are getting concerned that we may be able to break prime-factor-based encryption like RSA much faster on a quantum computer than the thousands of years it would take using known classical methods. However, people skip several elements of the fine print: Scientists prove there are certain problems that require only a fixed circuit depth when done on a quantum computer, no matter how the number of inputs increases. On a classical computer, these same problems require the circuit depth to grow larger as inputs increase.”
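The circuit-depth point can be made concrete with a toy classical example (this is an illustration, not the problem from the paper): even computing the parity of n bits with 2-input XOR gates requires a circuit whose depth grows with n, because a balanced tree of gates has about log2(n) layers. The quantum circuits in the new result, by contrast, solve their target problem at a fixed depth regardless of input size.

```python
import math

def classical_parity_depth(n_inputs: int) -> int:
    """Depth of a balanced tree of 2-input XOR gates
    computing the parity of n_inputs bits."""
    if n_inputs <= 1:
        return 0
    return math.ceil(math.log2(n_inputs))

# Classical circuit depth keeps growing as inputs increase:
for n in (4, 64, 1024):
    print(n, classical_parity_depth(n))  # 4 2 / 64 6 / 1024 10
```

Doubling the input size adds another layer of gates classically, whereas the quantum circuits in question stay at constant depth — that is the "fine print" IBM is pointing to.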

Maybe we’ll get the true and definitive proof of quantum supremacy when we build a universal quantum computer and pit it against a classical computer in a power showdown.

And if quantum computers turn out not to be faster than classical ones, what other advantages would make them worth pursuing?

Zayan Guedim

Trilingual poet, investigative journalist, and novelist. Zed loves tackling the big existential questions and all-things quantum.
