At the crossroads between classical and quantum physics, “small quantum computers” might be the missing link between conventional and quantum computing.

We can say the quantum computing age has officially started once a quantum machine solves a problem that no current supercomputer can. Although researchers are close, we're not there yet.

Scientists expect quantum supremacy to become possible around the 50-qubit threshold, and a couple of major tech companies are inching their way toward that barrier and beyond.

The most advanced quantum processors today pack a few dozen qubits at most, like IBM Q (50 qubits) and Google's Bristlecone (72 qubits), but they're all still at the development stage and have yet to prove their computing capabilities.

Curiously, the Canadian company D-Wave Systems Inc. claims its latest commercial machine, the D-Wave 2000Q, runs on 2,000 qubits, but experts say its quantum annealing technology doesn't amount to true universal quantum computing.

A 2014 study showed that D-Wave's processors were no faster than conventional computers, though the hardware has evolved since then.

The truth is that D-Wave processors are pretty darn fast, but only for a very limited range of problems.

We'd like to see how D-Wave's "most advanced quantum computer in the world" would fare against, for example, the IBM Summit supercomputer, which is capable of performing 200 quadrillion calculations per second.

Read More: US Ends China’s Supercomputer Reign With IBM Summit System

Until any of the ongoing attempts to bring quantum computing to large-scale, universal use pan out, some engineers are looking for a compromise between quantum and classical physics.

A team of physicists from two quantum research centers – the Joint Quantum Institute (JQI) and the Joint Center for Quantum Information and Computer Science (QuICS) – is investigating the potential of "small quantum computers" and the range of their applications (read: the calculations they can make).

The researchers base their experiments on what's called "sampling complexity", or "how easy or hard it is for an ordinary computer to simulate the outcome of a quantum experiment… If an ordinary computer takes a reasonable amount of time to mimic one run of a quantum experiment—by producing samples with approximately the same probabilities as the real thing—the sampling complexity is low; if it takes a long time, the sampling complexity is high."
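To make the idea concrete, here is a minimal sketch (not from the paper, just an illustration using NumPy) of what "classically mimicking one run of a quantum experiment" means: simulate a tiny two-qubit circuit that prepares a Bell state, then draw measurement samples from its output probabilities.

```python
import numpy as np

# Minimal state-vector simulation of a 2-qubit circuit (Bell state),
# illustrating classical sampling of a quantum experiment's outcomes.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

state = np.zeros(4)
state[0] = 1.0                                 # start in |00>
state = CNOT @ np.kron(H, I) @ state           # H on qubit 0, then CNOT

probs = np.abs(state) ** 2                     # Born-rule probabilities
samples = np.random.default_rng(0).choice(4, size=1000, p=probs)

# Only the outcomes 00 and 11 appear, each roughly half the time.
counts = {format(k, "02b"): int((samples == k).sum()) for k in range(4)}
print(counts)
```

For two qubits this is trivial, but each added qubit doubles the state vector, which is why this brute-force approach breaks down quickly and the sampling complexity shoots up.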

Read More: Photonic Quantum Computing may be Closer Than Previously Thought

We don't know for sure how powerful early quantum devices will be. Investigating the crossover point between low and high sampling complexity could provide clues to this question.
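A back-of-the-envelope calculation (our illustration, assuming the naive approach of storing one 16-byte complex amplitude per basis state) suggests why roughly 50 qubits is considered the breaking point for classical simulation:

```python
# Memory needed to hold a full n-qubit state vector:
# 2**n basis states, one complex128 amplitude (16 bytes) each.
def state_vector_bytes(n_qubits: int) -> int:
    return (2 ** n_qubits) * 16

for n in (30, 40, 50):
    print(f"{n} qubits: {state_vector_bytes(n) / 1e12:.1f} TB")
```

At 30 qubits the state vector fits on a laptop, at 40 it needs a large server, and at 50 it demands on the order of 18 petabytes, beyond even the largest supercomputers' memory.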

“Sampling complexity has remained an underappreciated tool, largely because small quantum devices have only recently become reliable. These devices are now essentially doing quantum sampling, and simulating this is at the heart of our entire field… A deeper look into the use of sampling complexity theory from computer science to study quantum many-body physics is bound to teach us something new and exciting about both fields,” said Alexey Gorshkov, a co-author of the new paper.

Do hybrid conventional-quantum systems seem like the logical next step before we get to true universal quantum computers?
