However powerful quantum computers may one day prove, they are currently so error-prone that their ultimate utility is often questioned. Now, however, IBM argues that quantum computing may be entering a new era of utility sooner than expected, with its 127-qubit Eagle quantum processor possibly delivering accurate results on useful problems beyond what even today's supercomputers can handle.

Quantum computers can theoretically find answers to problems that would take classical computers eons to solve. Each additional component, known as a quantum bit or qubit, that a quantum computer connects exponentially increases the number of basic computations, known as quantum gates, it can perform.

These methods can be applied to other more general circuits.

Kristan Temme, IBM

The key problem quantum computers face is that they are notoriously vulnerable to the slightest disturbance. Today's cutting-edge quantum computers typically suffer about one error for every 1,000 operations, while many practical applications demand error rates a billion times lower or more.

Scientists hope one day to build so-called fault-tolerant quantum computers with many redundant qubits. Then, even if some qubits fail, quantum error correction techniques can detect and account for those errors.

However, existing quantum computers are so-called noisy intermediate-scale quantum (NISQ) platforms: they are too error-prone, and have too few qubits, to successfully run quantum error correction.

Despite quantum computing's early stage of development, previous experiments by Google and others have argued that quantum computers have already achieved quantum advantage, also called quantum primacy or quantum supremacy, over conventional computers. Critics countered that such tests only showed quantum computers outperforming classical machines on contrived problems. There remains, then, a heated debate over whether quantum computers are good enough to prove useful right now.

Now IBM reveals that its Eagle quantum processor can accurately simulate physics that ordinary computers find difficult to model beyond a certain level of complexity. Not only are these simulations of real use to researchers, the company says, but the methods they developed could be applied to other kinds of algorithms running on quantum machines today.

In the experiments, the researchers used the IBM quantum computer to model the dynamics of electron spins in a material to predict its properties, such as magnetization. This model is well understood by scientists, making it easier for researchers to validate the correctness of quantum computer results.
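To make this concrete, here is a minimal sketch of the kind of classical brute-force spin-dynamics simulation the quantum computer was checked against. The article doesn't specify the model's details, so the chain size, couplings, and evolution time below are illustrative assumptions, not IBM's actual 127-qubit setup (a system far too large to simulate exactly this way):

```python
import numpy as np
from scipy.linalg import expm

# Pauli matrices
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def op_on(site_op, site, n):
    """Embed a single-qubit operator at `site` in an n-qubit system."""
    out = np.array([[1.0 + 0j]])
    for i in range(n):
        out = np.kron(out, site_op if i == site else I2)
    return out

def ising_hamiltonian(n, J=1.0, h=0.5):
    """Transverse-field Ising chain: H = -J sum Z_i Z_{i+1} - h sum X_i."""
    H = np.zeros((2**n, 2**n), dtype=complex)
    for i in range(n - 1):
        H -= J * op_on(Z, i, n) @ op_on(Z, i + 1, n)
    for i in range(n):
        H -= h * op_on(X, i, n)
    return H

n = 6  # tiny toy system; cost grows as 2^n, which is why brute force fails at scale
H = ising_hamiltonian(n)
psi = np.zeros(2**n, dtype=complex)
psi[0] = 1.0  # all spins up in the Z basis

# Evolve exactly and measure the average magnetization <Z>
U = expm(-1j * H * 1.0)  # time t = 1
psi_t = U @ psi
Mz = sum(np.real(psi_t.conj() @ op_on(Z, i, n) @ psi_t) for i in range(n)) / n
print(f"average magnetization at t=1: {Mz:.4f}")
```

The state vector here has 2^n entries, which is exactly why exact classical simulation becomes intractable beyond a few dozen qubits.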

"Importantly, our methods aren't limited to this particular model," says study co-author Kristan Temme, a quantum physicist at IBM's Thomas J. Watson Research Center, in Yorktown Heights, New York. "These methods can be applied to other more general circuits."

This graph shows quantum computer performance versus state-of-the-art classical approximation methods versus the exact classical brute-force method for a variety of increasingly challenging computational problems. IBM

At the same time, scientists at the University of California, Berkeley, ran versions of these simulations on classical supercomputers to check the quantum computer's performance. They used two sets of techniques. Brute-force simulations gave the most accurate results, but required too much processing power to simulate large, complex systems. Approximation methods, on the other hand, can estimate answers for large systems, but generally grow less and less accurate as the system gets larger.

This graph compares quantum computer performance against classical approximation methods in the regime beyond the capabilities of exact classical brute-force methods. IBM

At the largest scale examined, the quantum computer was about three times faster than the classical approximation methods, finding answers in 9 hours compared with 30. More importantly, the researchers found that as the models scaled up, the quantum computer's results continued to match the classical brute-force simulations, while the classical approximation methods became less accurate.

"What we're doing in this work is demonstrating that we can run quantum circuits on a very large scale and get correct results, something that has always been questioned and that many people have argued wouldn't be feasible on current devices," Temme says.

When comparisons showed the quantum results didn't agree with the classical approximation methods, "we initially thought the experiment had made a mistake," Temme says. It was quite a surprise to learn that the quantum computer matched the classical brute-force simulations rather than the classical approximation methods, he adds.

Hopefully this leads to a back and forth between methods, which the quantum computer will ultimately win.

Kristan Temme, IBM

The scientists conducted tests in which 127 qubits performed 60 layers of quantum gates, 2,880 gates in total. They note that what a quantum computer can theoretically do with just 68 qubits is already beyond what classical brute-force simulations are capable of calculating. While the researchers can't prove that the answers the quantum computer gave when using more than 68 qubits are correct, they say its success in the earlier runs makes them confident they are.

IBM scientists caution that they do not claim their quantum computer is better than classical computing. Future research may soon find that ordinary computers can find correct answers for the calculations used in these experiments, they say.

"Hopefully this leads to a back-and-forth between methods, which the quantum computer will ultimately win," Temme says.

In any case, even if quantum computers don't completely surpass classical computers, these new findings suggest they could still prove useful for problems that ordinary computers find extraordinarily difficult. "This suggests that we may now be entering a new era of utility for quantum computing," says Darío Gil, senior vice president and director of IBM Research.

IBM notes that its quantum hardware showed more stable qubits and lower error rates than before. However, the new findings hinged on what IBM calls quantum error mitigation techniques, which examine the output of a quantum computer to account for and eliminate the noise its circuits have experienced.

The quantum error mitigation strategy IBM used in the new study, zero-noise extrapolation, repeats quantum computations at several different levels of noise that the quantum processor may experience from its environment. This lets the researchers extrapolate what the quantum computer would have calculated in the absence of noise.

Ultimately, we want to have a fault-tolerant quantum computer. The long-term direction must be to bridge these results to a point where we can use quantum error correction.

Kristan Temme, IBM

"Both our hardware and our error mitigation methods are now at the level where they can be used to start implementing the vast majority of the near-term algorithms proposed in the last 5 to 10 years, to see which algorithms actually provide a quantum advantage in practice," Temme says.

One drawback of zero-noise extrapolation is that it requires the quantum computer to run its circuits multiple times. "For the zero-noise extrapolation method we used here, we need to run the same experiment at three different noise levels," Temme says. "This is a cost that must be paid for every data point in the computation, that is, every time we use the processor."

IBM notes that these new results represent the first demonstration of quantum error mitigation at this scale. "We think there is still some room for improvement in these methods," Temme says. Future research may also test whether quantum error mitigation can be applied as broadly as the company hopes, to other types of quantum applications beyond this model, he adds.

IBM says that its quantum computers, running both in the cloud and on premises at partner locations in Japan, Germany, and the United States, will be powered by at least 127 qubits over the next year.

"Eventually, we'll want to have a fault-tolerant quantum computer," says Temme. "The long-term direction must be to bridge these results to a point where we can use quantum error correction. We expect this to drive hardware development, where every improvement in components now translates into more complex calculations that can be performed, leading to a smoother transition to a fault-tolerant device."

The scientists detailed their findings June 14 in the journal *Nature*.


Image Source : spectrum.ieee.org