
Ordinary computers can beat Google’s quantum computer after all | Science


When the era of quantum computing dawned 3 years ago, its sunrise may have been hiding behind a cloud. In 2019, Google researchers claimed they had passed a milestone known as quantum supremacy when their quantum computer Sycamore performed in 200 seconds an abstruse calculation that, they said, would tie up a supercomputer for 10,000 years. Now, Chinese scientists say they have performed the same computation in a few hours using ordinary processors. A supercomputer, they say, could beat Sycamore outright.

“I think they’re right that if they had access to a big enough supercomputer, they could have simulated the … task in a matter of seconds,” said Scott Aaronson, a computer scientist at the University of Texas, Austin. The advance takes a bit of the shine off Google’s claim, said Greg Kuperberg, a mathematician at the University of California, Davis. “Getting to within 300 feet of the summit is less exciting than reaching the summit.”

Still, the promise of quantum computing remains intact, say Kuperberg and others. And Sergio Boixo, chief scientist for Google Quantum AI, said in an email that the Google team knew its edge might not last long. “In our 2019 paper, we said that classical algorithms would improve,” he said. Even so, “we do not think this classical approach can keep up with quantum circuits in 2022 and beyond.”

The “problem” Sycamore solved was designed to be hard for a conventional computer but as easy as possible for a quantum computer, which manipulates qubits that can be set to 0, 1, or, thanks to quantum mechanics, any combination of 0 and 1 at the same time. Together, Sycamore’s 53 qubits, tiny resonating electrical circuits made of superconducting metal, can encode any number from 0 to 2^53 (about 9 quadrillion), or even all of them at once.
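
For a sense of scale (a quick check of the arithmetic, not something from the article), a couple of lines of Python confirm that 53 two-level qubits label 2^53 basis states, roughly 9 quadrillion:

    # Illustrative arithmetic only: 53 qubits label 2**53 computational basis states.
    n_qubits = 53
    n_states = 2 ** n_qubits
    print(f"{n_states:,}")  # 9,007,199,254,740,992, i.e. roughly 9 quadrillion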

Starting with all the qubits set to 0, the Google researchers applied a random but fixed set of logical operations, or gates, to single qubits and pairs of qubits over 20 cycles, then read out the qubits. Roughly speaking, quantum waves representing all possible outputs sloshed among the qubits, and the gates created interference that reinforced some outputs and canceled others. So some outputs should have appeared with greater probability than others. Over millions of trials, a spiky output pattern emerged.
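
To make the sampling idea concrete, here is a brute-force sketch in Python; it is not Google’s code and is far smaller than Sycamore (5 qubits instead of 53, generic random gates instead of Sycamore’s gate set). Only numpy is assumed, and all names are illustrative. It shows how 20 cycles of random gates leave some bitstrings much more likely than others:

    # Toy sketch of random circuit sampling: brute-force state-vector simulation
    # of a few qubits, showing the "spiky" output distribution that interference
    # produces. Illustrative only; assumes numpy.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 5                                   # 5 qubits -> 32 possible bitstrings
    state = np.zeros(2 ** n, dtype=complex)
    state[0] = 1.0                          # all qubits start in 0

    def random_unitary():
        """A roughly Haar-random 2x2 unitary from a QR decomposition."""
        m = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))
        q, _ = np.linalg.qr(m)
        return q

    def apply_1q(state, gate, q):
        """Apply a 2x2 gate to qubit q of the state vector."""
        psi = np.moveaxis(state.reshape([2] * n), q, 0)
        psi = np.tensordot(gate, psi, axes=(1, 0))
        return np.moveaxis(psi, 0, q).reshape(-1)

    def apply_cz(state, q):
        """Apply a controlled-Z gate to qubits q and q+1."""
        cz = np.diag([1, 1, 1, -1]).astype(complex)
        psi = np.moveaxis(state.reshape([2] * n), [q, q + 1], [0, 1])
        psi = (cz @ psi.reshape(4, -1)).reshape([2, 2] + [2] * (n - 2))
        return np.moveaxis(psi, [0, 1], [q, q + 1]).reshape(-1)

    for cycle in range(20):                 # 20 cycles, as in the experiment
        for q in range(n):
            state = apply_1q(state, random_unitary(), q)
        for q in range(cycle % 2, n - 1, 2):  # entangle alternating pairs
            state = apply_cz(state, q)

    probs = np.abs(state) ** 2
    probs /= probs.sum()
    samples = rng.choice(2 ** n, size=100_000, p=probs)
    # A histogram of `samples` is spiky: a handful of bitstrings dominate.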

Google researchers argued that simulating those interference effects would overwhelm even Summit, a supercomputer at Oak Ridge National Laboratory with 9,216 central processing units and 27,648 faster graphics processing units (GPUs). Researchers with IBM, which developed Summit, quickly countered that if they exploited every bit of hard drive space available to the computer, it could handle the computation in a few days. Now, Pan Zhang, a statistical physicist at the Institute of Theoretical Physics of the Chinese Academy of Sciences, and colleagues have shown how to beat Sycamore in a paper in press at Physical Review Letters.

Following others, Zhang and colleagues recast the problem as a 3D mathematical array called a tensor network. It consists of 20 layers, one for each cycle of gates, with each layer containing 53 dots, one for each qubit. Lines connect the dots to represent the gates, and each gate is encoded by a tensor, a 2D or 4D grid of complex numbers. Running the simulation then reduces, essentially, to multiplying all the tensors. “The advantage of the tensor network approach is that we can use multiple GPUs to perform computations in parallel,” Zhang said.
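
A minimal sketch of that idea, assuming plain numpy and a toy circuit of two qubits and one cycle (this is not the authors’ code): each gate becomes a small tensor, and running the simulation collapses into a single contraction, a sum over the network’s shared indices. At full scale such contractions are spread across many GPUs; numpy.einsum is enough to show the principle:

    # Minimal tensor-network contraction: two qubits, one cycle of gates.
    import numpy as np

    zero = np.array([1.0, 0.0])                        # each qubit starts in |0>
    h = np.array([[1, 1], [1, -1]]) / np.sqrt(2)       # single-qubit gate: a 2x2 tensor
    cz = np.diag([1.0, 1, 1, -1]).reshape(2, 2, 2, 2)  # two-qubit gate: a 2x2x2x2 tensor

    # Contract the whole network in one einsum call: indices a, b feed the
    # single-qubit gates, i, j feed the two-qubit gate, k, l are the outputs.
    psi = np.einsum("a,b,ia,jb,klij->kl", zero, zero, h, h, cz)
    print(np.abs(psi.reshape(-1)) ** 2)                # probabilities of 00, 01, 10, 11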

Zhang and colleagues also relied on a key insight: Sycamore’s computation was far from exact, so theirs did not need to be either. Sycamore calculated the distribution of outputs with an estimated fidelity of about 0.2%, just enough to distinguish its fingerprintlike spikiness from the noise in the circuitry. So Zhang’s group traded accuracy for speed by cutting some lines in its network and eliminating the corresponding gates. Losing just eight lines made the computation 256 times faster while maintaining a fidelity of 0.37%.
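
The factor of 256 is consistent with a simple reading, though the paper’s exact bookkeeping may differ: each severed line spares a sum over one two-valued index, roughly halving the contraction cost, and eight halvings give 2^8 = 256.

    # Back-of-the-envelope check (mine, not the authors'):
    print(2 ** 8)  # 256, matching the reported speedup from cutting eight lines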

The researchers calculated the output pattern for 1 million of the 9 quadrillion possible number strings, relying on an innovation of their own to obtain a truly random, representative set. The computation took 15 hours on 512 GPUs and produced the telltale spiky output. “It’s fair to say that the Google experiment was simulated on a conventional computer,” said Dominik Hangleiter, a quantum computer scientist at the University of Maryland, College Park. On a supercomputer, the computation would take a few dozen seconds, Zhang said, 10 billion times faster than the Google team estimated.
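
As a rough sanity check on that last figure (my arithmetic, not the paper’s): 10,000 years is about 3 x 10^11 seconds, and dividing by a few dozen seconds lands near 10 billion.

    # Rough check of the "10 billion times faster" comparison.
    seconds_in_10k_years = 10_000 * 365.25 * 24 * 3600   # about 3.2e11 seconds
    print(seconds_in_10k_years / 30)                      # about 1e10, i.e. 10 billion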

The development highlights the pitfalls of racing a quantum computer against a conventional one, the researchers said. “There is an urgent need for better quantum supremacy experiments,” Aaronson said. Zhang suggested a more practical approach: “We should find some real-world applications to demonstrate the quantum advantage.”

However, Google’s demonstration was not just hype, the researchers said. Sycamore required far fewer operations and less power than a supercomputer would, Zhang noted. And if Sycamore had had slightly higher fidelity, he said, his team’s simulation could not have kept up. As Hangleiter put it, “The Google experiment did what it needed to do: start this race.”


