For 30 years, researchers have pursued the universal quantum computer, a device that could run any quantum algorithm, with varying degrees of success. Now, a team in California and Spain has built an experimental prototype of such a device that can solve a wide range of problems in fields such as chemistry and physics, and that has the potential to be scaled up to larger systems.
Both IBM and a Canadian company called D-Wave have created functioning quantum computers using different approaches. But neither firm’s devices are easily scaled up to the many quantum bits (qubits) needed to solve problems that classical computers cannot.
Computer scientists at Google’s research laboratories in Santa Barbara, California, and physicists at the University of California at Santa Barbara and the University of the Basque Country in Bilbao, Spain, describe their new device online in Nature.
“It’s terrific work in many respects, and is filled with valuable lessons for the quantum computing community,” says Daniel Lidar, a quantum-computing expert at the University of Southern California in Los Angeles.
The Google prototype combines the two main approaches to quantum computing. One approach constructs the computer’s digital circuits using qubits arranged to solve a specific problem. This is analogous to a tailor-made circuit of classical bits in a conventional microprocessor.
Much of quantum computing theory is based on this approach, which includes methods for correcting errors that might otherwise derail a calculation. So far, practical implementations have been possible only with a handful of qubits.
The other approach is called adiabatic quantum computing (AQC). Here, the computer encodes a given problem in the states of a group of qubits, gradually evolving and adjusting the interactions between them to “shape” their collective quantum state and reach a solution. In principle, just about any problem can be encoded into the same group of qubits.
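The adiabatic recipe can be sketched as a toy numerical simulation: start in the easy-to-prepare ground state of a simple "driver" Hamiltonian, then slowly morph it into a "problem" Hamiltonian whose ground state encodes the answer. The sketch below (illustrative parameters, a few simulated qubits, not the real hardware) checks that a slow enough evolution ends close to the problem's ground state:

```python
import numpy as np

# Pauli matrices and identity
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
I = np.eye(2, dtype=complex)

def kron_all(ops):
    """Tensor product of a list of single-qubit operators."""
    out = np.array([[1]], dtype=complex)
    for op in ops:
        out = np.kron(out, op)
    return out

n = 3  # a tiny toy chain of 3 qubits

# Driver Hamiltonian H0 = -sum_j X_j: its ground state (the uniform
# superposition) is easy to prepare.
H0 = sum(-kron_all([X if i == j else I for i in range(n)]) for j in range(n))

# Problem Hamiltonian H1: ferromagnetic Z-Z couplings plus a small field
# on the first qubit so the ground state |000> is unique (made-up model).
H1 = sum(-kron_all([Z if i in (j, j + 1) else I for i in range(n)])
         for j in range(n - 1))
H1 = H1 - 0.5 * kron_all([Z] + [I] * (n - 1))

# Slowly interpolate H(s) = (1-s)*H0 + s*H1 while evolving the state.
T, steps = 50.0, 2000
dt = T / steps
psi = np.linalg.eigh(H0)[1][:, 0]  # start in the ground state of H0
for k in range(steps):
    s = (k + 0.5) / steps
    H = (1 - s) * H0 + s * H1
    # one small step of Schrodinger time evolution under the current H
    w, V = np.linalg.eigh(H)
    psi = V @ (np.exp(-1j * w * dt) * (V.conj().T @ psi))

# How close did we land to the true ground state of the problem?
gs = np.linalg.eigh(H1)[1][:, 0]
fidelity = abs(gs.conj() @ psi) ** 2
print(round(fidelity, 3))
```

If the evolution time T is made too short, the state gets "left behind" at the avoided level crossings and the final fidelity drops, which is exactly the failure mode the adiabatic condition guards against.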
This analog approach is limited by the effects of random noise, which introduces errors that cannot be corrected as systematically as in digital circuits. And there’s no guarantee that this method can solve every problem efficiently, says computer scientist Rami Barends, a member of the Google team.
Yet it is AQC that has yielded the first commercial quantum devices, made by D-Wave in Burnaby, British Columbia, which sell for about $15 million apiece. Google owns a D-Wave device, but Barends and colleagues think that there’s a better way to do AQC.
In particular, they want to find some way to implement error correction. Without it, scaling up AQC will be difficult, because errors accumulate more quickly in larger systems. The team thinks the first step to achieving that is to combine the AQC method with the digital approach’s error-correction capabilities.
To do that, the Google team uses a row of nine solid-state qubits, fashioned from cross-shaped films of aluminium about 400 micrometers from tip to tip. These are deposited onto a sapphire surface. The researchers cool the aluminium to 0.02 kelvin, turning the metal into a superconductor with no electrical resistance. Information can then be encoded into the qubits in their superconducting state.
The interactions between neighboring qubits are controlled by ‘logic gates’ that steer the qubits digitally into a state that encodes the solution to a problem. As a demonstration, the researchers instructed their array to simulate a row of magnetic atoms with coupled spin states — a problem thoroughly explored in condensed-matter physics. They could then look at the qubits to determine the lowest-energy collective state of the spins that the atoms represented.
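The target of such a demonstration, the lowest-energy collective state of a row of coupled spins, can be computed directly for small chains. This sketch (a generic transverse-field Ising chain with illustrative parameters, not the experiment's actual model) finds the ground-state energy the qubit array would be steered toward:

```python
import numpy as np

Z = np.diag([1.0, -1.0])
X = np.array([[0.0, 1.0], [1.0, 0.0]])
I2 = np.eye(2)

def op(single, site, n):
    """Embed a single-qubit operator at `site` in an n-qubit chain."""
    mats = [single if k == site else I2 for k in range(n)]
    out = mats[0]
    for m in mats[1:]:
        out = np.kron(out, m)
    return out

# A row of n coupled spins: antiferromagnetic Z-Z couplings plus a
# transverse field -- a standard condensed-matter model.
n, J, h = 4, 1.0, 0.5
H = sum(J * op(Z, j, n) @ op(Z, j + 1, n) for j in range(n - 1))
H = H - sum(h * op(X, j, n) for j in range(n))

# Exact diagonalization: the smallest eigenvalue is the energy of the
# lowest-energy collective spin state.
energies = np.linalg.eigvalsh(H)
print(round(energies[0], 4))
```

For a handful of spins this is trivial on a laptop; the matrix doubles in size with each added spin, which is why larger chains quickly become a job for quantum hardware rather than classical diagonalization.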
This is a fairly simple problem for a classical computer to solve. But the new Google device can also handle so-called ‘non-stoquastic’ problems, which classical computers cannot simulate efficiently. These include simulations of the interactions between many electrons, which are needed for accurate computer modelling in chemistry. The ability to simulate molecules and materials at the quantum level could be one of the most valuable applications of quantum computing.
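"Stoquastic" has a precise meaning: a Hamiltonian is stoquastic (in a given basis) when all of its off-diagonal matrix elements are real and non-positive, which is what lets classical Monte Carlo methods sample it without the notorious "sign problem". A quick illustrative check, using toy matrices rather than anything from the paper:

```python
import numpy as np

def is_stoquastic(H, tol=1e-12):
    """True if every off-diagonal element of H is real and <= 0
    (the standard definition, relative to the given basis)."""
    off = H - np.diag(np.diag(H))
    return bool(np.all(np.abs(off.imag) < tol)
                and np.all(off.real <= tol))

X = np.array([[0.0, 1.0], [1.0, 0.0]], dtype=complex)
XX = np.kron(X, X)  # a two-qubit X-X coupling

# A transverse-field term -X has off-diagonal elements -1: stoquastic.
print(is_stoquastic(-X))   # True
# A +X-X coupling has off-diagonal elements +1: non-stoquastic.
print(is_stoquastic(XX))   # False
```

Terms with positive off-diagonal elements of this kind arise naturally when electron interactions are mapped onto qubits, which is why chemistry simulations tend to land in the non-stoquastic, classically hard regime.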
This new approach should enable a computer with quantum error correction, says Lidar. Although the researchers did not demonstrate that here, the team has previously shown how that might be achieved on its nine-qubit device.
“With error correction, our approach becomes a general-purpose algorithm that is, in principle, scalable to an arbitrarily large quantum computer,” says Alireza Shabani, another member of the Google team.
The Google device is still very much a prototype. But Lidar says that in a couple of years, devices with more than 40 qubits could become a reality.
“At that point,” he says, “it will become possible to simulate quantum dynamics that is inaccessible on classical hardware, which will mark the advent of ‘quantum supremacy’.”
This article is reproduced with permission and was first published on June 8, 2016.