From Simons Science News.

Someday, quantum computers may be able to solve complex optimization problems, quickly mine huge data sets, simulate the kind of physics experiments that currently require billion-dollar particle accelerators, and accomplish many other tasks beyond the scope of present-day computers. That is, if they are ever built. But even as daunting technical challenges keep the dream at bay, theorists are increasingly putting the ideas and techniques of quantum computing to work solving deep, long-standing problems in classical computer science, mathematics and cryptography.

“There are quite vigorous debates about whether quantum computers will ever actually be built,” said Chris Peikert, a cryptographer and computer scientist at Georgia Institute of Technology. “But that’s a separate question from whether quantum techniques or quantum algorithms can help you solve problems in new ways.”

In recent years, quantum ideas have helped researchers prove the security of promising data encryption schemes called lattice-based cryptosystems, some applications of which can shroud users’ sensitive information, such as DNA, even from the companies that process it. A quantum computing proof also led to a formula for the minimum length of error-correcting codes, which are safeguards against data corruption.

Quantum ideas have also inspired a number of important theoretical results, such as a refutation of an old, erroneous algorithm that claimed to efficiently solve the famously difficult traveling salesman problem, which asks how to find the fastest route through multiple cities.

“If it only happened once it would be a coincidence, but there are so many instances when we ‘think quantumly’ and come up with a proof,” said Oded Regev, a computer scientist at New York University.

This recurring theme has led some researchers to argue that quantum computing is not an esoteric subfield of computer science, but rather a generalization of classical computing, in much the same way that polygons are a generalization of triangles. Just as polygons can have any number of sides while triangles have only three, quantum computers can perform operations represented by any complex numbers (positive or negative, real or imaginary), while the corresponding operations on classical computers use only nonnegative real numbers, which behave like probabilities.
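The contrast above can be made concrete in a few lines of numpy. This is an illustrative sketch, not anyone's actual proof technique: a classical probabilistic bit is a vector of nonnegative reals summing to 1, while a qubit is a vector of complex amplitudes whose squared magnitudes sum to 1, with the classical case recovered by squaring.

```python
import numpy as np

# A classical probabilistic bit: nonnegative reals summing to 1.
classical_bit = np.array([0.7, 0.3])
assert np.all(classical_bit >= 0) and np.isclose(classical_bit.sum(), 1.0)

# A qubit: complex amplitudes whose squared magnitudes sum to 1.
# Amplitudes may be negative or imaginary, which is what allows
# interference effects that classical probabilities cannot produce.
qubit = np.array([1 / np.sqrt(2), -1j / np.sqrt(2)])
assert np.isclose(np.sum(np.abs(qubit) ** 2), 1.0)

# Measurement probabilities come from squaring amplitude magnitudes,
# recovering an ordinary probability vector as the special case.
probs = np.abs(qubit) ** 2
print(probs)  # [0.5 0.5]
```

In this sense the classical vector sits inside the quantum one the way a triangle sits among polygons: restrict the amplitudes to nonnegative reals and the quantum description collapses to the classical one.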

As the more general case, quantum ideas are a powerful tool in developing more specific classical computing proofs. “There are a number of classical problems that have nothing to do with quantum, but that are most easily analyzed by generalizing to the quantum level, proving something using quantum information theory, and scaling back the result to the classical level,” said Ronald de Wolf, a theoretical computer scientist at the Dutch Centre for Mathematics and Computer Science.

Currently, it is estimated that fewer than 5 percent of theoretical computer scientists study quantum computing. But researchers say that recent success from “thinking quantumly” has led a growing number of theorists to brush up on their physics. “These very striking spinoffs of quantum computing have actually drawn classical computer scientists into learning something about quantum computing,” said Scott Aaronson, a theoretical computer scientist at the Massachusetts Institute of Technology.

The goal of quantum computing is to harness the peculiar behavior of particles at the quantum scale in order to perform calculations that aren’t believed to be feasible with conventional computers. An ordinary computer stores “bits” of information in transistors, which, like switches, can be configured in one of two states, represented by “1” or “0.” A quantum computer stores “qubits” of information in subatomic particles, such as electrons or photons, which can exist in state 1 or 0, or in a superposition of both states, and which can become entangled with one another, so that measuring one qubit determines the state of the other.

Superposition and entanglement cause qubits to behave very differently from bits. Whereas a two-bit circuit in a conventional computer can be in only one of four possible states (0 and 0, 0 and 1, 1 and 0, or 1 and 1), a pair of qubits can be in a combination of all four. As the number of qubits in the circuit increases, the number of possible states, and thus the amount of information contained in the system, increases exponentially. A quantum computer with just a few hundred qubits would be able to solve certain problems more quickly than today’s supercomputers.
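The two-qubit picture described above can be simulated directly with small matrices. The sketch below is a toy numpy simulation (not tied to any real quantum hardware): it builds an entangled Bell state, an equal superposition of the 00 and 11 states, using the standard Hadamard and CNOT gate matrices, and then shows the exponential growth in the number of amplitudes.

```python
import numpy as np

# Two classical bits occupy exactly one of four states; two qubits carry a
# complex amplitude for each of the four basis states |00>, |01>, |10>, |11>.
state = np.array([1, 0, 0, 0], dtype=complex)  # start in |00>

# Standard textbook gate matrices:
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard: creates superposition
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)  # entangles the pair

# Hadamard on the first qubit, then CNOT, yields a Bell state: an equal
# superposition of |00> and |11>, so the two qubits are entangled.
state = CNOT @ np.kron(H, I) @ state
print(np.round(state, 3))  # [0.707+0.j 0.+0.j 0.+0.j 0.707+0.j]

# An n-qubit state needs 2**n amplitudes: the exponential growth in the text.
for n in (2, 10, 300):
    print(n, "qubits ->", 2 ** n, "amplitudes")
```

The last loop shows why a few hundred qubits are interesting: describing a 300-qubit state classically would take more amplitudes than there are atoms in the observable universe.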

The only problem is that no one has managed to construct a quantum circuit with more qubits than you can count on both hands. Chris Lirakis, a physicist in the superconducting quantum computation group at IBM Research, explained that in order to keep the delicate entanglement of a system of qubits from collapsing, the system must be isolated and cooled to a temperature near absolute zero. At the same time, the qubits must be spaced about a centimeter apart to prevent an operation performed on one qubit from altering the states of neighboring ones. This challenge would make a thousand-qubit system far too large to fit into the kinds of refrigerators that can achieve such extreme cooling.

“There are a lot of really serious engineering challenges that need to be brought to bear in order to make the system scalable,” Lirakis said. “It’s this tug-of-war between all these different issues.”

Regev, who worked with Peikert in using quantum ideas to prove the security of lattice-based cryptosystems, says he hopes quantum computers will be built in his lifetime so he can see them in action. “But quantum has made such a great impact that even if quantum computers are never built, I wouldn’t care too much,” he said.

As quantum techniques become more popular among computer scientists, they will likely produce more classical results. “It’s these results that convince me that even if the universe had not been quantum mechanical,” Aaronson said, “eventually computer scientists would have invented quantum computing as a proof tool.”

Reprinted with permission from Simons Science News, an editorially independent division of SimonsFoundation.org whose mission is to enhance public understanding of science by covering research developments and trends in mathematics and the computational, physical and life sciences.