Jeroen van den Brink and his colleagues at Leiden University in the Netherlands, however, suggest that even perfect isolation would not keep decoherence at bay. A process called spontaneous symmetry breaking will ruin the delicate state required for quantum computing. In the case of one proposed device based on superconducting quantum bits (qubits), they predict that this new source of decoherence would degrade the qubits after just a few seconds.
A key feature of qubits is their ability to be in a so-called superposition; in essence, they can be 0 and 1 simultaneously, unlike bits in a standard computer, which must have a definite value. A qubit in a superposition is typically in a highly symmetrical state. For example, in a superconducting qubit a small electric current circulates in a loop both clockwise and counterclockwise at the same time. Spontaneous symmetry breaking disturbs that balance. The process occurs throughout physics: a ball perched at the top of a hill, for instance, tends to roll down one side or the other, ruining the symmetrical (if unstable) state of the balanced ball. In the case of the superconducting loop, spontaneous symmetry breaking tends to force the qubit into a definite state, ruining the superposition.
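The idea of a symmetric superposition, and of symmetry breaking collapsing it to a definite outcome, can be sketched numerically. This is a toy illustration only, not a model of any real device: the two basis states stand in for the two circulating-current directions of the superconducting loop.

```python
import numpy as np

# Toy illustration: a qubit as a 2-component complex state vector.
# The basis states stand in for the two current directions in the loop.
clockwise = np.array([1.0, 0.0], dtype=complex)         # |0>
counterclockwise = np.array([0.0, 1.0], dtype=complex)  # |1>

# Equal superposition: the current flows "both ways at once".
superposition = (clockwise + counterclockwise) / np.sqrt(2)

# Born rule: each outcome is equally likely in this symmetric state.
probs = np.abs(superposition) ** 2
print(probs)  # [0.5 0.5]

# Symmetry breaking (or a measurement) forces a definite outcome,
# destroying the superposition.
outcome = np.random.choice([0, 1], p=probs)
```

The symmetric state assigns probability 1/2 to each current direction; once a definite outcome is selected, that symmetry is gone.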
The Leiden researchers' result applies only to qubits that are composed of a large number of particles. Superconducting qubits fit that bill, because the electric current consists of many billions of electrons. The result does not apply to qubits based on single particles, such as an ion suspended in an electromagnetic trap or a single electron in a quantum dot on a chip. Indeed, in August physicists at the National Institute of Standards and Technology demonstrated single-ion qubits with a coherence time of more than 10 seconds.
Not everyone agrees that the constraint of a few seconds is a serious obstacle for superconducting qubits. John Martinis of the University of California at Santa Barbara says that one second "is fine for us experimentalists, since I think other physics will limit us well before this timescale." According to theorist Steven M. Girvin of Yale University, "if we could get a coherence time of one second for a superconducting qubit, that would mean that decoherence would probably not be a limitation at all." That is because quantum error correction can overcome decoherence once the coherence time is long enough, Girvin argues. By running on batches of qubits that each last for only a second, a quantum computer as a whole could continue working indefinitely.
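Girvin's argument rests on a standard property of quantum error correction: encoding one logical qubit across several physical qubits suppresses errors, provided each physical error is rare enough per correction cycle. A minimal sketch of that arithmetic, using illustrative numbers (not figures from the article) and the simplest possible code, a three-qubit repetition code with majority voting:

```python
# Toy sketch: why long coherence times make error correction pay off.
# For a 3-qubit repetition code with per-cycle physical error
# probability p, the encoded bit fails only if 2 or more qubits err:
#   p_logical = 3*p^2*(1 - p) + p^3
def logical_error(p):
    return 3 * p**2 * (1 - p) + p**3

# Illustrative values: as p shrinks, p_logical shrinks much faster.
for p in (0.1, 0.01, 0.001):
    print(f"p = {p}: logical error = {logical_error(p):.2e}")
```

Once the physical error probability per cycle is small, the logical error rate falls quadratically below it, so repeating the correction cycle on fresh batches of short-lived qubits can, in principle, sustain a computation indefinitely.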
So far superconducting qubits in the laboratory last about 500 nanoseconds before decoherence takes its toll. Girvin points out that decoherence times were just nanoseconds a few years ago, so that 500 nanoseconds "represents tremendous progress."