Even if experiments cannot yet tackle the measurement problem fully, they have much to contribute to a very hot field: quantum computing. A classical computer is built of transistors that switch between 0 and 1. In a quantum computer, however, the “transistors” remain in a superposition of 0 and 1 (each such two-state element is called a quantum bit, or qubit); calculations proceed via interactions between superposed states until a measurement is performed. Then the superpositions collapse, and the machine delivers a final result. In theory, because it could process many possible answers simultaneously, a quantum computer would accomplish in seconds tasks, such as factoring large numbers to break codes, that would take a classical machine years.
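The essential idea of superposition and collapse can be sketched in a few lines of code. This is only a toy illustration, not how a real quantum computer is programmed: a qubit is represented by two amplitudes, and "measurement" picks an outcome with the Born-rule probabilities.

```python
import random

def measure(alpha, beta):
    """Collapse a qubit with amplitudes (alpha, beta) for the states 0 and 1.

    The probability of reading 0 is |alpha|^2; otherwise the result is 1.
    """
    p0 = abs(alpha) ** 2
    return 0 if random.random() < p0 else 1

# An equal superposition assigns amplitude 1/sqrt(2) to each state, so
# repeated measurements come out 0 about half the time and 1 the other half.
amp = 2 ** -0.5
counts = [measure(amp, amp) for _ in range(10000)]
print(sum(counts))  # roughly 5000 ones out of 10,000 trials
```

Before measurement the qubit carries both possibilities at once; after measurement only one survives, which is why a quantum computation must finish its work before anything is read out.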
In December 1995 researchers successfully created quantum two-bit systems. Monroe and his colleagues crafted a logic element called a controlled-NOT gate out of a beryllium ion. The ion is trapped and cooled to its lowest vibrational state. This state and the first excited vibrational state constitute one bit. The second bit is the spin of one of the ion’s electrons. Laser pulses can force the bits into superpositions and flip the second bit depending on the state of the first bit. Other variations of gates couple two photons via an atom in a cavity or transmit an entangled pair of photons through a network of detectors.
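Mathematically, the controlled-NOT gate is just a fixed matrix acting on the four amplitudes of a two-qubit state. The sketch below shows that abstraction; in the NIST experiment the same operation is realized physically by laser pulses acting on vibrational and spin states.

```python
# Two-qubit amplitudes ordered as the states 00, 01, 10, 11;
# the first bit is the control, the second is the target.
# CNOT flips the target bit exactly when the control bit is 1.
CNOT = [
    [1, 0, 0, 0],
    [0, 1, 0, 0],
    [0, 0, 0, 1],
    [0, 0, 1, 0],
]

def apply(gate, state):
    """Multiply a 4x4 gate matrix into a 4-amplitude state vector."""
    return [sum(gate[i][j] * state[j] for j in range(4)) for i in range(4)]

# Control bit 1, target bit 0: the target flips, 10 -> 11.
print(apply(CNOT, [0, 0, 1, 0]))  # [0, 0, 0, 1]

# Fed a control bit in superposition, the gate entangles the pair:
# (|00> + |10>)/sqrt(2)  ->  (|00> + |11>)/sqrt(2).
amp = 2 ** -0.5
print(apply(CNOT, [amp, 0, amp, 0]))
```

The second example is the crucial one: because the control can be superposed, a single gate application leaves the two bits correlated in a way no classical logic element can reproduce.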
Yet the creation of a useful quantum computer, relying on superpositions of thousands of ions performing billions of operations, remains a distant prospect. The problem? Loss of superposition. The logic gates must be fast enough to work before the qubits lose coherence. Using data from the NIST gate experiment, Haroche and Raimond calculated in an August 1996 Physics Today article that given a gate time of 0.1 millisecond, the bits would have to remain in a superposition for at least a year to complete a meaningful computation (in this case, factoring a 200-digit number).
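The arithmetic behind that estimate is simple. The operation count below is an assumption chosen to match the article's figures (a few hundred billion gate operations for the factoring task), not a number from the Physics Today calculation itself:

```python
# Rough back-of-the-envelope version of the Haroche-Raimond estimate.
gate_time_s = 1e-4        # 0.1 millisecond per gate, from the NIST experiment
operations = 3e11         # assumed gate count for factoring a 200-digit number
seconds_per_year = 3.15e7

runtime_years = gate_time_s * operations / seconds_per_year
print(round(runtime_years, 1))  # about 1 -- a full year of sustained coherence
```

Since trapped-ion superpositions in these experiments survived for far less than a second, the gap between what the gates need and what the hardware delivers spans many orders of magnitude.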
Other physicists are less pessimistic, since error-correcting codes (which are indispensable in classical computing) might be the solution. “It gives you instructions on how to repair the damage,” says David DiVincenzo of the IBM Thomas J. Watson Research Center in Yorktown Heights, N.Y.
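The classical idea, in its simplest form, is redundancy. Quantum error-correcting codes are far subtler, since they must repair damage without directly measuring the qubits, but a three-bit repetition code shows the flavor of "instructions on how to repair the damage":

```python
def encode(bit):
    """Repetition code: store three copies of the bit."""
    return [bit] * 3

def decode(received):
    """Majority vote recovers the bit despite any single flipped copy."""
    return 1 if sum(received) >= 2 else 0

codeword = encode(1)
codeword[0] ^= 1          # noise corrupts one of the three copies
print(decode(codeword))   # 1 -- the original bit is still recovered
```

A single error in any one copy is outvoted by the other two; only two or more simultaneous errors fool the decoder. Quantum codes achieve an analogous protection by spreading one logical qubit across several physical ones.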
Moreover, DiVincenzo points out that a new method of quantum computation, making use of nuclear magnetic resonance (NMR) techniques, could raise coherence times to a second or more. Say a liquid—a cup of coffee—is placed in a magnetic field; because of thermal vibration and other forces, only about one out of every million nuclei in the caffeine molecules would line up with the magnetic field. These standouts can be manipulated with radio waves to put their spins in a superposition of up and down. Maintaining coherence is easier here than in the other techniques because the nuclear spins undergoing the superpositions are well protected from the environment: the surrounding turmoil of seething molecules, in its mad scramble, averages out to zero. The calculating caffeine sits effectively in the calm eye of a hurricane. Two groups have recently demonstrated quantum computing by NMR, using a four-qubit version to add 1 and 1. More complex systems, using perhaps 10 qubits, could be had by the end of the year.
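That tiny aligned minority is set by a Boltzmann factor. The sketch below is an order-of-magnitude estimate using assumed values (proton spins in a 10-tesla magnet at room temperature, rather than the specific nuclei and field of the caffeine experiments), but it lands in the same regime as the article's one-in-a-million figure:

```python
import math

hbar = 1.055e-34      # Planck's constant over 2*pi, in J*s
k_B = 1.381e-23       # Boltzmann constant, in J/K
gamma = 2.675e8       # proton gyromagnetic ratio, in rad/(s*T) -- assumed nucleus
B, T = 10.0, 300.0    # assumed field (tesla) and temperature (kelvin)

# Fractional excess of spins aligned with the field at thermal equilibrium.
polarization = math.tanh(hbar * gamma * B / (2 * k_B * T))
print(polarization)   # a few parts in 100,000 -- only a tiny minority lines up
```

Everything the NMR computer does must be read out from this sliver of polarized spins, which is precisely why the signal problem discussed next becomes so severe.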
The drawback is readout. With no way to detect individual spins, researchers must measure all the molecules’ spins—both qubit and nonqubit ones. Complex molecules capable of sustaining many spins are therefore “noisier” than simpler ones. “They’ll be able to do some nice stuff,” Monroe says, “but beyond about 10 bits, they’ll run into fundamental problems.” The output from 10 bits is only about one thousandth as strong as that from a single bit; for 20 bits, the output is down by a factor of one million. So the NMR technique may never reach the 50 or so bits needed for meaningful computation.
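One common way to state that scaling: the usable signal falls off roughly as two to the power of the number of qubits, because only about a 2⁻ⁿ fraction of the thermal molecules begins in the right spin configuration. A one-line check reproduces the figures in the text:

```python
def relative_signal(n_qubits):
    """Thermal NMR signal from an n-qubit computation, relative to one qubit.

    Roughly a 2**-n fraction of molecules starts in the needed spin state.
    """
    return 2.0 ** -n_qubits

print(relative_signal(10))  # about 0.001, the thousandfold drop in the text
print(relative_signal(20))  # about one part in a million
```

At 50 qubits the factor is around 10⁻¹⁵, which is why extrapolating the thermal-liquid approach into a meaningful computational regime looks hopeless.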