Flawlessly accounting for the behavior of matter on scales from the subatomic to the astronomical, quantum mechanics is the most successful theory in all the physical sciences. It is also the weirdest.
In the quantum realm, particles seem to be in two places at once, information appears to travel faster than the speed of light, and cats can be dead and alive at the same time. Physicists have grappled with the quantum world's apparent paradoxes for nine decades, with little to show for their struggles. Unlike evolution and cosmology, whose truths have been incorporated into the general intellectual landscape, quantum theory is still considered (even by many physicists) to be a bizarre anomaly, a powerful recipe book for building gadgets but good for little else. The deep confusion about the meaning of quantum theory will continue to add fuel to the perception that the profound things it is so urgently trying to tell us about our world are irrelevant to everyday life and too weird to matter.
In 2001 a team of researchers began to develop a model that either eliminates the quantum paradoxes or puts them in a less troubling form. The model, known as Quantum Bayesianism, or QBism for short, reimagines the entity that lies at the heart of quantum weirdness—the wave function.
In the conventional view of quantum theory, all the properties of any isolated system, such as an atom, are encapsulated mathematically by the system's wave function. If you want to predict how likely it is that an electron in that atom will appear at a certain spot, for example, you can calculate that probability from its wave function. This mathematical construct is a tremendously useful tool for both theoretical and experimental physics. But problems arise when physicists assume that a wave function is real.
QBism, which combines quantum theory with probability theory, maintains that the wave function has no objective reality. Instead QBism portrays the wave function as a mathematical guidebook. An observer can use it to anticipate how things behave in the quantum world.
Specifically, the observer employs the wave function to assign his or her belief that properties of a quantum system will have particular values, realizing that one's own actions affect the system and change those properties in inherently uncertain ways. Another observer, using a wave function that describes the world as that person sees it, may come to a completely different conclusion about the same quantum system. One system—one event—can have as many different wave functions as there are observers. After they have communicated with one another and modified their private wave functions to account for the newly acquired knowledge, a coherent worldview emerges.
Seen this way, the wave function “may well be the most powerful abstraction we have ever found,” says theoretical physicist N. David Mermin of Cornell University, a recent convert to QBism.
The Unreal Quantum
The notion that the wave function isn't real dates back to the 1930s and the writings of Niels Bohr, one of the founding fathers of quantum mechanics. He considered it part of quantum theory's “purely symbolic” formalism—a computational tool, no more. QBism is the first model to give mathematical backbone to Bohr's assertion. It melds quantum theory with Bayesian statistics, a 200-year-old discipline that defines “probability” as something like “subjective belief.” Bayesian statistics also gives formal mathematical rules for how to update one's subjective beliefs in light of new information. When the wave function is interpreted as a subjective belief, subject to revision by the rules of Bayesian statistics, the mysterious paradoxes of quantum mechanics vanish, QBism's proponents say.
Consider again an electron in an atom. We set up an experiment to detect the particle, and we find it in one particular location. But as soon as we stop looking, the electron's wave function spreads out. That seems to imply that the electron could be in many different places at once. Yet whenever we detect the particle again, we always find it occupying just one position. According to the standard way of thinking, the act of observation causes the wave function to instantaneously “collapse,” snapping the electron into a particular location.
Because the collapse happens everywhere at exactly the same time, it seems to violate the principle of locality—the idea that any change in an object must be caused by another object in its immediate surroundings. This, in turn, leads to some of the puzzles that Albert Einstein called “spooky action at a distance.”
From the very birth of quantum mechanics, physicists saw the collapse of the wave function as a paradoxical and deeply disturbing feature of the theory. Its uneasy mysteries pushed physicists to develop alternative versions of quantum mechanics, with mixed success.
Yet QBism says that there is no paradox. The wave function's collapse is just an observer suddenly and discontinuously revising probability assignments based on new information, in the same way that a doctor would revise a cancer patient's prognosis based on a new CT scan. The quantum system hasn't undergone some strange and inexplicable change; the change is in the wave function, which is chosen by the observer to encapsulate the person's expectations.
We can apply this way of thinking to the famous paradox of Schrödinger's cat. Quantum physicist Erwin Schrödinger imagined a sealed box with a live cat, a vial of poison and a radioactive atom. The atom has a 50–50 chance of decaying within an hour, according to the rules of quantum mechanics. If the atom decays, a hammer will smash the vial and release the poison, killing the cat. If it doesn't, the cat lives.
Now run the experiment—but don't look inside the box. After an hour has gone by, traditional quantum theory would hold that the atom's wave function is in a superposition of two states: decayed and not decayed. But because you haven't yet observed what is inside the box, the superposition extends further. The hammer is also in a superposition, as is the vial of poison. And most grotesquely, the standard quantum-mechanical formalism implies that the cat is in a superposition—it is both alive and dead at the same time.
By insisting that the wave function is a subjective property of the observer, rather than an objective property of the cat in the box, QBism eliminates the puzzle. Common sense says that of course the cat is either alive or dead (and not both). Sure, the wave function of this system represents a superposition of “alive” and “dead,” but a wave function is just a description of the observer's beliefs. Asserting that the cat is truly both alive and dead is akin to a baseball fan saying that the Yankees are stuck in a superposition of both “won” and “lost” until the person sees the box score. It's an absurdity, a megalomaniac's delusion that one's personal state of mind makes the world come into being.
The hope is that by removing the paradoxes, QBism will help physicists home in on the truly fundamental features of quantum theory—whatever they turn out to be—and “prevent them from wasting their time asking silly questions about illusory puzzles,” Mermin says.
QBism was born in a short paper, a preprint posted in November 2001 under the title “Quantum Probabilities as Bayesian Probabilities,” by Carlton M. Caves of the University of New Mexico, Christopher A. Fuchs, then at Bell Labs in Murray Hill, N.J., and Rüdiger Schack of Royal Holloway, University of London. All three are experienced quantum information theorists, and their respective affiliations with a physics department, an industrial laboratory and a department of mathematics illustrate the interdisciplinary nature of their field.
Since then, Fuchs has moved to the University of Massachusetts Boston and assumed the role of QBism's chief spokesperson. He is a compact Texan with a cheerful disposition. A sandy-colored cowlick at his hairline hints at his irrepressible, irreverent sense of humor. Colleagues are not surprised when he opens an article with the words “In this paper, I try to cause some good-natured trouble.”
The core of Fuchs's style is the conviction that science is quintessentially a communal activity and that profound insight is won only through vigorous intellectual combat. He is a whirlwind of activity, lugging his laptop around the world in a beat-up backpack, organizing conferences, chairing scientific sessions and giving lectures at universities.
In this spirit, Fuchs has pioneered a new form of literature. In 2011 Cambridge University Press published his e-mail correspondence with scientists around the world in a 600-page tome entitled Coming of Age with Quantum Information. As it chronicles the birth pangs of QBism, it offers a glimpse of how theoretical physics is created by real-life, warm-blooded human beings, not the two-dimensional creatures of Wikipedia. The book also documents Fuchs's conviction, contrary to most scientists, that philosophy matters, not only in the way in which it influences physics but also in the manner in which it is informed by the profound insights of physics—or should be.
Fuchs's openness to philosophical concerns becomes clear when you consider how QBism forces us to reconsider what is meant by probability. Probability is like “time”: we know what it is, until we are asked to define it. Sure, the 50 percent probability of throwing heads with a fair coin implies something about 100 tosses, but how does that intuition help to make sense of the proposition that “the probability of rain this evening is 60 percent” or President Barack Obama's 55/45 assessment, before the event, of the probability of success for the bin Laden operation?
Over the past three centuries two competing definitions of probability have been developed, each with countless variants. The modern, orthodox view, called frequentist probability, defines an event's probability as its relative frequency in a series of trials. This number is claimed to be objective and verifiable, as well as directly applicable to scientific experiments. The typical example is the coin toss: in a large number of throws, about half will be heads, so the probability of finding heads is approximately ½. (To avoid the vague words “large,” “about” and “approximately,” the definition is refined to require an infinite number of tosses, in which case the probability takes on its exact value of ½. Unfortunately, the value also becomes unverifiable at this point and thereby loses its claim to objectivity.) Applying this definition to weather prediction, one might count real or simulated weather patterns, but as far as President Obama's hunch is concerned, the frequentist interpretation is useless—the bin Laden mission was manifestly irreproducible.
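The frequentist definition can be made concrete with a short simulation, a minimal sketch in Python; the sample sizes and the random seed are arbitrary choices, and the simulated coin stands in for any repeatable trial rather than any particular experiment:

```python
import random

def heads_frequency(n_tosses, seed=0):
    """Estimate the probability of heads as a relative frequency."""
    rng = random.Random(seed)  # fixed seed so the run is repeatable
    heads = sum(rng.random() < 0.5 for _ in range(n_tosses))
    return heads / n_tosses

# The relative frequency merely hovers near 1/2 for any finite run;
# the exact value 1/2 is reached only in the (unverifiable) limit
# of infinitely many tosses.
for n in (10, 1000, 100_000):
    print(n, heads_frequency(n))
```

The frequentist critique described above is visible in the output: no finite run ever pins the probability to exactly ½.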
The older point of view, Bayesian probability, is named after 18th-century English clergyman Thomas Bayes, whose ideas were perfected and promulgated by French physicist Pierre-Simon Laplace. In contrast to frequentist probability, Bayesian probability is subjective, a measurement of the degree of belief that an event will occur. It is a numerical measure of how an agent would bet on the outcome of the event. In simple cases such as coin tosses, frequentist and Bayesian probabilities agree. For the prediction of the weather or of the outcome of a military action, the Bayesian, unlike the frequentist, is at liberty to combine quantitative statistical information with intuitive estimates based on previous experience.
The Bayesian interpretation easily deals with single cases, about which frequentism is silent, and avoids the pitfalls of infinity, but its real power is more specific. Under this interpretation, probability assignments are subject to change because degrees of belief are not fixed. A weather forecaster who is a frequentist would have no trouble calculating the likelihood of rain in a region whose climate has been stable and predictable for many years. But in the case of a sudden change, such as a drought, for which there is little data, a Bayesian forecaster is better equipped to fold the new information into the prediction.
Central to the theory is an explicit formula, called Bayes's rule, for calculating the effect of new information on the estimate of a probability. For example, when a patient is suspected of having cancer, the physician assigns an initial probability, called the prior, based on data such as the known incidence of the disease in the general population, the patient's family history and other relevant factors. On receiving the patient's test results, the doctor then updates this probability using Bayes's rule. The resulting number is no more and no less than the doctor's personal degree of belief.
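That update can be written out explicitly. The sketch below implements Bayes's rule in Python with invented numbers, purely for illustration: a 1 percent prior incidence, a test that detects 90 percent of true cases and falsely flags 9 percent of healthy patients.

```python
def bayes_update(prior, p_pos_given_cancer, p_pos_given_healthy):
    """Bayes's rule: posterior belief in cancer after a positive test."""
    # P(cancer | positive) = P(positive | cancer) P(cancer) / P(positive),
    # where P(positive) sums over both ways a positive result can arise.
    p_positive = (p_pos_given_cancer * prior
                  + p_pos_given_healthy * (1 - prior))
    return p_pos_given_cancer * prior / p_positive

# Hypothetical numbers: 1% prior, 90% sensitivity, 9% false-positive rate.
posterior = bayes_update(0.01, 0.90, 0.09)
print(round(posterior, 3))  # prints 0.092
```

Even after a positive test the posterior stays below 10 percent here, because the disease is rare; that number is exactly the doctor's updated degree of belief, no more and no less.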
Most physicists profess their faith in frequentist rather than Bayesian probability, simply because they have been taught to shun subjectivity. But when it comes to making a prediction, the Bayesian approach rules, says Marcus Appleby, a mathematician at the University of Sydney, who credits Fuchs with convincing him of the significance of Bayesian probability.
Appleby points out that we would consider it crazy to keep betting in a lottery after learning that the same person has won it every week for 10 years, even though a strict frequentist would argue that the results of prior draws have no bearing on future outcomes. In practice, no one would ignore the outcomes of the previous weeks. Instead the commonsense move would be to adopt the Bayesian viewpoint, update our knowledge and act according to the best available evidence.
Rewriting Quantum Rules
Although QBism denies the reality of the wave function, it is not some nihilistic theory that denies all reality, emphasizes QBism co-author Schack. The quantum system examined by an observer is indeed very real, he notes. Philosophically, Mermin says, QBism suggests a split, or boundary, between the world in which the observer lives and that person's experience of that world, the latter described by a wave function.
Mathematically, Fuchs recently made an important discovery that could help cement QBism's stake as a valid interpretation of probability and quantum theory. The finding has to do with an empirical formula, known as the Born rule, that allows experimentalists to use the wave function of a system to calculate the probability of observing a quantum event in that system. (In technical terms, the Born rule says that we can measure the likelihood of finding a quantum system having property X by taking the square of the magnitude of the wave function assigned to X.) Fuchs demonstrated that the Born rule could be rewritten almost entirely in terms of the language of probability theory, without referring to a wave function. The Born rule used to be the bridge that connected wave functions to the results of experiments; now Fuchs has shown that we can predict the results of experiments using probabilities alone.
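In symbols, the standard Born rule described above can be written schematically as follows, where ψ is the wave function the observer has assigned to the system:

```latex
% Probability of finding the system with property X:
% the squared magnitude of the amplitude the wave function assigns to X.
P(X) = \bigl|\psi(X)\bigr|^{2}
```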
For Fuchs, the new expression of the Born rule provides another hint that the wave function is just a tool that tells observers how to calculate their personal beliefs, or probabilities, about the quantum world around them. “The Born rule in these lights is an addition to Bayesian probability, not in the sense of a supplier of some kind of more-objective probabilities, but in the sense of giving extra rules to guide the agent's behavior when he interacts with the physical world,” Fuchs writes.
The simplicity of the new equation is striking. Except for one tiny detail, it resembles the law of total probability, the classical rule for computing the chance of an event by summing, over every intermediate possibility, the probability of that possibility times the probability of the event given it. (In the simplest case this amounts to the requirement that the probabilities of all possible outcomes add up to unity: for a coin flip, the probability of landing on heads, ½, plus the probability of landing on tails, ½, must equal 1.) The deviant detail—the one and only reference to quantum mechanics in this prescription for how to calculate probabilities in quantum theory—is the appearance of d, the quantum dimension of the system. Dimension in this sense does not refer to length or width but to the number of distinct states a quantum system can occupy. For instance, a single electron that can have either spin up or spin down has a quantum dimension of 2.
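One published form of Fuchs's rewritten rule, often called the “urgleichung,” makes the comparison explicit. In the sketch below, p(i) and r(j|i) are ordinary Bayesian probabilities referred to a special reference measurement with d² outcomes; notation varies across the QBist literature, so treat this as a schematic rather than a definitive statement:

```latex
% Classical law of total probability:
q(j) = \sum_{i=1}^{d^{2}} p(i)\, r(j \mid i)

% QBist rewriting of the Born rule: the same sum,
% deformed only by terms involving the quantum dimension d.
q(j) = (d+1) \sum_{i=1}^{d^{2}} p(i)\, r(j \mid i) \,-\, 1
```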
Fuchs points out that quantum dimension is an intrinsic, irreducible attribute that characterizes the “quantum nature” of a system, in the same way that the mass of an object characterizes its gravitational and inertial properties. Although d is implicit in all quantum-mechanical calculations, its explicit, prominent appearance in a fundamental equation is unprecedented. With the Born rule in its new coat, Fuchs hopes to have discovered the key to a new perspective on quantum mechanics. “I toy,” he confesses, “with the idea of [the Born rule] being the most significant ‘axiom’ of all for quantum theory.”
A New Reality
One of the criticisms of QBism is that it is unable to explain complex macroscopic phenomena in terms of more primitive microscopic ones in the way that conventional quantum mechanics does. The most direct way of meeting that challenge is for QBism to succeed in its stated aim of building the standard theory of quantum mechanics on a foundation of new, compelling assumptions.
That goal has yet to be reached, but even now QBism offers a new view of physical reality. By interpreting the wave function as personal degrees of belief, it gives precise, mathematical meaning to Bohr's intuition that “physics concerns what we can say about nature.” And proponents of QBism embrace the notion that until an experiment is performed, its outcome simply does not exist.
Before an electron is actually observed, for example, the particle exists and moves, but it does not have a speed or a position. Those “properties” have meaning only following an observation; it is the act of measurement that creates them. As participants in quantum experiments, we thus become active contributors to the ongoing creation of the universe.