What bothers people about this interpretation is its conclusion that we are perpetually dividing into multiple copies, a conclusion that is not only bizarre but may carry ghastly implications.
Bohmian Mechanics
Also known as the De Broglie–Bohm interpretation or the pilot wave interpretation.
This theory postulates that a particle not only has a wave function but also exists as an actual point particle, riding along at some precise but unknown location on the wave and being guided by it. How the wave guides the particle is described by a new equation introduced to accompany the standard Schrödinger equation. The randomness of quantum measurements arises because we cannot know exactly where a particle started out. The theory was proposed by David Bohm in 1952 (a few years before Everett’s theory), extending a theory of Louis de Broglie’s from 1927.
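As a sketch of the mathematics (not given in the text): in the de Broglie–Bohm theory, the added "guiding equation" sets the particle's velocity from the local phase gradient of the wave function $\psi$,

$$\frac{d\mathbf{x}}{dt} \;=\; \frac{\hbar}{m}\,\operatorname{Im}\!\left(\frac{\nabla\psi}{\psi}\right),$$

so that if the initial position $\mathbf{x}(0)$ is unknown, the trajectory is unpredictable even though it is fully determined, which is how the theory recovers quantum randomness.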
Changing the Rules
Some theorists seek a mechanism that causes the “collapse” of the wave function from a superposition of possibilities to a single outcome. For example, Roger Penrose has proposed that gravitational effects may play this role. Other models, such as the Ghirardi-Rimini-Weber theory, introduce specific modifications to the Schrödinger equation. Because such models differ from standard quantum theory, they are in principle falsifiable by experiment (or conversely, standard theory could be falsified in their favor).
Decoherence
This is not an interpretation, but it is an important element of the modern understanding of quantum mechanics. It expands upon the kind of mathematical analysis that led Everett to his interpretation, analyzing the effect that stray quantum interactions with the surrounding environment have on a system in a superposition. The chief conclusion is that the almost unstoppable leakage of information through these channels “decoheres” a quantum superposition, making it behave more like an ordinary classical state. Decoherence explains very well why we see the classical world that we do, and clarifies what is required to keep quantum effects manifest in the lab.
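As an illustrative sketch (the simple exponential-damping model and all names here are assumptions for this example, not from the text), decoherence can be caricatured as the decay of the off-diagonal “coherence” entries of a qubit’s density matrix, while the diagonal populations survive:

```python
# Toy dephasing model: environmental interactions damp the off-diagonal
# coherences of a 2x2 density matrix at rate gamma, leaving populations alone.
import math

def decohere(rho, gamma, t):
    """Return the density matrix after time t under simple dephasing."""
    decay = math.exp(-gamma * t)
    return [
        [rho[0][0], rho[0][1] * decay],
        [rho[1][0] * decay, rho[1][1]],
    ]

# An equal superposition (|0> + |1>)/sqrt(2), written as a density matrix:
rho = [[0.5, 0.5], [0.5, 0.5]]

later = decohere(rho, gamma=1.0, t=5.0)
# The populations remain 0.5 each, but the coherences are nearly zero,
# so the state now behaves like a classical 50/50 mixture.
```

The rapid loss of the off-diagonal terms is why large, warm systems look classical: the bigger the system, the more environmental channels carry information away, and the faster `decay` goes to zero.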
Copenhagenists can point to decoherence as an explanation of what makes large classical systems different from small quantum systems (in general, large systems decohere much more readily and rapidly than tiny ones). Everettians can point to it as a more complete explanation of how the parallel branches form and become independent. But best of all, decoherence can be studied experimentally, and a very active area of quantum research is confirming it and exploring it in ever greater detail.
Consistent Histories
This scheme analyzes sequences of states of a system (which may be as large as the whole universe) to find which questions about it can be consistently answered, such as “Was the particle at A or B at time T?” The measurement problem, however, is not resolved: which histories actually happen remains a matter of probabilities, just as in the standard Copenhagenist approach.
Is it Real?
In some respects the decision between a Copenhagenist and an Everettian viewpoint boils down to a basic question: Is the wave function real or is it just information? If it is “real”—in some sense the universe really consists of quantum waves propagating around—then one tends to be driven to an Everettian viewpoint; the “collapses” that wave functions must undergo to produce the one reality that we see are too problematic. But if the wave function is just information, for example, a representation of what an experimenter knows about a system, then that “collapse” is completely natural. Imagine the standard classical scenario of flipping a coin. Before you look at it, your knowledge of its state is “50% chance of heads, 50% chance of tails.” When you look, your knowledge instantaneously changes to, say, “100% heads, 0% tails.”
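The coin analogy can be sketched in a few lines (a toy illustration; the names are invented for this example). Observing the coin changes our knowledge instantaneously, but nothing physical happens to the coin itself, which is the information-based reading of “collapse”:

```python
# "Collapse" as an information update for a classical coin.
import random

def look(coin_state):
    """Observing the coin updates our knowledge, not the coin."""
    return {"heads": 1.0 if coin_state == "heads" else 0.0,
            "tails": 1.0 if coin_state == "tails" else 0.0}

knowledge = {"heads": 0.5, "tails": 0.5}  # before looking: 50/50
coin = random.choice(["heads", "tails"])  # the coin already has a value
knowledge = look(coin)                    # instantaneous "collapse" to certainty
```

On this view, the quantum “collapse” is no more mysterious than this update; the deep disagreement is over whether the wave function is like the coin (real) or like `knowledge` (information).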