On a sunny spring day one of us (Dixon) entered the London Underground at the Mile End station on his way to Heathrow Airport. Eyeing a stranger, one of more than three million daily passengers on the Tube, he idly wondered: What is the probability the stranger would emerge at, say, Wimbledon? How could you ever figure that out, given that the person could take any number of routes? As he thought about it, he realized that the question was similar to the knotty problems that face particle physicists who seek to make predictions for particle collisions in modern experiments.
The Large Hadron Collider (LHC) at CERN near Geneva, the premier discovery machine of our age, smashes together protons traveling at nearly the speed of light to study the debris from their collisions. Building the collider and its detectors pushed technology to its limits. Interpreting what the detectors see is an equally great, if less visible, challenge. At first glance, that seems rather strange. The Standard Model of elementary particles is well established, and theorists routinely apply it to predict the outcomes of experiments. To do so, we rely on a calculational technique developed more than 60 years ago by the renowned physicist Richard Feynman. Every particle physicist learns Feynman's technique in graduate school. Every book and magazine article about particle physics for the public is based on Feynman's concepts.
Yet his technique has become obsolete for state-of-the-art problems. It provides an intuitive, approximate way to grasp the very simplest processes but is hopelessly laborious for more complicated ones or for high-precision calculations. Predicting what will emerge from a particle collision is even more daunting than predicting where a subway passenger will go. All the computers in the world working together would be unable to determine the outcome of even a fairly common collision at the LHC. If theorists cannot make precise predictions for known laws of physics and known forms of matter, what hope do we have of telling when the collider has seen something truly new? For all we know, the LHC may already have found answers to some of the greatest mysteries of nature, and we remain in the dark just because we cannot solve the equations of the Standard Model accurately enough.
In recent years the three of us and our colleagues have developed a new way of analyzing particle processes that bypasses the complexity of Feynman's technique. Called the unitarity method, it amounts to a highly economical way of predicting what a subway passenger will do by realizing that the passenger's options at each decision point are actually rather limited and can be broken down into probabilities for sequences of actions. Many theoretical problems in particle physics that were intractable have been cracked wide open by the new idea. Their solutions allow us to understand in unprecedented detail what our current theory of nature predicts so that we will know a new discovery when we see it. The method has also produced a wealth of results for an idealized cousin of the Standard Model that is of special interest to physicists as a stepping-stone to the ultimate theory of nature.
The unitarity approach is more than a helpful calculational trick. It suggests a radical new vision of theories of particle interactions that are governed by unexpected symmetries, reflecting an underappreciated elegance of the Standard Model. Notably, it has revealed a strange twist in the decades-old effort to unite quantum theory and Einstein's general theory of relativity into a quantum theory of gravity. Up until the 1970s, physicists assumed that gravity behaves like the other forces of nature and sought to extend our existing theories to cover it. When they applied Feynman's technique, however, they either got nonsensical results or were stymied by the math. Gravity, it seemed, was not like the other forces after all. Discouraged, physicists turned to more revolutionary ideas such as supersymmetry and, later, string theory.
The unitarity method, however, has allowed us to actually do calculations that were contemplated in the 1980s but seemed hopelessly beyond reach then. We have found that some of the supposed inconsistencies are in fact absent. Gravity does look like the other forces, albeit in an unexpected way—it behaves like a “double copy” of the strong subnuclear force that binds the constituents of nuclei together. The strong force is transmitted by particles known as gluons; gravity should be transmitted by particles known as gravitons. The new picture is that each graviton behaves like two gluons stitched together. This concept is quite strange, and even experts do not yet have a good mental image of what it means. Nevertheless, the double-copy property provides a fresh perspective on how gravity might be unified with the other known forces.
From Trees to Thickets
What made Feynman's technique so compelling and useful is that it gives a precise graphical recipe for extremely complicated calculations. It is based on diagrams that give a visual picture of two or more particles colliding or scattering off one another. At every research institution studying elementary particle physics, you can find blackboards covered by these diagrams. To make quantitative predictions, a theorist draws a set of diagrams, each representing one of the conceivable ways a collision may unfold; it is analogous to one of the possible routes an Underground rider might take. Following a set of detailed instructions that Feynman and his colleagues, notably Freeman Dyson, laid down, the theorist then assigns a number to each diagram, giving the probability the event will take place in that way.
The downside is that the number of diagrams one can draw is enormous—in principle, infinite. In the applications for which Feynman originally developed his rules, this disadvantage did not matter. He was studying quantum electrodynamics (QED), which describes how electrons interact with photons. The interaction is governed by a quantity, the coupling, approximately equal to 1/137. The smallness of the coupling ensures that complicated diagrams receive a low weighting in the calculation and can often be ignored altogether. That is like saying that an Underground rider is usually better off taking a fairly simple route.
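To see why, consider a toy weighting (illustrative numbers only, not a real QED computation): each additional loop multiplies a diagram's contribution by roughly another factor of the coupling, so elaborate diagrams fade quickly into irrelevance.

```python
# Toy illustration: each extra loop costs roughly one more power of the
# QED coupling alpha ~ 1/137, so complicated diagrams are heavily
# suppressed. (Real diagrams also carry numerical coefficients and
# logarithms that this sketch ignores.)
alpha = 1 / 137.0

for loops in range(5):
    print(f"{loops}-loop diagrams: relative weight ~ {alpha**loops:.1e}")
# 0 loops ~ 1.0e+00, 1 loop ~ 7.3e-03, 2 loops ~ 5.3e-05, ...
```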
Two decades later physicists extended Feynman's technique to the strong subnuclear force. By analogy with QED, the theory of the strong force is known as quantum chromodynamics (QCD). QCD is also governed by a coupling, but as the word “strong” suggests, its value is higher than that of the electromagnetic coupling. On the face of it, a larger coupling increases the number of complicated diagrams that theorists must consider—like an Underground rider who is willing to take very circuitous routes, making it hard to predict what he or she will do. Fortunately, at very short distances, including the distances relevant for collisions at the LHC, the coupling diminishes in value and, for the very simplest collisions, theorists can again get away with considering only uncomplicated diagrams.
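This weakening of the coupling at short distances follows a standard one-loop formula. The sketch below evaluates it with textbook inputs; for simplicity it holds the number of active quark types fixed, a refinement a full treatment would include.

```python
import math

def alpha_s(q_gev, alpha_mz=0.118, mz_gev=91.19, nf=5):
    """Standard one-loop running of the strong coupling, with the
    number of active quark flavors nf held fixed for simplicity."""
    b0 = (33 - 2 * nf) / (12 * math.pi)
    return alpha_mz / (1 + b0 * alpha_mz * math.log(q_gev**2 / mz_gev**2))

for q in (5, 91.19, 1000):
    print(f"alpha_s at {q} GeV: {alpha_s(q):.3f}")
# The coupling shrinks as the collision energy (and resolution) grows.
```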
For messy collisions, though, the full complexity of the Feynman technique comes rushing in. Feynman diagrams are classified by the number of external lines and closed loops they have. Loops represent one of the quintessential features of quantum theory: virtual particles. Though not directly observable, virtual particles have a measurable effect on the strength of forces. They obey all the usual laws of nature, such as the conservation of energy and of momentum, with one caveat: their mass can differ from that of the corresponding “real” (that is, directly observed) particles. Loops represent their ephemeral life cycle: they pop into existence, move a short distance, then vanish again. Their mass determines their life expectancy: the heavier they are, the shorter they live.
The simplest Feynman diagrams ignore virtual particles; they have no closed loops and are called tree diagrams. In quantum electrodynamics, the simplest diagram of all shows two electrons repelling each other by exchanging a photon. Progressively more complicated diagrams add loops one by one. Physicists refer to this additive procedure as “perturbative,” meaning that we start with some approximate estimate (represented by the tree diagrams) and gradually perturb it by adding refinements (the loops). For instance, as the photon travels between the two electrons, it can spontaneously split into a virtual electron and virtual antielectron, which live a short while before annihilating each other to produce a new photon, which resumes the journey the original one had been taking. In the next level of complexity, the electron and antielectron might themselves split temporarily. With increasing numbers of virtual particles, the diagrams describe quantum effects with increasing precision.
Even tree diagrams can be challenging. In the case of QCD, if you were brave enough to consider a collision involving two incoming and eight outgoing gluons, you would need to write down 10 million tree diagrams and calculate a probability for each. An approach called recursion, pioneered in the 1980s by Frits Berends of Leiden University in the Netherlands and Walter Giele, now at Fermilab, tamed the problem for tree diagrams but had no obvious extension to loops. Worse, closed loops make the workload overwhelming. Even a single loop causes an explosion in both the number of diagrams and the complexity of each. The mathematical formulas could fill an encyclopedia. Brute force—harnessing the power of an ever growing number of computers—can fight off the tide of complexity for a while but soon succumbs to increasing numbers of external particles or loops.
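The factorial-style growth behind that number is easy to reproduce in a toy count. If every interaction vertex joins exactly three lines, the number of distinct tree diagrams with n external particles is the double factorial (2n - 5)!!; real QCD, with its four-gluon vertex and color bookkeeping, is more complicated still, which is how the ten-gluon case reaches the millions.

```python
from math import prod

def cubic_tree_count(n_external):
    """Distinct tree diagrams with n_external legs when every vertex
    joins exactly three lines: the double factorial (2n - 5)!!."""
    return prod(range(2 * n_external - 5, 0, -2))

for n in range(4, 11):
    print(f"{n} external particles: {cubic_tree_count(n):,} tree diagrams")
# 4 -> 3, 5 -> 15, 6 -> 105, ..., 10 -> 2,027,025
```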
Even worse, what started as a concrete way to visualize the microscopic world can cloak it in obscurity. Individual Feynman diagrams are often impenetrably baroque, and when we have to juggle so many of them, we lose track of the essential physics. What is astounding is that the final result, found by summing up all the diagrams, can be quite simple. Different diagrams partially negate one another, and sometimes formulas with millions of terms collapse down to a single term. These cancellations suggest that the diagrams are the wrong tools for the job—like trying to pound in a nail with a feather. There must be a better way.
Beyond Feynman Diagrams
Over the years physicists tried out many new techniques to do calculations, each slightly better than the one before it, and gradually the outlines of an alternative to Feynman diagrams took shape. Our own involvement began in the early 1990s, when two of us (Bern and Kosower) showed how string theory could simplify QCD calculations by summarizing all the relevant Feynman diagrams with a single formula. With this formula, the three of us analyzed a particle reaction that had never been understood in detail before: the scattering of two gluons into three gluons, with one virtual-particle loop. This process was very complicated by the standards of the time but could be fully described by an amazingly simple formula, which fit on a single page.
The formula was so simple that, together with David Dunbar, who was at the University of California, Los Angeles, at the time, we found we could understand the scattering almost entirely in terms of a principle called unitarity. Unitarity is the requirement that the probabilities of all possible outcomes add up to 100 percent. (Technically the quantities are not probabilities but square roots of probabilities, but this distinction is not so important here.) Unitarity is implicit in Feynman's technique but tends to be hidden by the complexity of the calculations. So we developed an alternative technique that put it front and center. The idea of basing calculations on unitarity had come up in the 1960s but fell out of favor. As repeatedly happens in science, discarded ideas can come roaring back in a new guise.
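In the mathematics of quantum theory, the statement is precise: the amplitudes for all processes assemble into a unitary matrix, and the outcome probabilities from any starting state, the squared magnitudes of the corresponding column, sum to exactly 1. A few lines of Python (a generic linear-algebra check, not a scattering calculation) make the point:

```python
import numpy as np

# Unitarity in miniature: for any unitary matrix S, the outcome
# probabilities |S_ij|^2 from each initial state j sum to exactly 1.
rng = np.random.default_rng(0)

# The QR decomposition of a random complex matrix yields a unitary Q.
q, _ = np.linalg.qr(rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4)))

# Entries of Q play the role of amplitudes ("square roots of probabilities").
probabilities = np.abs(q) ** 2
print(probabilities.sum(axis=0))  # -> [1. 1. 1. 1.] up to rounding
```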
The key to the success of the unitarity method is that it avoids the direct use of virtual particles, which are the prime reason that Feynman diagrams get so complicated. Such particles have both real and spurious effects. By definition, the latter have to cancel out of the final result, so they are excess mathematical baggage that physicists are happy to leave behind.
The method can be understood by analogy to a complicated subway system like the London Underground, with multiple paths between any two subway stations. Suppose we want to know the probability that a person entering at Mile End leaves at Wimbledon. The Feynman technique adds up the probabilities of all conceivable paths. All really means all: besides the paths through corridors and tunnels, Feynman diagrams include paths through solid rock where there are no subway lines or walkways. Those unrealistic paths are the analogues of the spurious contributions from virtual-particle loops. They cancel out in the end, but in the intermediate stages of the calculation, we need to keep track of them all.
In the unitarity approach, we consider only paths that make sense. We calculate the probability that a person takes a particular route by subdividing the problem: What is the probability the person goes through a particular turnstile, one pathway or another one, at each step of the journey? This procedure greatly cuts down the size of calculations.
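In code, this picture is nothing more than a chain of conditional probabilities. Here is a minimal sketch with an invented four-station network; the stations and transition probabilities are placeholders, not real Underground data.

```python
import numpy as np

# Each row gives the probabilities of a rider's next move from one station;
# Wimbledon and Heathrow end the journey (the rider stays put).
stations = ["Mile End", "Bank", "Wimbledon", "Heathrow"]
P = np.array([[0.0, 1.0, 0.0, 0.0],   # Mile End: always ride to Bank
              [0.2, 0.0, 0.5, 0.3],   # Bank: backtrack or head onward
              [0.0, 0.0, 1.0, 0.0],   # Wimbledon: journey over
              [0.0, 0.0, 0.0, 1.0]])  # Heathrow: journey over

# Iterating the chain accumulates the probability of every sensible route,
# step by step, without ever tracking a path through solid rock.
dist = np.zeros(len(stations))
dist[0] = 1.0                          # the rider starts at Mile End
for _ in range(200):
    dist = dist @ P
print(dict(zip(stations, dist.round(3))))
# -> the rider ends at Wimbledon with probability 0.625
```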
The choice between the Feynman and unitarity methods is not a matter of right and wrong. Both express the same basic physical principles. Both would eventually lead to the same numerical probabilities. But they represent different levels of description. A single Feynman diagram, out of the tens of thousands for a messy collision, is like a single molecule within a droplet of fluid. In principle, you can determine what a fluid will do by tracking all the individual molecules, but that makes sense only for a microscopically small droplet. It is not only cumbersome but also unenlightening. The fluid could be cascading down a hill, but you would scarcely know that from the molecular description. You are better off considering higher-level properties such as fluid velocity, density and pressure. Likewise, instead of viewing a particle collision as built up one by one from individual Feynman diagrams, physicists can think of it holistically. We concentrate on the properties that govern the process as a whole—unitarity, as well as special symmetries to which the unitarity method gives prominence. In special cases, we can make theoretical predictions with perfect precision, which would take an infinite amount of time with Feynman diagrams.
The advantages go even further. After we developed the unitarity method for virtual-particle loops, another team, then at the Institute for Advanced Study in Princeton, N.J.—Ruth Britto, Freddy Cachazo, Bo Feng and Edward Witten—added a complementary twist. The theorists thought about tree diagrams again and computed the probability of a collision involving, say, five particles in terms of the probability of a collision of four particles, followed by one particle splitting into two. That is a stunning conclusion because a five-particle collision usually looks very different from these two sequential collisions. In more ways than one, we can subdivide daunting particle problems into simpler pieces.
Smashing Watches
Proton collisions at the LHC are exceptionally complex. Feynman himself once compared such collisions to figuring out how Swiss watches work by smashing them together, and his technique struggles to track what goes on during these encounters. Protons are not elementary particles but little balls of quarks and gluons bound together by the strong subnuclear force. When they slam together, quarks can bounce off quarks, quarks off gluons, gluons off gluons. Quarks and gluons can split into still more quarks and gluons. They ultimately clump together in composite particles that shoot out of the collider in narrow sprays physicists call jets.
Somewhere buried in that mess may be things that human beings have never seen before: new particles, new symmetries, maybe even new dimensions of spacetime. But sifting them out will be tough. To our instruments, exotic particles look rather like ordinary ones. The differences are small and easily missed. With the unitarity method, we can describe ordinary physics so precisely that extraordinary physics will stand out.
For instance, Joe Incandela of U.C. Santa Barbara, who is currently the spokesperson for the 2,000-plus-physicist CMS experiment at the LHC, came to us with a question about his team's search for exotic particles that make up cosmic dark matter, the mysterious stuff that astronomers think is out there but that physicists have yet to identify. Any such particles the LHC produces would elude the CMS detector, leaving the impression that some energy had gone missing. Unfortunately, an apparent loss of energy does not, in itself, mean that the LHC has synthesized dark matter. For instance, the LHC frequently produces an ordinary particle called the Z boson, and one fifth of the time it decays into two neutrinos, which also interact very weakly and escape the detector without registering. How could one predict the number of Standard Model particles whose effects mimic dark particles?
Incandela's group proposed a solution: take the number of photons the CMS detector records, extrapolate to the number of events involving neutrinos and see whether they fully explain the apparent energy loss. If not, the LHC might be creating dark matter. This idea typifies the indirect estimates experimental physicists are always having to make because they lack the ability to observe certain types of particles directly. But to pull it off, Incandela's group needed to know precisely how the number of photons related to the number of neutrinos. Unless the precision was high enough, the backdoor strategy would fail. With several colleagues, we studied the problem using the new theoretical tools and were able to assure Incandela that the precision was sufficiently high.
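The arithmetic of such an extrapolation is simple; the hard part is the ratio between the two kinds of events, which theory must supply to high precision. A schematic with made-up numbers:

```python
# All numbers here are hypothetical placeholders. In practice the ratio R
# of neutrino-type to photon-type events is exactly the kind of quantity
# the unitarity method can compute precisely.
n_photon_events = 12000       # photon events recorded by the detector
R, dR = 0.32, 0.01            # theoretical ratio and its uncertainty

expected_invisible = n_photon_events * R
spread = n_photon_events * dR
print(f"missing-energy events expected from known physics: "
      f"{expected_invisible:.0f} +/- {spread:.0f}")
# A significant excess over this prediction could signal dark matter.
```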
This success has inspired us to push forward with more ambitious calculations. As is common in modern particle physics, we work with collaborators worldwide, including Fernando Febres Cordero of Simón Bolívar University in Caracas, Venezuela, Harald Ita, now at the University of Freiburg in Germany, Daniel Maître of Durham University in England, Stefan Höche of SLAC and Kemal Ozeren of U.C.L.A. Together we made precise predictions for the probability that LHC collisions would produce a pair of neutrinos, along with four jets. Using Feynman diagrams, these calculations would have been too imposing even for a large team of physicists working hard for a decade assisted by state-of-the-art computers. The unitarity method let us do them in less than a year. To our delight, another LHC team, the ATLAS collaboration, has already compared our predictions with its data, and the results are in excellent agreement. Going forward, experimenters will use these results to search for new physics.
The unitarity method has also aided the search for the recently discovered Higgs-like particle. A sign of the Higgs is the production of a single electron, a pair of jets and a neutrino, the neutrino again leaving the impression that energy has gone missing. The same outcome can also arise from particle reactions not involving the Higgs. One of our first uses of the unitarity method was to calculate the precise probability of these confounding reactions.
Back to Gravity
An even more impressive use of the unitarity method is to study quantum gravity. For physicists to develop a fully consistent theory of nature, we must find a way to fit gravity into a quantum-mechanical framework. If gravity behaves like the other forces of nature, it should be transmitted by graviton particles. Gravitons would collide and scatter just as other particles do, and we could draw Feynman diagrams for them. Yet attempts in the mid-1980s to describe graviton scattering by quantizing Einstein's theory in the simplest possible way led to nonsensical predictions, such as infinite values for quantities that should clearly be finite. Infinite quantities, per se, are not the problem. They can arise at intermediate stages of calculations even in a well-behaved theory such as the Standard Model, but they should cancel out of any quantity that is potentially measurable. For gravity, no such cancellations appeared to occur. In concrete terms, this failure means that the quantum fluctuations of space and time, dubbed “spacetime foam” by the late quantum gravity pioneer John Wheeler, spiral out of control.
One possible explanation is that nature contains undiscovered particles that rein in these quantum effects. This idea, embodied in so-called supergravity theories, was studied intensively during the 1970s and early 1980s. But the excitement died down when indirect arguments suggested that nonsensical infinities would still arise with three or more virtual-particle loops. It seemed that supergravity was doomed to failure.
This disappointment led many to pursue string theory. String theory is a radical departure from the Standard Model. According to it, particles such as quarks, gluons and gravitons are no longer tiny points but oscillations of one-dimensional strings. Particle interactions are spread out over the strings rather than concentrated at a single point, preventing infinities automatically. On the other hand, string theory has encountered its own troubles; for instance, it does not make definitive theoretical predictions for observable phenomena.
Double Trouble
In the mid-1990s Stephen Hawking of the University of Cambridge advocated giving supergravity theories another look. He pointed out that the 1980s-era studies had taken shortcuts that made their conclusions questionable. But Hawking was unable to convince anyone else, because there was a good reason that people had taken those shortcuts: the full calculations were hopelessly beyond the capacities of even the most brilliant math whiz. To know for sure whether a Feynman diagram with three virtual-graviton loops produces infinite quantities, we would need to evaluate some 10^20 terms.
The unitarity method has completely changed the situation. Using it, we have conducted a physics version of the Innocence Project and reopened the case against supergravity theory. What would have taken the Feynman technique some 10^20 terms, the unitarity method reduced to a calculation we could actually complete, and the verdict was clear: the infinities that were supposed to doom supergravity at three loops never appear.
Even more remarkably, three gravitons interact just like two copies of three interacting gluons. This double-copy property appears to persist no matter how many particles are scattering or how many virtual-particle loops are involved. It means that, figuratively speaking, gravity is the square of the strong subnuclear interaction. It will take us a while to translate the mathematics into physical insight and check whether it is true under all conditions. For now the crucial point is that gravity may not be so different from the other forces of nature.
As is common in science, after each debate is settled, another erupts. Immediately after our calculations for three loops, skeptics wondered whether trouble might appear at four loops. As also happens frequently, bottles of wine were bet on the outcome of the calculation: an Italian Barolo against a Napa Valley Chardonnay. When we did the calculation, we found no hints of difficulties, settling at least this debate (and popping a Barolo cork).
Is supergravity theory completely free of infinities? Or does its high degree of symmetry merely curb some of its excesses at a small number of loops? In the latter case, trouble should creep in at five loops; by seven loops, quantum effects should grow strong enough to produce infinities. David Gross of U.C. Santa Barbara has put up a bottle of California Zinfandel, to be handed over should no seven-loop infinities arise. To settle this latest bet, some of us have embarked on new calculations. An absence of seven-loop infinities would astound the skeptics and might finally persuade them that supergravity could be self-consistent. Even if it is, though, the theory does not capture other kinds of effects, called nonperturbative, that are too tiny to be seen in the loop-by-loop approach we have been following. Those effects may still require an even deeper theory to handle, perhaps string theory.
Physicists often like to think of new theories as emerging from the bold brushstrokes of new principles—relativity, quantum mechanics, symmetry. But sometimes those theories emerge from a careful reexamination of the principles we already know. The quiet revolution in our understanding of particle collisions has enabled us to work out the consequences of the Standard Model in remarkable detail, leading to significant improvements in our potential to discover physics beyond it. Even more surprisingly, it is letting us follow the unexplored implications of old physics, including a once neglected path toward unifying gravity with the other known forces. In many ways, the journey to understanding the secrets of how elementary particles scatter has not been like riding the predictable London Underground at all but more like a journey on the Knight Bus of Harry Potter tales, where you never quite know what will happen next.