The “hockey stick” graph has been both a linchpin and a target in the climate change debate. As a plot of average Northern Hemisphere temperature from two millennia ago to the present, it stays relatively flat until the 20th century, when it rises sharply, like the blade of an upturned hockey stick. Warming skeptics have long decried how the temperatures were inferred, but a new reconstruction of the past 600 years, using an entirely different method, finds similar results and may help remove lingering doubts.
The hockey stick came to life in 1998 thanks to the work of Michael Mann, now at Pennsylvania State University, and his colleagues (and many other climate scientists who subsequently refined the graph). Reconstructing historical temperatures is difficult: investigators must combine information from tree rings, coral drilling, pollen, ice cores and other natural records and then convert them to temperatures at specific times and places in the past. Such proxies for temperature can be sparse or incomplete, both geographically and through time. Mann’s method used the overlap, where it exists, of recent proxy data and instrument data (such as from thermometers) to estimate relations between them. It calculates earlier temperatures using a mathematical extrapolation technique [see “Behind the Hockey Stick,” by David Appell, Insights; Scientific American, March 2005].
Martin Tingley of Harvard University, who led the new work, calls his approach “much easier to handle and to propagate uncertainties”—that is, to calculate how the inherent limitations of the data affect the temperature calculated at any given time. The method can easily be modified to answer other questions in climate science, such as those about precipitation and drought, and can even make projections into the future given rates of buildup of carbon dioxide in the atmosphere. The paper, written with his thesis adviser, Peter Huybers, has been submitted to the Journal of Climate.
Tingley and Huybers’s new method, which Mann describes as “promising,” makes the assumption that nearby proxies can be simply related, or “chained,” either to data from nearby places or to data from the same place taken a few years before or after. For example, temperatures at neighboring places as measured in the last century seem correlated in a way that drops off approximately exponentially, with a “half-distance” (akin to the concept of half-life) of about 4,000 kilometers.
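The stated distance relation can be written down directly. The functional form and the roughly 4,000-kilometer half-distance come from the description above; the code itself is an illustrative sketch, not taken from the paper:

```python
# Sketch of the distance-correlation relation described above: correlation
# between temperatures at two sites falls off exponentially with their
# separation, halving every ~4,000 km (the "half-distance").
HALF_DISTANCE_KM = 4000.0

def correlation(distance_km: float) -> float:
    """Approximate correlation between temperature records at two sites."""
    return 0.5 ** (distance_km / HALF_DISTANCE_KM)

print(correlation(0.0))     # 1.0  (a site is perfectly correlated with itself)
print(correlation(4000.0))  # 0.5  (one half-distance away)
print(correlation(8000.0))  # 0.25 (two half-distances away)
```

The half-life analogy carries over exactly: each additional 4,000 kilometers of separation halves the correlation, just as each half-life halves a radioactive sample.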
Tingley assumes a simple, linear relation between the proxy data values and the true temperature. This relation is then determined from proxy data and (where they exist) instrument data, using a methodology known as Bayesian statistics. Huybers explains that with Bayesian descriptions, “we attempt to estimate how probable certain temperatures were in the past given the sets of observations available to us.”
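The calibrate-then-invert idea can be illustrated with a toy model. Only the linear proxy-temperature relation and the Bayesian framing come from the text; the synthetic data, the Gaussian noise and prior, and the simple conjugate update below are illustrative assumptions, far simpler than the authors' actual model:

```python
import numpy as np

# Toy sketch (not the authors' model): assume proxy = a + b * temperature +
# Gaussian noise, calibrate a and b on a period where proxy and thermometer
# records overlap, then infer a past temperature from a proxy value alone.
rng = np.random.default_rng(0)

# Synthetic "overlap" period: instrumental temperature anomalies and a proxy.
temp = rng.normal(0.0, 1.0, 200)                      # deg C anomalies
a_true, b_true, noise_sd = 0.3, 2.0, 0.5              # assumed for the demo
proxy = a_true + b_true * temp + rng.normal(0.0, noise_sd, 200)

# Calibrate the linear relation by least squares on the overlap.
b, a = np.polyfit(temp, proxy, 1)                     # slope, intercept
resid_var = np.var(proxy - (a + b * temp))

# Gaussian prior on an unobserved pre-instrumental temperature, N(0, 1).
prior_mean, prior_var = 0.0, 1.0

def posterior(p: float) -> tuple[float, float]:
    """Posterior mean and variance of temperature given proxy value p."""
    precision = 1.0 / prior_var + b**2 / resid_var
    mean = (prior_mean / prior_var + b * (p - a) / resid_var) / precision
    return mean, 1.0 / precision

mean, var = posterior(2.3)  # a proxy value from a pre-instrumental year
print(f"inferred temperature: {mean:.2f} +/- {var**0.5:.2f} deg C")
```

The posterior variance is the payoff Huybers describes: the answer is not a single past temperature but a probability for each candidate temperature given the available observations, so the uncertainty propagates automatically.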
The sheer amount of computation, however, is daunting, involving heavy matrix algebra. Initial values for proxies and temperatures (where they have a known overlap) are input, and the methodology works backward to refine the relations at other times. To determine past temperatures, Tingley typically had to manipulate about one million matrices, each consisting of 1,296 columns and 1,296 rows.
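For a sense of that scale, a minimal sketch: the 1,296-by-1,296 size is from the text, while the matrix contents here are mere placeholders:

```python
import numpy as np

# One 1,296 x 1,296 matrix of double-precision floats, the size quoted for
# the reconstruction's linear algebra, already occupies over 13 MB.
n = 1296
A = np.eye(n)                  # identity as a stand-in for a real covariance
print(A.nbytes / 1e6, "MB")    # 13.436928 MB

# Solving a single linear system of this size costs on the order of n**3,
# roughly two billion floating-point operations; repeating comparable
# manipulations about a million times is what makes the computation daunting.
x = np.linalg.solve(A, np.ones(n))
```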
Focusing on the past 600 years of proxy data between 45 and 85 degrees north latitude, Tingley’s initial results, presented at a conference earlier this year, find that the 1990s were the warmest decade of the period and that 1995 was the warmest year. (The El Niño year 1998 was the warmest year for North America and Greenland but not for northern Eurasia.) He also finds that the 20th century had the largest rate of warming of any century and that the 1600s had the largest rate of change overall (and larger than in previous reconstructions), albeit in the cooling direction thanks to the so-called Little Ice Age.