Even as the Obama administration moves ahead with modest plans to tackle global warming, the public relations battle on the issue is as fierce as ever. Some recent scientific stumbles haven't helped. In fact, they have given fodder to climate change skeptics, some of whom have seized on the errors to attack the credibility of scientists and sway public opinion.
Many scientific organizations, such as the NASA Goddard Institute for Space Studies, now put data (some near real-time) on their Web sites. The information ranges from raw numbers from weather stations to computed values of, for instance, monthly global temperature anomalies, which represent temperature deviations from a historical average. Typically researchers make corrections and adjustments as they check equipment and replicate experiments.
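The anomaly arithmetic described above is simple to illustrate. The sketch below uses hypothetical, made-up temperature values (not GISS data) just to show that an anomaly is the observed value minus a long-term baseline average for the same month:

```python
# Illustrative sketch with hypothetical numbers: a temperature anomaly
# is the observed value minus the long-term ("baseline") average for
# the same month.

# Hypothetical July mean temperatures (deg C) over a baseline period
baseline_julys = [15.2, 15.0, 15.4, 15.1, 15.3]
baseline = sum(baseline_julys) / len(baseline_julys)  # long-term July mean

observed_july = 15.9                 # a later July's observed temperature
anomaly = observed_july - baseline   # deviation from the historical average

print(f"Anomaly: {anomaly:+.2f} deg C")
```

Agencies such as GISS report anomalies rather than absolute temperatures partly because deviations from a local baseline can be compared and averaged across stations more meaningfully than raw readings can.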
In today’s politically charged environment, though, these routine corrections have become ammunition in the warming war. For example, last November Internet users found that raw data erroneously replicated from Russian weather stations contributed to a suspiciously high temperature anomaly that Goddard published. Two years ago the blog Climate Audit, run by amateur scientist and self-described “science auditor” Steve McIntyre, found that an error in a computer algorithm had ranked 1998 as the warmest U.S. year, instead of the correct 1934. (The change did not significantly affect global values: 1998 was still the earth’s warmest year as ranked by satellites, although Goddard has 2005 as slightly warmer.)
But perhaps the mistake that got the most publicity for skeptics happened in February, when an automated system of the U.S. National Snow and Ice Data Center (NSIDC) published information on the extent of Arctic sea ice. It contained a small but strange hitch indicating that enough ice to cover California was suddenly gone. Internet readers pounced, sending e-mails to the center and also to skeptical bloggers such as meteorologist Anthony Watts. His blog, Watts Up With That?, is read daily by about 21,000 people around the world (according to Quantcast, which compiles Web site statistics), and Watts’s post about the error mushroomed across the Web. Within hours the NSIDC withdrew the data, ultimately finding that the glitch resulted from a faulty sensor on a satellite. The NSIDC scientists admitted the mistake, corrected the problem using a different sensor and audited all past data.
But the public-relations damage was done. Skeptical bloggers and their readers called the NSIDC’s competence into question and accused it of tweaking data. The NSIDC sent out a press release pointing out that real-time data are always less reliable than thoroughly reviewed archived data.
Word of the otherwise prosaic issue spread via news reports, and the NSIDC took its lumps. “We were too naive,” admits Walt Meier, a researcher at the center. “We weren’t prepared for how closely people were watching.” The science community knows that such adjustments happen all the time, he says, but “the undermining of public confidence in our data comes from ignorance of use.” But he still believes that open-source data are “ultimately a great thing.”
Marc Morano, executive director of the dissenting site Climate Depot, says, “I think the fluctuations and errors of a few data sets are important.” But drawing attention to these errors, he argues, is not the main reason skeptics are gaining ground.
Rather he believes that a “lack of warming in recent years” has helped his cause—although this decade is the hottest in recorded history, there hasn’t been a record-breaking year in 10 years. Moreover, recent papers suggest that natural climate fluctuations might continue to mask the expected warming trend for up to three decades. He also notes the “sheer number of scientists speaking out to dissent for the first time,” although a University of Illinois survey of some 3,000 scientists, published in January, found that 97 percent of them think humans play a role in climate change.