“Some say the world will end in fire, some say in ice,” the poet Robert Frost mused in 1920. Frost famously held “with those who favor fire,” and that poetic view surprisingly coincides with the mainstream scientific consensus about the end of the world: in some seven billion to eight billion years the sun will evolve into a red giant star that will scorch and perhaps even engulf Earth.

Yet when that happens, Earth will already have been dead for billions of years, and will more resemble present-day Venus. As the sun slowly brightens over time on its path to becoming a red giant, it will eventually cross a critical threshold at which its luminosity surpasses our planet’s ability to dissipate absorbed radiation out into space. At that point, somewhere between one billion and three billion years from now, Earth’s surface temperature will steadily rise until the boiling oceans throw a thick blanket of steamy water vapor around the planet. All that water vapor, itself a potent greenhouse gas, will raise temperatures higher still to cook another greenhouse gas, carbon dioxide, out of Earth’s rocks. The end result will be a “runaway greenhouse” in which the planet loses its water to space and bakes beneath a crushing atmosphere of almost pure carbon dioxide.
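That threshold logic can be sketched with a toy, zero-dimensional energy balance. This is an illustrative sketch, not the researchers' actual model: the outgoing-radiation cap of 282 watts per square meter is an assumed round figure for the radiation limit of a steamy, water-vapor-rich atmosphere, and the planet is treated as a single point. A stable climate requires some surface temperature at which outgoing radiation balances absorbed sunlight; once absorption exceeds the cap, no such temperature exists and the surface warms without bound.

```python
# Toy zero-dimensional energy-balance sketch of a runaway greenhouse.
# Illustrative assumptions, not values from any particular study: outgoing
# longwave radiation (OLR) follows black-body emission for a dry planet but
# saturates near 282 W/m^2 once a steam atmosphere forms, a commonly quoted
# estimate of the radiation limit of a water-vapor-rich atmosphere.

SIGMA = 5.67e-8      # Stefan-Boltzmann constant, W m^-2 K^-4
OLR_LIMIT = 282.0    # assumed radiation limit of a steamy atmosphere, W/m^2

def olr(t_surface):
    """Outgoing radiation: black-body emission, capped at the moist limit."""
    return min(SIGMA * t_surface**4, OLR_LIMIT)

def equilibrium_exists(absorbed_flux, t_max=1000):
    """A stable climate needs some surface temperature (in kelvins) where
    outgoing radiation can balance the sunlight the planet absorbs."""
    return any(olr(t) >= absorbed_flux for t in range(1, t_max))

# Present-day Earth absorbs roughly 240 W/m^2 of sunlight: below the
# limit, so a balance point exists and the climate is stable.
print(equilibrium_exists(240.0))   # True

# Push absorption past the limit and no balance point remains: runaway.
print(equilibrium_exists(300.0))   # False
```

The essential point of the sketch is that the danger is a hard ceiling on how much heat a moist atmosphere can shed, not simply a warmer equilibrium; the debate described below is over exactly where that ceiling sits.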

Earlier this year, for the first time in human history, atmospheric carbon dioxide reached 400 parts per million (ppm), surpassing a preindustrial average of about 280 ppm that has prevailed with slight variations for the past several million years. Pessimistic projections from the United Nations Intergovernmental Panel on Climate Change forecast atmospheric carbon dioxide levels soaring beyond 1,000 ppm later this century. As the world warms not from a brightening sun but from fossil fuel–burning humans, some scientists have wondered just how close our planet might be to tumbling into a runaway state. Studies in the 1980s and ’90s suggested the present-day Earth was safe against a runaway, but a paper published this week in Nature Geoscience argues that “the runaway greenhouse may be much easier to initiate than previously thought.” Indeed, the study suggests that without the cooling effects of certain types of clouds, modern Earth would already be well on its way to broiling like Venus. (Scientific American is part of Nature Publishing Group.)

According to the study’s lead author, Colin Goldblatt of the University of Victoria in British Columbia, the disturbing result hinges less on carbon dioxide and more on humble water vapor, which recent investigations have shown absorbs solar radiation more efficiently than previously believed. “The old answer was that a runaway on Earth right now was theoretically impossible,” Goldblatt says. “Even if you evaporated a big chunk of ocean it would just rain back out, because the water vapor would radiate away more thermal energy than it absorbed through sunlight. Our new calculations show that a water vapor–rich atmosphere absorbs more sunlight and lets out less heat than previously thought, enough to put the Earth into a runaway from which there would be no return.”

The upside of the new study is that even though a climate runaway may be possible in theory, it remains very difficult to cause in practice through human greenhouse gas emissions. “We’ve estimated how much carbon dioxide would be required to get this steamy atmosphere, and the answer is about 30,000 ppm of atmospheric carbon dioxide, which is actually good news in terms of anthropogenic climate change,” Goldblatt says. Thirty thousand ppm is about 10 times more carbon dioxide than most experts estimate could be released from burning all available fossil fuels, he notes, although such high values could in theory be reached by releasing large amounts of carbon dioxide from the Earth’s vast deposits of limestone and other carbonate rocks.

A cloudy outlook

Not everyone is convinced Goldblatt’s result is valid, however. James Kasting, a geoscientist at The Pennsylvania State University, suspects that even in theory an anthropogenic runaway remains out of reach of humanity. Kasting performed many of the earlier seminal studies that seemed to rule out a present-day runaway, and with his student Ramses Ramirez is currently polishing a new study that reinforces those conclusions. No matter how much carbon dioxide is pumped into the present-day Earth’s atmosphere in Kasting’s models, the resulting heating is insufficient to cause the planet to rapidly boil off its oceans. “The bottom line,” Kasting says, “is that we do not get a runaway.”

Like Goldblatt’s team, Kasting’s group studies Earth’s climate using a one-dimensional model that simulates the absorption, transmission and reflection of sunlight by a single surface-to-space strip of atmosphere. These models’ sophisticated treatment of light’s interactions with air closely reproduces the observed warming effects of carbon dioxide, water vapor and other greenhouse gases, yet they contain only the crudest approximations of Earth’s changing weather and surface. Such models are particularly poor at accounting for the complex effects of clouds, which, depending on where and how they form, can either cool or heat the planet: Thick, low-lying clouds tend to reduce temperatures by reflecting greater amounts of sunlight back to space, whereas high, thin clouds tend to warm the planet by letting sunlight pass through while trapping outgoing heat. The differences between Kasting’s and Goldblatt’s conclusions largely boil down to Kasting’s 1-D approximations of clouds providing slightly more cooling whereas Goldblatt’s provide slightly less.

Three-dimensional modeling is the only way around this impasse, yet current 3-D climate models aren’t up to the task of simulating how Earth’s clouds and weather will change within a very steamy or CO2-rich atmosphere. “Using today’s best models to address these extremes is like trying to drive up a mountain in a Honda Civic,” Goldblatt says. “A Civic can take you coast to coast on paved roads, but take it off-road and you run into problems. Today’s models are like that right now—they aren’t designed for extreme atmospheres. If you want to model the runaway greenhouse, you need the equivalent of a Humvee for your climate model that will take you to these wild places.”

Kasting’s group recently received funding from NASA to work with other teams to develop better 3-D models, and a handful of other research groups in Europe are also pursuing similar goals.

Out of the fire, into the frying pan

Outside of better models, other useful constraints on the runaway greenhouse scenario come from Earth’s long history. Measurements of 56-million-year-old sedimentary rocks have revealed an event early in the Cenozoic era called the Paleocene–Eocene Thermal Maximum (PETM), in which a millennial-scale pulse of greenhouse gases warmed the globe. The PETM pulse seems to have been roughly equivalent to what humans could release through burning all recoverable fossil fuels, and may have warmed the planet in excess of 10 degrees Celsius, but clearly no catastrophic runaway occurred, for otherwise we would not now be here. If it didn’t happen then, many researchers suggest, it won’t happen now from a similar, anthropogenic spike of greenhouse gas.

“All these geological records tell us that even with very high levels of atmospheric carbon dioxide in the past, Earth avoided runaway,” Goldblatt acknowledges. “But that doesn’t tell us how much margin of error we have today or how close things came in the past. It’s a bit like walking around on top of a foggy cliff and not knowing whether you’re a meter or a kilometer from the edge. Even simple modeling can let you work out some hard limits to help guide behavior.”

Already a wealth of modeling suggests that easily achievable amounts of global warming would push conditions far outside human safety margins long before Earth began any runaway transition to Venus. In 2010 a study from Steven Sherwood at the University of New South Wales in Sydney and Matthew Huber at Purdue University calculated that warming slightly in excess of 10 degrees C—like that of the PETM and of pessimistic scenarios for future fossil-fuel burning—could render large portions of the planet uninhabitable for many creatures. Unprotected humans and other warm-blooded mammals can overheat and die when the wet-bulb temperature, a combined measure of heat and humidity, exceeds about 35 degrees C, because their metabolisms then produce more heat than can be dissipated into the surrounding air. The latest results from Kasting’s group, which are still under review, suggest that such conditions could prevail across much of the planet if human civilization burns enough fossil fuel to quadruple atmospheric levels of carbon dioxide.
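The 35-degree figure usually refers to the wet-bulb temperature, which folds humidity into the measure of heat. As a rough illustration of why humidity matters, the sketch below uses Stull’s empirical approximation for wet-bulb temperature at sea-level pressure; this is a textbook convenience formula, not the method Sherwood and Huber used, and the example conditions are hypothetical.

```python
import math

def wet_bulb_stull(t_c, rh_pct):
    """Approximate wet-bulb temperature (deg C) from air temperature (deg C)
    and relative humidity (percent), using Stull's (2011) empirical fit.
    Valid roughly for 5-99% humidity and -20 to 50 deg C at sea level."""
    return (t_c * math.atan(0.151977 * math.sqrt(rh_pct + 8.313659))
            + math.atan(t_c + rh_pct)
            - math.atan(rh_pct - 1.676331)
            + 0.00391838 * rh_pct**1.5 * math.atan(0.023101 * rh_pct)
            - 4.686035)

HEAT_STRESS_LIMIT = 35.0  # deg C wet-bulb: prolonged exposure is lethal

# A hot but ordinary day, 30 deg C at 50% humidity: well under the limit.
print(round(wet_bulb_stull(30.0, 50.0), 1))   # 22.3

# A hypothetical much hotter, muggier day, 40 deg C at 75% humidity:
# the wet-bulb temperature crosses the survivability threshold.
print(round(wet_bulb_stull(40.0, 75.0), 1))   # 35.8
```

The point of the sketch is that a humid 40-degree day can be physiologically far more dangerous than a dry 45-degree one: evaporative cooling, the body’s last resort, stops working as the wet-bulb temperature approaches body temperature.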

Reaching such dangerous levels “is certainly doable,” Huber says. “It’s our decision whether or not to dedicate the next century to burning these reserves.... There used to be subtropical forests near the poles 50 million years ago, and that doesn’t sound so bad. But the fossil record closer to the equator is really poor, and that may be an indication that life was extremely stressed during these warm periods. If over half the surface area of the planet becomes inhospitable, it will not render Earth uninhabitable, but it will be unrecognizable and existentially challenging for the majority of the people, species and communities on Earth.”

As nightmarish as a runaway greenhouse seems, whether or not modern Earth is susceptible to one is perhaps an academic point. Microbes could endure and even flourish on a planet at the brink of runaway, but people would still be steam-cooked whether or not such a hothouse world tipped over into a more Venusian climate. Leaving aside other effects of global warming like rising seas, stronger storms, longer droughts and plummeting biodiversity, Kasting says, “the problem of heat stress alone could become lethal to humans well before any runaway happens, and that danger may be much closer than previously realized. This is serious enough to warrant our full attention.”