BOULDER, Colo. – As the West has warmed and dried over the past 30 years, headlines describing fire season have grown ever more apocalyptic: "epic" dryness, "monster" fires, new records for damage and devastation.
This year is no exception. The Waldo Canyon Fire has incinerated hundreds of homes in Colorado Springs, and every indication points to another big, early start to the wildfire season.
Recent research, however, suggests these severe conflagrations may be only a prelude. Climate stressors are putting increasing pressure on a "fire deficit" the West has accumulated over the past 100 years, say scientists who have compared today's burn rates with fire activity over thousands of years. As the West continues to warm, that debt will come due – possibly with interest – triggering fires that are fiercer and harder to contain, they warn.
"If you just look at what the current climate is like, the rate of biomass burning should be much higher than what we've observed over the 20th century," said Patrick Bartlein, a climatologist at the University of Oregon and a co-author of the study, published earlier this year in the journal Proceedings of the National Academy of Sciences.
Fire activity for the last 100 years has been remarkably low compared to the past three millennia, Bartlein and his colleagues concluded. Given just how much the West has warmed since the early 1900s, these scientists believe fire should be more common and widespread than it has been.
Fire activity hasn't grown with the warming climate, Bartlein said, because since about 1900 Westerners have worked hard to keep fire out. "This divergence between climate and fire activity is unsustainable," he added. "Eventually, nature will catch up."
In their research, Bartlein and his colleagues used a record of past fire activity obtained from cores of sediments extracted from the bottom of mountain lakes. These sediments contain layers of charcoal particles that fell from smoke plumes or were carried into lakes by streams after wildfires. These bands can be read like a timeline. Combined with other data, they allow scientists to reconstruct 3,000 years of wildfire history in the West.
Overlaying that history on a record of natural climate change, the researchers saw a pattern: More fire when the climate was warmer and drier, such as during the period known as the Medieval Climate Anomaly from about 950 to 1250; less fire during the cool and moist Little Ice Age from 1400 to 1700.
In other words, for most of the past 3,000 years, climate has determined fire activity.
That changed as the 19th century ended. The charcoal record shows that fire activity plummeted – and continued to do so in the 20th century even as a strong signal of global warming from human activities emerged.
Bartlein and his colleagues point to a number of factors for the change, including the introduction of cattle, which reduced fuel loads by eating and trampling grasses; fragmentation of the landscape; and vigorous suppression of any fires on public lands that did break out.
Recent trends suggest the fire deficit is now being paid back. Since the 1980s, fire frequency in the West has increased by more than 300 percent, and the annual acreage burned has jumped 500 percent, according to Anthony Westerling of the University of California's Sierra Nevada Research Institute.
Thomas Veblen, a researcher at the University of Colorado who studies wildfire and climate but who was not involved in the fire debt research, cautions that the broad-brush fire deficit picture does not reflect the situation everywhere in the West or in every mountain ecosystem. "It aggregates data over an enormous area," he said.
There are, for example, significant differences in fire behavior between the naturally open, grassy ponderosa forests common in the foothills of the Rockies and the denser Douglas fir and lodgepole pine forests higher up. These differences – key to managing wildfire risks – are not captured by the fire deficit research.