Climate Change is Making Wildfires Worse — Here’s How
From North Carolina to California, much of the United States is expected to face increased wildfire risk as rising greenhouse gas emissions bring warmer temperatures and shifting precipitation patterns.
Over the course of nearly three months in 2020, the August Complex Fire, fueled by extreme heat and severe drought conditions, burned more than a million acres across Northern California, destroying hundreds of buildings and forcing thousands of people to evacuate. Less than a year later, the Dixie Fire burned more than 963,000 acres throughout the region, resulting in more than $1 billion in damages.
Large-scale wildfires like the August Complex Fire and Dixie Fire are becoming increasingly common in the United States as climate change accelerates. Since 2000, an average of 70,072 wildfires have burned an average of 7 million acres annually across the country. That's more than double the 3.3 million acres burned annually in the 1990s, even though more fires occurred each year during that decade.
Robert Scheller, a professor of forestry and environmental resources at NC State’s College of Natural Resources, said greenhouse gas emissions continue to drive changes in the climate, contributing to warmer-than-average surface temperatures and shifting precipitation patterns — trends that are expected to increase the frequency, intensity and duration of wildfires across the U.S.
“Climate change is creating the perfect conditions for larger, more intense wildfires,” said Scheller, who uses geospatial analytics to examine the effects of climate change and human activities on long-term landscape health. “We’re already seeing fires that we didn’t expect to see until 2080.”
In 2021, the average surface temperature of the U.S. was 54.5 degrees Fahrenheit, 2.5 degrees above the 20th-century average. That figure is projected to rise by 3 to 12 degrees Fahrenheit by 2100, bringing more frequent and intense heat waves. At the same time, some regions are expected to receive less rainfall than usual. Together, these conditions will worsen droughts, making landscapes across the country more flammable.
Drought conditions reduce the amount of moisture in vegetation, and that moisture content determines how easily vegetation ignites when exposed to heat, whether from lightning or an unattended campfire. With less moisture, vegetation becomes more flammable because less heat is needed to drive off water before combustion, the series of chemical reactions that produces fire, can begin.
Much of the U.S. landscape is expected to face increased wildfire risk as droughts intensify, but California and other western states could be especially vulnerable because of decades of fire suppression. For most of the 20th century, prescribed fire was generally prohibited across the region because fire of any kind was viewed as dangerous, allowing dry vegetation to build up in forests.
“Fire suppression has caused western forests to become denser, so there’s a lot more fuel on the ground,” Scheller said. “Prescribed fire is a great management tool for reducing fuel loads. Unfortunately, many people don’t want land managers using it because they’re afraid it will escape and burn down their houses or cause other property damage.”
Prescribed fire is not without risk, but it is planned and conducted under specific environmental conditions. Wildfires, on the other hand, are influenced by a number of unpredictable factors and can be much more difficult to control. Once a wildfire starts (the vast majority are caused by people), wind can supply it with additional oxygen, causing it to spread across the landscape at a faster rate.
Wildfire is particularly hazardous in the wildland-urban interface, defined as any area where structures are built near or among natural lands with flammable vegetation. The wildland-urban interface contains more than 60,000 communities in the U.S. and continues to grow by approximately 2 million acres per year. In 2020 alone, wildfires destroyed almost 18,000 structures, 54% of which were homes. North Carolina has more wildland-urban interface acreage than any other state.
The U.S. government has spent about $1.9 billion per year fighting wildfires since 2016. Recently, the Department of Agriculture unveiled a 10-year, $50 billion plan to reduce wildfire risk on up to 50 million acres of land bordering vulnerable communities across 11 western states. The plan will double the agency's fuel-reduction efforts, with a focus on thinning overgrown forests and using prescribed fire to clear dead vegetation.
Although prescribed fire is widely used in North Carolina and other states throughout the Southeast, some areas could require additional measures to reduce wildfire risk, according to Scheller. In the southern Appalachians, for example, the terrain is rugged and often inaccessible, and the region's expansive wildland-urban interface makes it difficult for agencies to apply prescribed fire safely.
“It will really come down to having more resources nearby to fight fires when they emerge,” Scheller said. “A lot of the western states have bulldozers, trucks and planes ready at the beginning of summer. We don’t have nearly as much of that.”
Scheller added that educating the public about the dangers of accidentally igniting fires — and the benefits of prescribed fire — will also be necessary to mitigate wildfire risk across the Southeast and other regions. Research shows that wildfire prevention education can reduce wildfire-related losses and suppression costs.
“Equally important is educating the public about the benefits of prescribed fire, that not all fire is bad and that the right fire in the right place is a long-term solution to reducing fire risk and restoring native habitat,” he said.