Tuesday, August 18, 2015

Are More Wildfires the "New Normal"?

UPDATE:  Here are the latest cumulative statistics for the number of acres burned over the Northwest from the National Interagency Fire Center.   After a low fire season, we are finally at normal... and according to their projections, we should stay that way, helped by incoming cooler weather.

The media has a tendency toward pack behavior.   And the current favorite is to suggest that the fires hitting the Northwest are "the new normal."

The claim is that we can expect years in our immediate future to be like this one because of anthropogenic global warming.   With all the smoke and fire around eastern Washington and Oregon, it sounds compelling, right?

The trouble is that it is clearly not true.

There is no reason to expect that greenhouse gas-caused warming will produce more fires during the next few decades, although the story changes as we get to the end of the century.  Let's look at the situation more carefully.

There are a variety of ways we can show why the "new normal" hypothesis makes no sense.

As I have discussed in several of my blogs, the last year has been crazy warm, a huge anomaly from normal.  If the area was drying and warming due to the steadily growing greenhouse gas concentration in the atmosphere, one would expect a long-term trend in temperature, dry surface conditions, and snowpack.
But no such trend exists.

Let's start with snowpack, since much of the media has been claiming that low snowpack is a major element of this year's fires (something my friends in the wildfire community do not support, by the way).  Here is the amount of water in the Northwest snowpack (snow water equivalent, SWE) for the last 30 years.  The blue lines are the annual values and the red line is a smoothed curve of the data.

The snowpack on April 1, 2015 was the lowest on record.  But it is an outlier, and there is little apparent downward trend over the past three decades.  This is not the kind of variation you would expect if global warming were the culprit.

What about temperature?  Here is a plot from the NOAA "Climate at a Glance" website showing May-July temperatures over Washington State.  2015 is the warmest, although 1958 was close.   But there is really very little upward trend.

Let me highlight this by removing 2015 and the years before 1925, when the record was not reliable.  Do you see much trend?  Not really.

What about the trends of July temperatures over the eastern slopes of the Cascades, say at Ellensburg and Wenatchee, from 1925-2014?   Here they are.  A VERY small warming trend.

So this year does NOT represent the culmination of a trend toward heat waves and low snowpack; it is a huge anomaly, one associated with natural variability.

We can come to the same conclusion from basic physical understanding of the climate system.    Our extreme weather has been associated with a very anomalous configuration of the upper level flow, with a huge, persistent ridge over the West Coast and a trough over the east.  The eastern U.S. has been far COLDER than normal.

As I have discussed in previous blogs, there is no reason to expect such an amplified wave pattern with global warming.   Global climate models forced by increased greenhouse gases don't show this behavior.  Careful studies of changes in the upper flow during the past half century do not show it.  Theoretical work suggests that this hypothesis makes no physical sense.   One or two authors have pushed this idea ("the lazy jet theory"), but their work has been thoroughly disproved in the peer-reviewed literature.

Picture of the Chelan Fire, Courtesy of the Seattle Times

The bottom line is this:  the extreme snowpack and temperature conditions this year were a huge anomaly driven by natural variability.  They are not a "new normal".  Next year, even though it will be an El Niño year, will undoubtedly be less extreme.

Global warming WILL be a serious issue for the Northwest, but that is in our future.   This year gives you a taste of what the temperatures and snowpack will be like at the end of the century.  The cause will be different, but the end result will be similar.  We must prepare for the conditions of the end of the century, but we will have decades to do so.

Is there an upward trend in wildfires in the U.S.?  The answer is no.  Here are the latest year-to-date wildfire numbers for the U.S.   There are actually fewer fires this year, and acres burned are up... but there is no trend.

A graph from the Seattle Times showed the same thing.  Interestingly, the ST ran an updated version in print and online... but the online version disappeared.  I won't comment on why they removed it.

Regarding the eastern Washington wildfires, there is another issue.  The eastern slopes of the Cascades have burned for millennia.  Wildfire is a central part of the ecosystem there.   Too many folks have moved into the hills above Chelan, Wenatchee and elsewhere, putting themselves at risk.

For example, go to Google Maps and explore the Chelan area with the satellite option.  There are many homes up in the hills, located where no one should be (see below).   Just as folks should not build homes near frequently flooding rivers, no one should build homes where frequent fires have raged for thousands of years.

Another issue, mentioned by Dave Wilson in the comment section below, is poor timber management practices along the eastern Cascade slopes.  Contemporary eastern slope forests are very different from the originals, which were far less vulnerable to massive crown fires.  And, of course, having large numbers of people throwing cigarette butts and fireworks into a flammable environment is hardly natural.


Rod said...

Great post, Cliff, and I agree.

The only new normal I am worried about is my aching back during gardening season. The chiropractor can no longer get rid of the pain and stiffness in a couple of visits. Perhaps 64 years of age has something to do with it, you think?

Jarv said...

I'm a retired educator. In the nineties the state decided to pay us according to how many credits we earned after getting our degree. I took a five-day class developed by a former park ranger. As we sat around the campfire discussing various NW trees and drinking Bud, he told us about the wildfires in Yellowstone and recommended Alston Chase's book Playing God in Yellowstone. The instructor said it was just a matter of time before the forest surrounding Stehekin would go up in flames due to the policies about putting out fires. Forest fires are natural occurrences that actually promote new and healthy forests. I don't mean to be unsympathetic about the loss of homes in the Chelan area, but we are playing God by putting out naturally caused forest fires. I do wish them well.

Josh S. said...

I'm greatly relieved that there's no association between a warmer atmosphere and the prevalence of this wave train pattern. I realize there's a lot of misinformation in the media, but one good effect that can have is that people will be more concerned about global warming. Also, we can't exactly pinpoint when in the future 2015's PNW temperatures will be normal, right? I think it depends on GHG emissions in the next few decades.

Jason Jablonski said...

Great post Cliff, enjoyed reading it. Have another question for you though. Have you read anything further about this whole "farmers almanac says a rough winter vs the blob"

Where do you stand on that topic, and what is the farmers almanac exactly?

Bob said...

Someone crying "tipping point!" in 3... 2... 1... >;^}

re: cool summer in eastern North America - there is still a surprising amount of ice floating in southern Hudson Bay.

Jim said...

One of my colleagues sent me a 1890's era photo taken of the north shore of the Columbia River in the general vicinity of Fort Columbia. Towering over an understory of younger trees (indeterminate age but maybe 50+ years?) are the immense & huge hulks of dead trees (likely Sitka Spruce). The implication being a massive fire in the unknown past (early 1800's?). Scarborough Head was devoid of trees on its SW slope in 1805. Natural (lightning) and human (Native people's and later settlers) started fires that burned out of control were likely very common in the past.

Dean E Kurath said...

Great article. I'm relieved that this last year will not be the new normal. However I am not keen to buy a season pass at the ski resort just yet.

Question: If the temperature hasn't gone up and the snow pack has been steady, then why are all of the glaciers in the state receding?

iron said...

however, cliff, you have repeatedly written that this summer will not be a bad fire season because the moisture content of the soil was above normal. is that still holding true? do you still think moisture content correlates to fires? or, do you think a snowpack that lasted through most of the summer (like it should) would've been better?

any hiker could have told you early on that this summer was going to be bad - very bad. there was so little snow below 6,000ft, the writing was on the wall.

Cliff Mass said...

You misunderstood me. I never said the fire season would not be bad. I did say that the snowpack deficiency was not very important (and note that the big fires have generally been at lower elevations) and that we started the season with decent soil moisture, which was true. We did not have a big early season of wildfire in our region... until a few weeks ago, acres burned were well BELOW normal...cliff

Cliff Mass said...

The glaciers have been declining over the past century because temperatures have warmed modestly. The melting really started during the late 19th century..cliff

Dan McShane said...

Cliff: Many of the North Cascades glaciers went through an advance period from the late 1940s to the early 1970s. You are correct that there was retreat from "little ice age" advances from the late 1800s into the mid 1900s. Since the late 1970s the retreat of glaciers in the Cascades and Olympics has been very rapid despite no trend in April 1 snowpack. For the more intensively studied glaciers there was net loss even during most high April 1 snowpack years. It is obviously complicated, and the temperature record from high elevations is a bit short to have any confidence in the temperature change trend in the glacial areas.
I agree that fire ecology and behavior is complicated stuff, and assigning a single blame ignores that complexity. The June-August temperature trends might be of limited use. The summer temperatures are important for how fires that do start will behave, but more interesting would be how frequently temperatures exceed, say, 90 degrees. Is there a trend there? I would also think that the average temperature trends for February through May might be more important. The reason early concerns were raised about the fire season was that this spring was on average 4 to 5 degrees warmer than average in every month but April.
It's the drying-out period that should be of concern in terms of the length of the fire season, along with the frequency of very warm weather during the fire season, which can make fires uncontrollable. Just suggesting a few more things to look into if you want to weigh in on trends and climate.

graypdx said...

Being a regular reader but first-time commenter, I must say that while I appreciate many of the detailed analyses, I am surprised by the change of tone regarding this year's fire season. Now, I know that you never said that this year would not be a bad fire season, but you definitely did harp on media outlets for suggesting that this was poised to be a bad year. Now that it has become a bad year, instead of any sort of follow-up on your previous fire post (which did mildly suggest there was nothing abnormal to worry about because of a brief period of above-average precipitation in May), you are finding yet another media target to lampoon.

Now you are saying that media shouldn't be calling this the new normal for fire. Your evidence is the lack of a temperature warming trend. But temperature is only a small component of fire. The most important factor, and the reason that we may in fact be heading to a new normal for fire, is the fact that we have been practicing fire suppression for over 100 years, leading to a massive increase in fuels. So even with no warming as you suggest, fires that start (which they inevitably do) will tend to be much larger.

The problem of course, is that we may be entering a new normal for fire - based on a combination of fuels accumulation and slow climate change (which as you say, has been enough to cause substantial glacier retreat). Are you trying to say that we are not entering a new normal for fire? That we will not see continued intense and widespread fire activity in the PNW? Or are you saying that media outlets are not providing enough information to make such predictions (that only thorough, detailed analyses are sufficient?)?

Furthermore, I understand your contention that climate change in the PNW has been slow and likely not all that detectable. But some of the data you are presenting is pretty skewed and incomplete. Case in point is the fact that in this post you have presented several historical temperature graphs starting in 1925. That is, right before the start of the previously hottest 20 years on record. This is akin to the GW denier movement suggesting that global temperatures have not risen in the past 15 years by using (super hot) 1998 as a starting point - disingenuous at best. I am beginning to feel like this "science" website is trending a bit more towards advocacy.

sunsnow12 said...

It takes real courage to post this given the current political climate and the media frenzy. Just an outstanding post. Once again my hat is off to you, Cliff, not only for sharing your knowledge and expertise, but for having the cojones to speak the truth.

Kevin b said...

This article may be of interest to you as it discusses how climate change may be lengthening the fire season.


Mark said...

With regard to Seattle average temperatures: there is a clear upward trend, and I can demonstrate it.

Your averaging period of one year is too short, which makes the trend difficult to see in your graphs.

NOAA has created a wonderful online database: http://www.ncdc.noaa.gov/cag/time-series/us which I see you have also made use of.

I ran the database for Seattle 1948 - 2014 for annual average temperature. Next, I summed the annual anomaly for 10 year intervals: 1955 - 1964, 1965 - 1974, 1975 - 1984, 1985 - 1994, 1995 - 2004, 2005 - 2014. A total of six ten-year intervals.

Results for the sum of the anomalies and the average anomaly for each 10-year interval:
Interval         Sum    Avg
1. 1955-1964    -8.5   -0.85
2. 1965-1974    +2.8   +0.28
3. 1975-1984    +4.5   +0.45
4. 1985-1994    +9.2   +0.92
5. 1995-2004    +6.8   +0.68
6. 2005-2014    +9.2   +0.92

Clearly, the most recent 30 years (1985 - 2014, average anomaly +0.84 F) were warmer than the previous 30 years (1955 - 1984, average anomaly -0.04 F), a change of 0.88 F.

When 2015 concludes and I rerun the data for 2006 - 2015, it will easily surpass 0.92. Assuming 2015 is at least as warm as 2014, that will produce a ten-year average temperature anomaly of 1.14. It will be the warmest decade of the record.

BTW the sum of the anomaly for the first 10 year interval of record (1948 - 1957) is -20.2 (average -2.02)

When you compare the first 10 year interval of record to the most recent 10 year interval the change of the average 10 year anomaly is +2.94 Degrees F.

Seattle is much warmer today than it was 65 years ago.

Wish I could add graphs and tables to the comments.
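Mark's interval arithmetic is easy to reproduce. Here is a minimal Python sketch of the calculation he describes; the anomaly values below are placeholders matching two of his reported intervals, not data pulled from NOAA:

```python
# Sketch of Mark's decadal-anomaly calculation. The anomaly values used in
# the example are illustrative placeholders, not actual NOAA numbers.
def decadal_anomalies(years, anomalies, start, n_intervals, width=10):
    """Sum and average annual anomalies over consecutive fixed-width intervals.

    Returns a list of (first_year, last_year, sum, average) tuples,
    with sums and averages rounded to two decimals.
    """
    by_year = dict(zip(years, anomalies))
    results = []
    for i in range(n_intervals):
        lo = start + i * width
        vals = [by_year[y] for y in range(lo, lo + width) if y in by_year]
        total = sum(vals)
        results.append((lo, lo + width - 1,
                        round(total, 2), round(total / len(vals), 2)))
    return results

# Example: two made-up decades of constant anomalies (degrees F).
years = list(range(1955, 1975))
anoms = [-0.85] * 10 + [0.28] * 10
for first, last, s, avg in decadal_anomalies(years, anoms, 1955, 2):
    print(f"{first}-{last}: sum={s:+.1f}, avg={avg:+.2f}")
```

With real annual anomalies from Climate at a Glance in place of the constants, this reproduces his six-interval table directly.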

Greg Metcalfe said...

Lots of trend mentions, but in no case does a plot showing a trend line (and confidence interval) appear. In most cases, for this post, the Mark 1 eyeball is good enough, and I recognize that highly technical plots, with explanations, might put many people off. But really, no trend line _at all_?
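For readers wondering what Greg is asking for, a trend line with a confidence interval takes only a few lines. This is a generic ordinary-least-squares sketch on synthetic data, not the blog's actual temperature series:

```python
# Minimal sketch of a linear trend fit with an approximate 95% confidence
# interval for the slope. The data below are synthetic, for illustration only.
import math

def ols_trend(x, y):
    """Return (slope, intercept, ~95% CI half-width for the slope)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    # Residual variance -> standard error of the slope.
    resid = [yi - (intercept + slope * xi) for xi, yi in zip(x, y)]
    se = math.sqrt(sum(r * r for r in resid) / (n - 2) / sxx)
    return slope, intercept, 1.96 * se  # normal approx; use a t-value for small n

# Synthetic series with a built-in 0.02 deg/yr trend and no noise.
xs = list(range(1925, 2015))
ys = [0.02 * (x - 1925) + 50.0 for x in xs]
slope, intercept, ci = ols_trend(xs, ys)
print(f"trend = {slope:.3f} deg/yr +/- {ci:.3f}")
```

Whether the fitted slope's confidence interval excludes zero is exactly the "is there a trend?" question the post answers by eyeball.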

jeb Thurow said...

With all of the talk of gloom and doom on the news about our weather, I thought I would look at the drought monitor and at what history says about what's happening. I included a link to an article from 2011 that I thought was very interesting. Since I don't have access to a university library, I was limited in what I could access. I was hoping you could comment on whether this is a more (or less) plausible explanation. http://www.sciencedaily.com/releases/2011/02/110222122725.htm
6,000-year climate record suggests longer droughts, drier climate for Pacific Northwest
February 23, 2011
University of Pittsburgh
Researchers extracted a 6,000-year climate record from a Washington state lake showing that the American Pacific Northwest could not only be in for longer dry seasons, but also is unlikely to see a period as wet as the 20th century any time soon and will likely suffer severe water shortages.

Mary Sorman said...

Cliff, do you and your colleagues need to stay neutral on these matters (of fires) because of political backlash?
I understand it's not one simple answer, and I've gathered some good information from your posts.
It seems too many people are living in our forests and we don't allow natural burns to occur. It's one of the many reasons for a changing planet. If you could address how our industries change weather, I'd appreciate it: fracking, oil extraction

Michael Sweet said...

Tamino shows that the increase in temperature in Washington state has been statistically significant over the last 100 years. Your claim that temperatures have not increased much is false. He has other citations on his site of peer reviewed literature that show the increase in large fires in the Western USA is related to climate change. If you refer to peer reviewed papers instead of speculating without data analysis you will be more believable.

Jim said...

Tamino observes:

Over the last few decades there has been quite an increase in acres burned by wildfire in the U.S. And it is not due to factors like land use or fire control practices. Those factors have an effect to be sure, but the science has been studied in detail and the result is clear: the real cause of the tremendous increase in wildfire burn area is man-made climate change: global warming. The increase is, not to put too fine a point on it, "statistically significant."

ginnaville said...

Big fires are a big business these days. I am sad that anybody has to lose a home, business or place of work. Thirty years ago there was not anywhere near the current level of resources to be thrown at large fires. Sixty years ago, even less. Ninety years ago, they just let the remote ones burn. DC-10 retardant drops on residential areas, US Army deployments, caterers, mobile showers and mutual aid from the City of Seattle were non-existent, or very limited. (The SFD has dispatched apparatus and personnel to Fruitland and Conconully this week.) As the amount of protection increases, the amount of construction and sprawl rises. Or, vice versa. The National Interagency Fire Center (nifc.gov), the source of the black graph in this post, keeps track of a lot of statistics. The definition of a fire is 0.1 acre (45' x 100') or more in any of the jurisdictions tracked. Acres burned looks very large, but keep in mind some of that is grassland. Over five million acres this year are from Alaska alone.

CC said...

Iron is right, anyone who spends a lot of time in the mountains knows that everything is 1 to 1.5 months early this year: the meltout, the growth and die-off of herbaceous plants, the huckleberry season, the start of color change and drying of the leaves of woody plants, the general drying up of everything. As for soil moisture, I have been building trails for 20 years at one location west of the crest and have never seen it this dry. We all saw this coming.

As for your "friends in the wildfire community" telling you early meltout is of no relevance for fire season, how about some references. My friends in the wildfire community tell me just the opposite.

Colleen said...

Too many people are building where they shouldn't build ~ YES! Thank you for including this oft-unspoken but relevant point. And not only do they put themselves at risk, as you say; they also increase risks for those willing to fight the fire.

David R said...


Re: "I never said the fire season would not be bad. I did say that the snowpack deficiency was not very important (and note that the big fires have been generally at lower elevations)..."

Melted snow (water) would have a tendency to run downhill towards the lower slopes, right?

claimsguy said...

There are some who disagree with your analysis. https://tamino.wordpress.com/2015/08/19/cliff-mass-picking-cherries-in-full-denier-mode/

Matt M said...

> "Is there an upward trend of wildfires in the U.S? The answer is no."

Are you sure? You show 10 years of "year-to-date" data after this. Is this the entire basis of your conclusion? Or are you only talking about the number of wildfires (not acres burned)? I can see a number of ways in which you could argue you are correct, but I don't think you could argue that it is not likely to mislead.

1) AR5 says "since the mid-1980s large wildfire activity in North America has been marked by increased frequency and duration, and longer wildfire seasons"

The following trend is detected but not attributed "Increases in wildfire activity, including fire season length and area burned by wildfires in the western USA and boreal Canada"

2) The source you use clearly shows an upward trend in acres burnt using just about any timespan you want. Number of wildfires does not.


3) The first paper I came across which has long-term trends (but only looks at the west) "found significant, increasing trends in the number of large fires and/or total large fire area per year." Abstract below.

"We used a database capturing large wildfires (> 405 ha) in the western U.S. to document regional trends in fire occurrence, total fire area, fire size, and day of year of ignition for 1984–2011. Over the western U.S. and in a majority of ecoregions, we found significant, increasing trends in the number of large fires and/or total large fire area per year. Trends were most significant for southern and mountain ecoregions, coinciding with trends toward increased drought severity. For all ecoregions combined, the number of large fires increased at a rate of seven fires per year, while total fire area increased at a rate of 355 km2 per year. Continuing changes in climate, invasive species, and consequences of past fire management, added to the impacts of larger, more frequent fires, will drive further disruptions to fire regimes of the western U.S. and other fire-prone regions of the world."


Jim said...

With Climate Change, a Terrifying New Normal for Western Firefighters

This Yale e360 video, "Unacceptable Risk: Firefighters on the Front Lines of Climate Change," produced by The Story Group, focuses on the people battling to save lives and property in a rapidly changing environment.

"We're being asked to battle fires that didn't exist 20 years ago," says veteran firefighter Don Whittemore. "We're seeing a level of fire and an intensity of fire and a risk to firefighters that hasn't existed in the past. On a day-to-day basis we're being surprised -- and in this business, surprise is what kills people."

Gene Riddell said...

A huge anomaly? Well, Cliff, I hope you are right, and your statistics are impressive, but my human instinct or gut feeling says otherwise. In any case, I enjoy reading your blog.

Dalton said...

Those living in glass houses on the Cascadia fault should not throw stones. The homes that have burned in the last two years, in Pateros, in Wenatchee, and in Chelan have been near major waterbodies and surrounded by grass and sagebrush, not in the overstocked forests. Extreme fire behavior is to blame much more so than irresponsible development.

richard583 said...

.. No, not if we allow for a definition of "normal" covering a timeframe of more than five years. ....

Unknown said...

Cliff - You are looking only at this year, not at the upward trend in fire seasons over the last few decades. Your comparison of this year versus the "normal" is skewed, since the normal is always being averaged upward.

I was a wildland fire fighter from 1994 to 2000. Every two years (even years) was the worst fire season on record (except 1998, if I recall correctly). It has only become worse since. There are factors of a warming climate at work here; such as bark beetles moving further up slopes as temperatures increase and killing more trees, etc.

Yes, this year is a huge anomaly. That doesn't exclude global warming's effect on overall fire seasons. Another anomaly? Very little lightning in the Northwest this year (until recently), which is the major reason the burned acreage is within normal bounds.

JeffB said...

Nice to see a real scientist like Cliff Mass letting real science lead where it may, versus a fake scientist (actually just a blogger), Tamino, aka Grant Foster, whose broken scientific compass always leads in the same alarming direction.

Steve Haffner said...

Regarding Figure 1, which shows cumulative acres burned by day of year: how is it possible for the average number of acres burned to take a dip in mid-September? It seems the slope of the curve should never be negative.

eprman said...

Cliff - I am interested in the discussion about glacial retreat mentioned in a couple of comments. The data I find on the internet for temperature over the past 100,000 years or so seems to show that since the earth entered this interglacial period about 8,500 to 10,000 years ago, the temperature variations have been relatively small, on the order of 1-2 degrees C. These fluctuations would have caused some advances and retreats of glaciers but do not seem large enough to have created the large glaciers around the world. So it would seem to me that the ongoing glacial retreat is a continuation of the melting of glaciers since the warming that followed the last ice age. The level of worldwide temperature is just too high to sustain the glaciers created during the last ice age. I understand that glacial retreat has accelerated over the past 30-40 years; however, the retreat would have happened without man-induced warming, just at a slower rate. I would like to see your thoughts.

Ted Conroy said...

Hey Cliff I am a little confused about this:

As I have discussed in previous blogs, there is no reason to expect such an amplified wave pattern with global warming. Global climate models forced by increased greenhouse gases don't show this behavior. Careful studies of changes in the upper flow during the past half century do not show it. Theoretical work suggests that this hypothesis makes no physical sense. One or two authors have pushed this idea ("the lazy jet theory"), but their work has been thoroughly disproved in the peer-reviewed literature.

are you referring to this paper