July 15, 2018

The Technology That Can Provide Society with Actionable Information for Dealing with Global Warming

The long-term impact of global warming is one of the key issues of our time.

Greenhouse gas concentrations are rising rapidly, and it is becoming clear that mankind will not significantly reduce emissions during the next few decades.
Global climate models, forced by increased greenhouse gases, suggest major changes in the Earth's climate and weather regimes, especially by the middle to end of this century. 

But these models do not provide actionable information. Why?

(1) Global climate models do not possess fine enough resolution to describe terrain, thunderstorms, and other local effects that can be critical for determining future climate change in many areas (like the Northwest).

(2) Global climate models do not necessarily agree on even the large-scale impacts of climate change. And there is additional uncertainty regarding how much greenhouse gases will increase during this century.



Although global climate models have issues, society still needs future climate impact information. 
  • Infrastructure must be built or adapted to deal with changes in climate, with examples including dams, reservoirs, drainage and water systems, coastal and riverside facilities, and more.
  • Our management of the environment, such as forests, wetlands, and the Sound, may need to be altered with upcoming climate change in mind.
  • And we may well have to alter where we live, build, and farm.
Clearly, we need reliable information on how our local weather and climate will change as greenhouse gases increase: information that provides both our best estimates of what will happen and the uncertainties in those estimates. And we don't have it.

This blog describes a proposed effort designed to provide the necessary regional climate forecasts: one based on state-of-the-science modeling that is both high resolution and probabilistic, taking advantage of the latest scientific and modeling advances.


But to make this effort a reality will take resources, both in terms of personnel and computer time.

This blog proposes the development of a regional climate modeling center and is a call for the support needed to make it a reality: from local governments, interested local businesses, wealthy individuals, large numbers of modest donors, or perhaps a foundation.

Building a Regional Climate Prediction Effort

Global models can get the large-scale features correct, but they can't deal with the critical smaller-scale local terrain, surface, and water features. How do we solve this problem?

Regional Climate Models (or RCMs).

The idea is to run high-resolution RCMs on small domains over an extended period, using the global models to drive the boundaries of the small RCM domains. This is called dynamical downscaling. The high-resolution domains are small enough that the computer resource requirements are reasonable, yet of high enough resolution to get the local features correct.
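For the technically inclined, here is a toy Python sketch of the nesting idea (the arrays and function are hypothetical, not any modeling system's actual API): the coarse global-model fields are interpolated to the edge points of the regional domain, which then serve as the lateral boundary conditions while the RCM's interior evolves at high resolution.

```python
import numpy as np

def boundary_conditions(global_field, global_lats, global_lons,
                        edge_lats, edge_lons):
    """Interpolate a coarse global-model field (2-D, lat x lon) to points
    along a regional domain's edge. Toy version: nearest neighbor; real
    systems interpolate carefully in space and time."""
    i = np.abs(global_lats[:, None] - edge_lats[None, :]).argmin(axis=0)
    j = np.abs(global_lons[:, None] - edge_lons[None, :]).argmin(axis=0)
    return global_field[i, j]

# Fake ~2-degree global temperature field feeding a regional domain's
# northern edge (all values are made up for illustration).
glats = np.arange(30.0, 60.1, 2.0)
glons = np.arange(-140.0, -109.9, 2.0)
temp_k = 280.0 + np.random.randn(glats.size, glons.size)
edge_lats = np.full(50, 49.0)
edge_lons = np.linspace(-125.0, -116.0, 50)
print(boundary_conditions(temp_k, glats, glons, edge_lats, edge_lons)[:5])
```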

Several efforts are doing this, including a group of us at the University of Washington. Let me show you an example. On the left below is the winter (DJF) temperature change predicted for the end of the 21st century by the ECHAM5 global climate model, assuming a continued rise in greenhouse gases. No local details at all, and it doesn't look very realistic.
And on the right is a simulation from a regional climate model driven by the global model.  You can see the influence of terrain, with areas of very large warming due to melting snow on the slopes.  Much more warming and undoubtedly more realistic.
Only a high-resolution regional climate model can realistically predict the reduction in snowpack on our local terrain, since the global models lack even our major mountain ranges (e.g., the Cascades and the Olympics).


Long experience by my group and others suggests that a regional climate model must run with a grid spacing (the distance between grid points) of 12 km or less to start to do a reasonable job with Northwest terrain.
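Some back-of-the-envelope arithmetic shows why that 12-km requirement is expensive (the 1500-km domain below is illustrative, not our actual configuration): halving the grid spacing quadruples the number of grid columns and also forces a roughly halved time step, so cost grows about as the cube of the refinement.

```python
# Rough cost scaling for an assumed 1500 km x 1500 km domain.
domain_km = 1500
for dx_km in (36, 12, 4):                      # candidate grid spacings
    n = domain_km // dx_km                     # grid columns per side
    rel_cost = (36 / dx_km) ** 3               # finer in x, y, and time step
    print(f"{dx_km:>2} km: {n} x {n} columns, ~{rel_cost:,.0f}x the 36-km cost")
```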

But then there is the uncertainty issue. You can't simply complete one regional climate run and go home (which several groups have done in the past). Just as in weather prediction, you must run a collection of high-resolution runs (called an ensemble), each starting slightly differently and each using somewhat different physics (how we simulate processes such as radiation, condensation, precipitation, etc.). And we need to do a variety of runs with different amounts of greenhouse gases, since there is uncertainty about their future concentrations.

So an ensemble of regional climate runs is necessary to provide a reasonable estimate of regional climate uncertainties and to allow the calculation of probabilities of potential outcomes.  One also needs to complete statistical calibration (more on this later).
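A toy example (with made-up numbers, not our actual runs) shows what the ensemble buys you: instead of a single best guess, you can hand a planner the probability of exceeding a threshold that matters to them.

```python
import numpy as np

# Hypothetical end-of-century DJF warming (deg C) from ten ensemble members.
warming_c = np.array([3.1, 4.4, 3.8, 5.0, 4.1, 3.5, 4.7, 4.9, 3.9, 4.2])

threshold_c = 4.0                                 # a planner's threshold
p_exceed = (warming_c > threshold_c).mean()       # fraction of members above
print(f"Ensemble mean warming: {warming_c.mean():.1f} C")
print(f"P(warming > {threshold_c} C) ~ {p_exceed:.0%}")
```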

Our results so far

No group had attempted to create a large regional climate model ensemble until a group of us at the UW began such an effort. Our initial funding was mainly from Amazon, which provided 18 months of some staff support and tens of thousands of dollars of time on their cloud. Amazon gave us a good start, but that funding has run out, and our project is now running on fumes.

But we have gotten far enough in to give you a taste of the power of the approach--four of the regional climate runs are complete.  So let me give you a view of Northwest climate change that no one has seen before.

Here are four high-resolution regional climate model forecasts of total winter (DJF) precipitation at Seattle, driven by four major global climate models, from simulations running from 1970 through 2100. Black dots indicate observed values. The model values are within the spread of the observations during the contemporary period...a good sign. The future? A slow but modest increase in winter precipitation. Good for water resources if we can store it.
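(A side note for those who compute such statistics: a DJF total has one bookkeeping subtlety, since December belongs with the following January and February. A short pandas sketch with fake daily precipitation shows the usual convention.)

```python
import numpy as np
import pandas as pd

days = pd.date_range("1970-01-01", "2100-12-31", freq="D")
precip_mm = pd.Series(np.random.gamma(0.5, 8.0, len(days)), index=days)  # fake

winter = precip_mm[precip_mm.index.month.isin([12, 1, 2])]
# December counts toward the following year's winter (Dec 1999 -> DJF 2000).
winter_year = winter.index.year + (winter.index.month == 12)
djf_total_mm = winter.groupby(winter_year).sum()
print(djf_total_mm.head())
```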

How about winter temperature? Not much change between 1970 and now (consistent with observations), followed by a slow rise starting in the 2030s, with temperatures at the end of the century up by about 4-5C (about 8F). Seattle will have much more pleasant temperatures in the winter, but that has a downside: reduced snowpack.


Our goal is to have at least 12 of these runs done by the end of the year if we can find the funding.

What needs to be done

To provide the best possible regional climate forecasts, we need to run 30-50 regional climate simulations for the Pacific Northwest, using a full range of climate models, greenhouse gas scenarios, and variations in model physics. We know how to do this. And once the runs are complete, we need to perform statistical calibration, including correcting the biases evident over the contemporary period where we have observations (such as 1970-2015).
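One common form of that calibration (a minimal sketch, not necessarily the exact method we will adopt) is quantile mapping: each model value is assigned its percentile within the model's own historical record and then replaced by the observed value at the same percentile.

```python
import numpy as np

def quantile_map(model_hist, obs_hist, model_future):
    """Bias-correct future model values by matching quantiles: find each
    value's percentile in the historical model record, then return the
    observed value at that same percentile."""
    ranks = np.searchsorted(np.sort(model_hist), model_future) / len(model_hist)
    return np.quantile(obs_hist, np.clip(ranks, 0.0, 1.0))

# Hypothetical usage with winter precipitation series (all names illustrative):
# corrected = quantile_map(model_1970_2015, obs_1970_2015, model_2016_2100)
```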

This is all doable within 1-2 years, but it will take resources.  Support for 2-3 personnel.  Substantial computer resources.  But doing so will give infrastructure planners in the Northwest extraordinarily valuable information and greatly assist in increasing the resilience of our region to upcoming climate change.  You can't plan for what you don't know about.


So how can we get the resources to make these regional climate simulations a reality?

Might a regional or national foundation provide the assistance?  Or a wealthy individual?  If so, please contact me.

We have appealed to state and local agencies, talking about setting up a regional climate change prediction consortium, modeled after our very successful regional weather prediction consortium. So far, there has been only limited interest.

Individual donors can also help maintain our current efforts at this UW website.


But somehow, we must find a way to make this happen. There is so much talk about climate change, even a carbon initiative on our upcoming ballot. Is it not amazing that investments have not been made in securing the best possible information regarding the impacts of climate change on our region?

Finally, I have prepared a video that goes into more detail about the necessity and scope of the proposed effort:

12 comments:

  1. Has your team considered using distributed computing via the Berkeley Open Infrastructure for Network Computing (BOINC) as an economical way of running your regional climate models? Much like the SETI@home project, the calculations would be completed by using volunteer computer resources.
    https://boinc.berkeley.edu/

    I for one would jump at the opportunity to run a project such as this on my computer!

  2. SETI and other large-scale projects make use of software to spread work units out: https://boinc.berkeley.edu/

    The coding is beyond me, sadly, but there's no shortage of programmers in Seattle.

  3. So the models that can’t and don’t properly account for atmospheric structure and movement at the global level will work at the regional level? And 4-5C by the end of the century is going to be yet another wildly missed prediction. 30 years ago the predictions were that we would be up 4C by now...

  4. What is the earth's optimal temperature?

    Replies
    1. Optimal for what?
      Plant life loves a CO2-rich atmosphere. Reptiles like it warm, mammals not too warm. CO2-rich/acidic oceans are favored by critters like squid, but are detrimental to many other species.

      Earth has "been there done that" many times, without humans.

  5. The revenue source is right in your post: Initiative 1631! Perhaps you can work with proponents of the carbon tax to get some of the revenue spent on regional climate modeling.

  6. What precipitation input data are being used for the runs? I have come to learn that "the agencies" are using only three lowland (banana belt) weather stations for Whatcom County 'drought' monitoring, when at least half of the county experiences substantially wetter conditions (at least two or three times as much, even in the summer).

    I do think your quest for detailed modeling is worthwhile - I hope you get the funding that you need. My hope is that really comprehensive empirical data will be fed in.

  7. @Organic Farmer

    I would like to see the following bodies of literature:

    Studies of dangers of warming
    Studies of benefits of warming
    Studies of dangers of cooling
    Studies of benefits of cooling

    Then, we could make an informed decision about what might be in the earth's best interest.

    Have each of the four areas above been the subject of the same level of scientific interest, funding, and media exposure?

    If not, why not?

  8. One of the biggest contributors to “global warming” numbers being so inflated? UHI (the urban heat island effect) and misplaced sensors. Urban sprawl as well.

  9. @Organic Farmer

    Objectively, making an educated hypothesis about the Earth's optimal temperature would require four bodies of literature:

    Studies of the dangers of warming
    Studies of the benefits of warming
    Studies of the dangers of cooling
    Studies of the benefits of cooling

    Have all four areas of inquiry received the same amount of scientific interest, funding, and media attention?

    If not, then why?

  10. Leo knows that all 4 of his points have been studied extensively, but he's trying to post a 'gotcha' question with the assumption that somehow scientists haven't considered his points. There are whole fields of science devoted to those questions (especially looking at past climates) and scientists are of course not stupid or naive enough to overlook something so obvious.

  11. @jayemarr & unknown: I believe these simulations are what's known as "tightly coupled" in the high-performance computing world. Meaning that they require ongoing, very low-latency communication between the nodes in a compute cluster (latency thousands of times lower than over the Internet, at minimum), as the nodes work cooperatively during a run. So technologies like Remote Direct Memory Access, InfiniBand, 3-D toroidal interconnects, etc., are used.

    BOINC was not designed for those sorts of problems--just the ones handled by a standard client/server Internet architecture. Distributed computing via the Internet has done some amazing things, but sometimes you really do need a supercomputer.


