May 15, 2013

A New Chapter for U.S. Numerical Weather Prediction

Major news to report.

The National Weather Service will be acquiring a radically more powerful computer system during the next year, one that could allow the U.S. to regain leadership in numerical weather prediction.  Used wisely, this new resource could result in substantial improvements in both global and regional weather predictions.

Using $24 million from the Superstorm Sandy supplemental budget, the National Weather Service will acquire two computers with a capacity 37 times greater than what it uses today.  We are talking about a transition from 70 teraflops right now (and 213 teraflops this summer) to 2600 teraflops in 2015.  (A teraflop denotes a trillion calculations per second.)  Such computations are spread over tens of thousands of processors.
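As a quick sanity check on those figures (a trivial sketch using only the numbers quoted above):

```python
# Capacity jump implied by the figures in this post.
current_tflops = 70    # today
interim_tflops = 213   # this summer
future_tflops = 2600   # planned for 2015

print(f"vs. today:       {future_tflops / current_tflops:.0f}x")  # ~37x
print(f"vs. this summer: {future_tflops / interim_tflops:.0f}x")  # ~12x
```

So even relative to this summer's interim upgrade, the 2015 system is more than an order of magnitude larger.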

This new system would give the National Weather Service world-class computer resources and should nudge its Environmental Modeling Center a bit ahead of the current gold-standard weather prediction entity, the European Center, in raw computer power: the essential requirement for weather prediction.


Although this is unalloyed good news, one should note a few important facts:

1.  The National Weather Service does FAR more than the European Center, which only runs global models.   The U.S. has done an inadequate job in regional and national prediction, most acutely in running high-resolution ensemble forecasts, which need to be at 2-4 km grid spacing, not the current 16 km (the resolution-scaling sketch later in this post shows why that is so expensive).  My back-of-the-envelope estimate is that the NWS needs at least ten times more computing power than even this new acquisition will give it to be truly state-of-the-art BOTH globally and locally.

2.  The new computer only gives the NWS the potential to be the best.  It needs to use the best approaches for data assimilation, model physics, and the use of observations, which it often does not today.  In the past there have been all kinds of excuses about lack of computer power.  Those excuses are gone now.  And the NWS needs to develop a closer and more interactive relationship with the research community, something it has failed at in the past.

3.  Even the creaky, small computer they use now has been applied inefficiently and wastefully.  This kind of approach, with lots of legacy products, old models, and a lack of cost/benefit analysis, needs to be changed.  For example, a huge amount of the current computer time is used for four-times-a-day runs of the Climate Forecast System (several-month simulations using the global GFS model).  This makes little sense: why four times a day?  And why run their global model (the GFS) out to 16 days, four times a day?  The European Center doesn't!   Perhaps do so twice a day, with shorter runs (192 hours) the other times (a rough estimate of the savings appears in the sketch after this list).

4.  The NWS needs to use other available computers more effectively for operations.  For example, there is the huge NOAA Fairmont machine that is available for NWS use.  Less time-critical operational runs, such as the Climate Forecast System runs noted above, could be moved there.
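To make point 3 concrete, here is a minimal back-of-the-envelope sketch, under the loud assumption (mine, not an NWS figure) that compute cost scales roughly linearly with simulated forecast hours; it ignores data assimilation, post-processing, and I/O:

```python
# Rough comparison of the daily GFS load under the current schedule versus
# the twice-daily-full-length schedule suggested in point 3 above.
# Assumption (illustrative only): cost ~ simulated forecast hours.

current = 4 * 384              # four runs/day, each out to 16 days (384 h)
proposed = 2 * 384 + 2 * 192   # two full runs plus two 192-hour runs

print(f"current:  {current} forecast-hours/day")   # 1536
print(f"proposed: {proposed} forecast-hours/day")  # 1152
print(f"savings:  {1 - proposed / current:.0%}")   # 25%
```

Even under that crude assumption, trimming two of the four daily runs to 192 hours frees roughly a quarter of the deterministic GFS load for higher-priority work.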

I can provide many other examples of inefficiency and waste in the current usage.

You think that the new computer is so big that we don't have to worry about efficiency?  Think again.
If you want to double the horizontal resolution of a weather prediction model (and we REALLY want to do this), you need roughly EIGHT TIMES more computer power: twice the grid points in each horizontal direction (four times as many in total), plus a time step cut roughly in half to keep the computation numerically stable.  There is a reason that numerical weather prediction requires the most powerful computers on the planet!
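A minimal sketch of that scaling (my own illustration, not an operational cost model); it also shows why the 2-4 km ensembles in point 1 are so much more expensive than today's 16 km runs:

```python
# Relative compute cost when horizontal grid spacing shrinks by `refinement`x.
# Illustrative only: ignores vertical resolution, physics, and I/O costs.

def cost_factor(refinement: float) -> float:
    horizontal = refinement ** 2  # more grid points in both x and y
    time_steps = refinement       # CFL stability: smaller dx -> smaller dt
    return horizontal * time_steps

print(cost_factor(2))       # doubling resolution -> 8x the computer power
print(cost_factor(16 / 4))  # 16 km -> 4 km ensembles (point 1) -> 64x per member
```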

I will end by noting that this huge improvement did not occur because NOAA management had planned carefully and worked to garner the necessary resources over time.  They have irresponsibly let U.S. numerical weather prediction and the NWS slide during the past decade, and Congress has not been sufficiently attentive to the problems.   This great advance occurred due to the intense hue and cry from the meteorological community, users of weather information, and the media.  Blogs and newspaper articles documented the deficiencies, and private-sector companies have complained about paying exorbitant fees to the European Center to get state-of-the-art forecasts.   It shows the power and influence of the public and the weather community when they can document both the need and the deficiency, and push their case with the new communication tools of the 21st century.

Without Hurricane Sandy we would have the same old computer!

And it took a great disaster, Hurricane Sandy, to display the decline in U.S. numerical weather prediction in a concrete and compelling way.  U.S. weather prediction can now move on a new and better road if NWS and NOAA leadership are willing to follow it.



3 comments:

  1. Good news for a change on this front. Hopefully this also bodes well for the rest of the field, looked at more broadly.

  2. Cliff,

    Excellent blog as usual. As you know, I would rather they focus first on mesoscale modeling and then on the medium-range models (i.e. the European). Because of the amount of extreme weather in the U.S., and because so many important features are mesoscale, we can wait a few more years to catch up to the Europeans at Day 5 and beyond. We can't wait in the mesoscale.

    Mike

  3. I would like to apply some pressure to help resolve some of these resource and management issues at NOAA. Who do we talk to to make this happen? Specific Congresspersons, Senators, or...

    Thanks for what you do, Cliff; love your blog.


