February 23, 2016

The National Weather Service's New Supercomputers are Operational!

There has been a flurry of media attention over the last few days for a truly positive development:  the National Weather Service now has two state-of-the-art supercomputers dedicated to weather prediction.

For years, I have complained in this blog about the National Weather Service gravely lagging in computer resources;  three years ago, its weather computers had one-tenth the capacity of those at the European Centre for Medium-Range Weather Forecasts (ECMWF).   No longer.  We are now modestly ahead of ECMWF.

One of the two new weather supercomputers

The two new supercomputers, named Luna and Surge, are large Cray XC40 models purchased from Cray Inc., a Seattle-based company.  Each runs at 2.9 petaflops (2.9 quadrillion floating point operations per second), and each has nearly 100,000 processors.  One machine handles the operational weather prediction models; the other is for backup and research.  Impressive.
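
As a rough sanity check on those numbers (a back-of-envelope sketch using only the approximate figures quoted above), the per-processor throughput works out to about 29 gigaflops:

    # Back-of-envelope check using the approximate figures quoted above.
    peak_flops = 2.9e15   # 2.9 petaflops = 2.9 quadrillion operations per second
    num_cores = 100_000   # "nearly 100,000 processors" per machine

    flops_per_core = peak_flops / num_cores
    print(f"~{flops_per_core / 1e9:.0f} gigaflops per processor")  # prints ~29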

The implications of these new machines are substantial.  With third-class supercomputers, the U.S. models were run at relatively coarse resolution.  Our data assimilation systems (how one uses observations to create a physically realistic 3D description of the atmosphere) were 10-20 years behind ECMWF's.  The ensembles we ran were too small, producing inferior probabilistic forecasts.  Quite honestly, it was embarrassing that the nation with the largest weather research community, and the one that invented numerical weather prediction, allowed its weather computing resources to lapse in such a profound way.
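
To give a flavor of what data assimilation actually does, here is a minimal, illustrative sketch in Python of the textbook scalar analysis update, which blends a model's background forecast with an observation, weighting each by its error variance.  This is a toy example with made-up numbers, not how any operational NWS or ECMWF system is implemented; real systems solve this problem for millions of variables at once.

    # Toy scalar data assimilation update (optimal interpolation / Kalman form).
    # Not an operational algorithm; operational systems (3D-Var, 4D-Var,
    # ensemble filters) solve the same problem for millions of variables.
    def analysis_update(background, observation, var_b, var_o):
        gain = var_b / (var_b + var_o)           # weight given to the observation
        analysis = background + gain * (observation - background)
        var_a = (1.0 - gain) * var_b             # analysis uncertainty shrinks
        return analysis, var_a

    # A 6-hour forecast says 10.0 C at a station; an observation says 12.0 C.
    # The analysis lands between the two, closer to the more trusted source.
    print(analysis_update(background=10.0, observation=12.0, var_b=1.0, var_o=0.5))
    # -> (11.33..., 0.33...)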


But the embarrassment of inferior forecasts (such as for Superstorm Sandy) encouraged Congress and NOAA management to finally deal with the situation, and the new computers were one result.   NOAA leaders such as Kathy Sullivan and Louis Uccellini championed the new machines.

But to get maximum advantage from these new computers, their immense capacity needs to be used wisely.  I have strong opinions on what wise use would mean, based on studying this issue for years and on my service on a national advisory committee to the National Weather Service (UMAC).   The computers could provide:

1.  High-resolution convection-allowing ensembles over the continental U.S., which could radically improve forecasting of thunderstorms.

2.  Much higher-resolution global ensembles (at perhaps 15-20 km grid spacing), which would greatly improve probabilistic global prediction (see the sketch after this list).

3.  Far better data assimilation, including enhanced use of satellite observations.
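
To illustrate why ensemble size matters for points 1 and 2 above, here is a hypothetical sketch (the rainfall amounts are made up, not output from any real forecast system): the forecast probability of an event is simply the fraction of ensemble members predicting it, so a small ensemble can only resolve coarse probability steps.

    # Hypothetical sketch: probability of more than 25 mm of rain at a point,
    # estimated as the fraction of ensemble members exceeding the threshold.
    # The member values below are invented for illustration.
    members = [12, 31, 8, 27, 44, 19, 6, 30, 22, 15]   # a 10-member ensemble
    threshold_mm = 25

    prob = sum(m > threshold_mm for m in members) / len(members)
    print(f"P(rain > {threshold_mm} mm) = {prob:.0%}")  # prints 40%

    # With only 10 members, probabilities come in 10% steps; larger,
    # higher-resolution ensembles give finer and better-calibrated odds.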

But let me be honest:  I have my concerns.   There are some folks in the National Weather Service who want to hold on to outdated legacy systems, such as the Short-Range Ensemble Forecast (SREF) system and the poorly performing NMMB modeling system.  There are plans to add an unproven NAM-RR system (a Rapid Refresh system using the inferior NMM model) that would waste a huge amount of resources.  Others want to run the poorly performing Climate Forecast System (CFS) model out to 15 months (currently only 9 months).  These computers should be seen as an opportunity to clean house and to modernize and rationalize the National Weather Service modeling suite, an opportunity that should not be missed.


Quite frankly, even with these new machines, the National Weather Service is still short of the computer power needed to provide the nation with state-of-the-science weather prediction.  The European Center only does global prediction, while the National Weather Service does global, regional, and local prediction.   Certainly, the NWS could profitably use 5-10 times more computing resources, and the payback would be substantial.

The National Weather Service will soon decide on its new global model.  There are two finalists:  one (the NCAR MPAS model) would allow the NWS to combine forces with the vast U.S. research community to produce a superior modeling system.  The other (the NOAA GFDL FV3) would lead to decades of isolation and the use of an inferior system at higher resolutions.   I will blog about this critical decision soon.

But no matter what happens, these new computers will result in substantial weather forecasting improvements in the U.S., something that all Americans can note with some satisfaction.

Announcements

Weather Forecasting:  Humanity's Greatest Achievement?

I will be giving a talk on March 16th at 7:30 PM in Kane Hall on the UW campus on the history, science, and technology of weather forecasting, as a fundraiser for KPLU.  General admission tickets will be $100, and VIP tickets that include dinner are $1000.  If you are interested in purchasing tickets, you can sign up here.

Northwest Weather Workshop

The big local weather workshop is less than a month away (March 4-5, Seattle).  If you are interested in attending, the agenda and registration information can be found here.  This gathering is the place to be if you want to learn more about local weather research and operations.

Comments:

  1. That server room looks like converted cube farm space.

  2. Cliff, all we really want to know is when there will be more fresh powder! We need a few more days of big mountain snow! See what you can do for us!

    Great blog by the way!

  3. The Cray® XC40™ is impressive indeed. I checked its specifications (http://www.cray.com/Assets/PDF/products/xc/CrayXC40_SoftwareEnvironment.pdf). The design scales to at least 200,000 processors. It runs the open source Linux operating system. I'm looking forward to any future reports you may publish on this.

  4. Excellent computer choice!

    I always think of Seymour Cray digging his tunnel in Chippewa Falls, Wisconsin.

  5. In other news, the Seattle Times says The Blob will return this year:

    http://www.seattletimes.com/seattle-news/environment/skimpy-skagit-salmon-run-blamed-on-blob-in-the-ocean/

    Thoughts?

  6. Ted G... I don't think anyone is saying the Blob is returning.

    However, the Blob conditions that prevailed when those years' young salmon went to sea have hurt them. Recall that salmon stay at sea for some number of years (it depends on the type), and bad conditions when the fingerlings first go to sea are crucial to how that run of salmon will look when it finally comes back to spawn.

    That can create an echo in future years: a low return rate in one year means fewer fingerlings, which produces another poor 'echo' run the next time around, perhaps for several cycles.

    The good news is that the salmon going to sea this year might do well. And if we shift to a La Nina period next year or the year after, as usually happens after a strong El Nino, things could get very good for salmon.

  7. Encouraging news. The one line that got my attention and qualifies as Cliff's understatement of the year: "I have strong opinions..."
    With a wink.


