Wednesday, April 30, 2014

The U.S. Slips to Fourth Place in Global Weather Prediction, While a New Weather Service Supercomputer Has Not Been Ordered

It is with considerable disappointment that I note that the U.S. has now slid into fourth place in global weather prediction.  

Yes, the country that invented numerical weather prediction and the one that possesses the largest weather research community in the world is moving further back in the pack, with substantial costs to the American people.  And frustratingly, a powerful new weather supercomputer,  funded over a year ago by the U.S. Congress, has not even been ordered, even though it could radically improve U.S. operational weather prediction.

But first, the statistics, available from the National Weather Service's own verification site (the NWS modelers are not hiding anything; give them credit for that!).  Here is an evaluation of the skill of the major global weather forecasting centers for Northern Hemisphere prediction at 5 days for the 500 hPa level (about 18,000 ft up).  The graphic shows the anomaly correlation (how well the forecasts correlate with an analysis; 100% is best) over the past month.  The order, from best to worst:

1.  European Center:  .919
2.  UKMET Office:  .896
3.  Canadian Meteorological Center:  .881
4.  US GFS: .871

You will notice the European Center (red triangle) is uniformly very skillful and rarely has large drop-outs, periods when skill drops precipitously.  But the U.S. National Weather Service's model, the GFS, has major drop-outs with substantial loss of skill.  Such periods lead to far less accurate forecasts compared to the European Center.
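For readers curious what the anomaly correlation scores above actually measure, here is a minimal sketch in Python. The toy fields and numbers below are purely illustrative, not the NWS verification code; the idea is simply the correlation of forecast and analysis departures from climatology.

```python
import numpy as np

def anomaly_correlation(forecast, analysis, climatology):
    """Correlation of forecast and analysis anomalies (departures from
    climatology); 1.0 means the forecast's anomaly pattern matches the
    verifying analysis perfectly."""
    f_anom = forecast - climatology
    a_anom = analysis - climatology
    num = np.sum(f_anom * a_anom)
    den = np.sqrt(np.sum(f_anom**2) * np.sum(a_anom**2))
    return num / den

# Toy 500 hPa height fields (meters) on a tiny 4x4 grid
clim = np.full((4, 4), 5500.0)
analysis = clim + np.random.default_rng(0).normal(0, 50, (4, 4))
# A decent forecast: close to the analysis, with modest error
forecast = analysis + np.random.default_rng(1).normal(0, 20, (4, 4))

print(round(anomaly_correlation(forecast, analysis, clim), 3))
```

A forecast with small errors relative to the size of the anomalies scores near 1.0, which is why the centers above are separated by only a few hundredths even though the practical forecast differences are substantial.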

Here is another way of viewing the 500 hPa skill, this time for the entire globe and for various forecast projections.  This graphic shows the relative skill of other centers compared to the U.S. GFS model. Positive numbers indicate better skill than the U.S., negative worse.  The red line is the European Center...much better at nearly all times. The yellow...the UKMET Office...better for all available times (through 144 h).  Our hockey-loving friends to the north (green line), better through 160 hr.


A major reason why the U.S. is falling behind is that the other centers are using far more advanced data assimilation or higher resolution, both of which require very substantial computer power, which the U.S. National Weather Service has been lacking.

But the cost of inadequate computer power extends to local forecasting as well.  Big, damaging thunderstorms have been in the headlines for days now.  It is no secret what needs to be done to improve these forecasts: we need a convection-resolving ensemble system, running 50-80 model runs at around 2-3 km resolution.  National Academy committees have recommended this.  Several workshops have recommended this (I know; I chaired two of them, and was on one of the National Academy committees).  My colleagues in the NWS admit that they need to do this.
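A rough back-of-the-envelope calculation shows why such an ensemble demands serious computer power. The starting grid spacing and member count below are my own illustrative assumptions, not official NWS figures; the scaling rule, however, is standard: halving the grid spacing roughly multiplies cost by eight (two horizontal dimensions plus a proportionally shorter time step), and every ensemble member multiplies it again.

```python
# Illustrative cost scaling for a convection-resolving ensemble.
# All specific numbers here are assumptions for the arithmetic.

current_dx_km = 12.0   # assumed grid spacing of a coarser deterministic run
target_dx_km = 3.0     # convection-resolving grid spacing
members = 60           # mid-range of the 50-80 member ensemble

# Cost scales roughly as (dx_old / dx_new)^3: refining both horizontal
# dimensions, plus a shorter time step required for numerical stability.
per_member_factor = (current_dx_km / target_dx_km) ** 3
ensemble_factor = per_member_factor * members

print(f"one {target_dx_km:.0f} km member: ~{per_member_factor:.0f}x the cost")
print(f"{members}-member ensemble: ~{ensemble_factor:.0f}x the cost")
```

Even under these generous assumptions, the ensemble costs thousands of times more than a single coarse run, which is exactly the kind of jump a 2-3 petaflop machine makes feasible.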

Forecasting severe convection requires state-of-the-art
high-resolution ensemble forecasting systems.

But the National Weather Service hasn't had the computer power to make this next step towards high-resolution prediction over the U.S. a reality.

But here is the amazing part.  A year ago, Congress gave the National Weather Service the money for a very large, 2-3 petaflop weather supercomputer.  A computer that, properly used, could greatly improve skill in predicting both global and local weather.  To quote Louis Uccellini, the head of the National Weather Service, "a real game changer."

But now, a year later, NOAA has still not ordered the computer (the National Weather Service is part of NOAA).  Why?  Because they made the mistake of signing a very long contract with IBM, and IBM sold its server division to Lenovo, a Chinese company.  And the U.S. is nervous about having its weather computer provided by the Chinese.

High-resolution (1-3 km grid spacing) convective forecasting, using advanced numerical models in an ensemble setting, has the potential to greatly improve severe storm forecasting in the U.S.  Unfortunately, the National Weather Service lacks the computer power to do so.

There seems to be no leadership at NOAA to quickly resolve this issue.  And there are many solutions. For example, IBM could buy a machine from Cray, an American company; in fact, Cray recently sold new machines to the European Center and the German weather service.  Or the government could use the Lenovo sale to void the contract with IBM and shop elsewhere (like Cray).  Or they could go ahead with the IBM deal, with full knowledge that there is little reason for the Chinese to mess with our weather forecast models, particularly since a number of Chinese operational entities use our GFS.


Would our military be content with the 4th-best hardware when dealing with threats to national security?  I doubt it.  If we had inferior planes or ships, citizens and congressmen would be screaming from the rooftops.  And we know that the NSA or CIA would not tolerate second-rate computers, yet for weather prediction it is apparently acceptable.  Better weather prediction is our first line of defense against extreme weather, and I thought this administration was worried about global-warming-induced extreme weather.

Hundreds, if not thousands, of Americans are dying of weather-related threats, and trillions of dollars of our economic activity is weather-sensitive, yet we are content with a capacity far inferior to state-of-the-art weather forecasting.  U.S. companies pay the European Center millions of dollars a year to secure the world's best forecasts, a national embarrassment.

Today, Kathryn Sullivan, head of NOAA, will be testifying in front of Congress.  Our representatives should ask her how she will solve the NWS computer acquisition problem, quickly. 

6 comments:

Eric said...
This comment has been removed by the author.
Tim Kirby said...

Since the day is past, was there anything useful in Ms. Sullivan's testimony?

clive boulton said...

NOAA could order a D-Wave system, as NASA, Google, and Lockheed Martin have from the Burnaby, BC-based company. The delay could be a blessing, a chance to leapfrog in scientific computing power.

One D-Wave processor runs about 35,500 times faster than an Intel Xeon E5.

https://www.flickr.com/photos/jurvetson/12369089904/

http://www.dwavesys.com/

Victor said...

@clive boulton: getting a new machine is not as easy as it sounds. Supercomputers are actually many smaller computers interconnected on the same network.
If the network topology is different, the code has to be modified to stay fast.
If the processors differ in their instruction sets or even cache sizes, optimized code needs rewriting.
If you are considering heterogeneous computing with GPUs or Xeon Phis, the whole thing has to be rewritten. That's not an easy task, as GPU code and algorithms differ greatly from CPU code. Also, I don't know how weather codes behave, but I don't think they would gain much from GPUs.

Now with quantum computing, I don't think there is any quantum algorithm for PDE solving or whatever numerical weather prediction uses, and I guess there aren't many people capable of developing such a thing.

SteveT said...

Hey Cliff: It's curious that you mentioned how the military would not be content with the 4th best hardware in dealing with threats to national security.

Wrong: the only other U.S. operational global weather prediction center is FNMOC. It too suffers from inadequate computational resources (among other things), and its global model is consistently worse than the GFS. That translates to a potential disadvantage against threats, e.g., from China, when it comes to weather-dependent decisions, strategies, etc.

CNY Roger said...

There is an unintended consequence of CAGW funding exemplified by this story: "The new study was among the first conducted on the new 1.5-petaflop Yellowstone supercomputer. The IBM system, operated by NCAR and supported by funding from the NSF and the University of Wyoming, is one of the world’s most powerful computers specifically dedicated to research in the atmospheric and related sciences." http://www2.ucar.edu/atmosnews/news/11540/climate-change-threatens-worsen-us-ozone-pollution

Not surprisingly, there are limited funds, so not everything can get funded. But the crisis of CAGW manages to find funding for a study that quantifies conclusions that are obvious. Back-of-the-envelope calculations would agree that, all things being equal, warmer temperatures lead to more ozone air pollution and that reductions of precursor emissions will reduce ambient concentrations. All this study did was quantify those conclusions a bit better, maybe.

While I cannot provide a reference I am positive that EPA and State ozone air quality modeling is not using as advanced a computer to determine how best to implement reductions of precursor emissions to reduce ambient concentrations.

Sadly, funding for better computers for weather forecasting and air quality modeling isn't provided. Both projects could deliver immediate, tangible benefits, yet they wither while the CAGW funding stream supports obvious and speculative research.