Monday, February 11, 2013

The U.S. Weather Prediction Computer Gap

It happened again. 

A major storm hit the northeast U.S., and the U.S. global model lagged badly behind the predictions of the European Center for Medium Range Weather Forecasting (ECMWF). Just as with Sandy. To illustrate, take a look at the 120-hr forecasts of sea level pressure from the ECMWF and U.S. GFS models valid at 4 PM PST Friday, Feb. 8. First, the observed situation (colors: the winds at 850 hPa; solid lines: isobars, lines of constant pressure):

A deep low center right off the coast.  A major snow and wind threat.

And here is the 120-hr ECMWF forecast, clearly showing a major storm.


The U.S. GFS model for the same time? It predicted only a minor trough with little weather (see graphic below). Not good. The U.S. model was just as bad 108 hr out. Disappointing.


The National Weather Service's own statistics show that the American model had a substantial drop in skill globally during the critical period in question, with inferior performance (black line) compared to the European Center, the UKMET Office, the Canadian Meteorological Center, and even the U.S. Navy (see figure; closer to one is better).


As I have described in my previous blogs (including here and here), much of the inferiority of U.S. global numerical weather prediction can be traced to the third-rate operational computer resources available to the National Weather Service's (NWS) Environmental Modeling Center (EMC), an inferiority that can only be characterized as a national embarrassment. And as I shall document here, the NWS weather prediction computers are not only inferior to those of other national weather services, but also to NOAA's computers for weather research and to U.S. climate prediction machines. Be prepared to be shocked, angry, and disappointed. And to take action to change this situation.

Let's begin by comparing the most powerful weather prediction computers used by various countries around the world (see graphic below). Japan and ECMWF are the leaders with roughly 0.8-petaflop machines, followed by England (UKMET), S. Korea, and Canada. The U.S. is at the bottom of the barrel, with about TEN PERCENT of the capacity of the leaders.
Yes, we are talking about the richest nation in the world, and one of the most vulnerable to severe weather.

What makes this even worse is that the U.S. has such a large area (including Alaska, Hawaii, the U.S. mainland, Puerto Rico and the Virgin Islands).

Got your juices going yet?  You haven't seen anything!

Let's compare the computer power available for operational numerical weather prediction in the NWS to that available to its parent agency, NOAA, for weather research (see graphic). The NWS operational machine is dwarfed by the NOAA computers available for research. The new NOAA Fairmont machine is five times more powerful, and the NOAA Earth Systems Research Lab/Global Systems Division has TWO far bigger machines. So operational prediction, which saves lives and promotes the economy of the nation, gets crippled by lack of computer power, while researchers get the big machines. Folks, some administrators in NOAA are making very bad decisions. And their bosses in the Department of Commerce are going along with it.
Not steamed yet? Then take a look at a comparison between the U.S. operational weather prediction computer capacity and a few of the U.S. machines available for climate research (overwhelmingly used for long-term climate simulations). I did not list every major computer available for climate. Climate-dedicated computers absolutely dwarf operational weather prediction computer capacity, so much so that you can barely see the operational computer resource on this plot! Just considering the machines shown in the figure, climate simulation has about FIVE HUNDRED TIMES the computer power available for operational weather prediction.
Folks, this is outrageous. Weather prediction is critical for the U.S. economy and for public safety. And even if you are worried about climate change, the number one thing needed to encourage resilience in a changing climate is good weather prediction! What is the logic for giving climate research hundreds of times more computer power than weather prediction? It makes no sense from a rational viewpoint.

A big part of the problem is that NOAA management has decided to put priority on the oceans and climate while short-changing weather prediction. This has been a deliberate and long-term policy. Unfortunately, the U.S. Congress has not reined them in. The irony is that NOAA understands how important and popular the National Weather Service is and demands that NWS products have NOAA stamped all over them--while draining the critical resources the NWS requires to do a proper job.

The time to fix this self-inflicted problem is now. There is a great deal of money in the Hurricane Sandy relief bill for improving hurricane and storm prediction or storm-related infrastructure (over 100 million dollars). Some of this money should be used to fix the NWS computer deficiency (it would take about 50 million dollars). There is nothing that would more effectively improve hurricane prediction than dealing with the computer gap. We are not talking about increasing NWS computer capacity by 30%: it needs to be increased by 10-100 times to properly serve the nation. Consider that NOAA is planning on spending 44 million dollars to replace the wings on two hurricane hunter aircraft; the same money would revolutionize and greatly improve U.S. weather prediction if used on computers. If the Sandy money is not available, other funds should be found. Enhanced computing is absolutely core infrastructure for the U.S. and would pay for itself many, many times over. Many other nations understand this and have committed the necessary financial resources to secure bigger computers for their weather services.

NOAA folks have proven themselves unable to deal with this issue, so I recommend you contact your Congressman or Senator to complain. Only Congress has the clout to fix the situation. And only by your complaining and making this a major issue will Congress take it seriously.

23 comments:

Lucas Harris said...

Hi, Cliff. Two points:

- First, when you plot the capacity for Titan, is that the whole machine or just the allocation granted to NOAA? The machine is not a dedicated climate machine, but is also used for nuclear reactor design, superconductivity, and other DOE projects. Also, although we at GFDL (NOAA's climate modeling center) have been granted an allocation on Titan, we can't make much use of it! Many of the new upgrades to Titan are Graphics Processing Units (GPUs), which our current models cannot use. DOE has already expended a significant amount of resources to get one of NCAR's new climate models to run on a GPU system...and the GPU version is already lagging behind the scientific development of the traditional CPU version.

- Second, is it necessarily so that the existing GFS model could easily run at higher resolution simply by throwing a huge amount of extra processing power at it? GFS is a spectral model, which scales poorly to large numbers of CPUs. ECMWF's model, which is also spectral, has had a very large engineering effort put into making it run efficiently on larger numbers of processors, and even then it still does not scale well. (Newer models such as the NMM-B, the ESRL FIM, and the GFDL cubed-sphere global model are finite-difference or finite-volume models, and scale much better than GFS.)
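[Editorial aside] The scaling contrast in the comment above can be illustrated with a toy communication-cost model. Everything numeric below (grid size, halo width, latency, per-point transfer time) is an invented assumption for illustration, not a measurement of GFS or any real machine: a grid-point model exchanges only thin "halo" strips with its four neighbors each time step, while a spectral model must transpose its whole local share of the field among all processors for the spectral transforms.

```python
import math

# Toy communication-cost model (all numbers are illustrative assumptions).
# Grid-point models need only nearest-neighbor "halo" exchanges per step;
# spectral models need global transposes (all-to-all) for their transforms.

def comm_time_per_step(n, p, spectral, latency=2e-6, per_point=1e-9, halo=3):
    """Rough per-processor communication time for one time step on an
    n x n grid split across p processors."""
    if spectral:
        msgs = p - 1                          # all-to-all transpose partners
        pts = n * n / p                       # whole local share is moved
    else:
        msgs = 4                              # four nearest neighbors
        pts = 4 * halo * n / math.sqrt(p)     # halo-deep boundary strips
    return msgs * latency + pts * per_point

n = 4000  # hypothetical global grid dimension
for p in (64, 1024, 16384):
    fd = comm_time_per_step(n, p, spectral=False)
    sp = comm_time_per_step(n, p, spectral=True)
    print(f"P={p:6d}  grid-point {fd:.2e} s   spectral {sp:.2e} s")
```

Under these assumptions the grid-point exchange gets cheaper as processors are added, while the all-to-all cost of the spectral transposes grows with processor count, which is the scaling wall the comment describes.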

Mike Smith said...

Cliff,

I'm sure you don't intend it that way so I suggest you clarify that you are not against replacing the hurricane hunters' wings.

Given the crisis in the weather satellite program, the decommissioning of the profilers and the essential requirement to put aircraft into hurricanes, the best computer capability in the world won't matter without the data to initialize the models.

Best wishes, Mike

Benjamin said...

The question is, how do we get this U.S. Congress to take any matter of science seriously? There seems to be an ever-widening knowledge gap between the decision makers and the people who actually know what they're talking about. This is true in many areas of policy, not just emergency management and weather prediction.

Between now and election day, what can we possibly do to get folks who don't seem to care at all to give a damn? Cliff Mass for Congress?

bp1002 said...

One solution to the problem could be distributed computing. I would imagine that all of the desktops lying around Federal agencies could be bundled together into a pretty hefty supercomputer. We could get legislation to direct the IT departments of the agencies to distribute a software package that would utilize unused computing power, such as at night or the leftover capacity when an employee is only typing in Word. That software would accept work from NOAA to crunch numbers and send it back when it's done. I'm sure there are many logistical problems that would need to be worked out, but similar programs already exist for the SETI and Folding@home distributed computing projects.

Cliff Mass Weather Blog said...

Mike,
You are right. I am NOT saying that the NOAA P-3 aircraft don't need the wings. What I AM saying is that there are large amounts of money available to improve hurricane prediction, but little of it goes to improving the computers that are the core infrastructure for prediction. ECMWF does better, and their greater computer power is a major reason for this...cliff

Cliff Mass Weather Blog said...

BP
Distributed computation (like for SETI) does not work for numerical weather prediction...the communication issue is crippling....cliff
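[Editorial aside] A back-of-envelope sketch makes the communication issue concrete. The step count, exchange count, and latency figures below are all assumptions for illustration: SETI work units are independent, but weather-model subdomains must trade boundary data every time step, and internet latencies between volunteer machines are thousands of times higher than a supercomputer interconnect.

```python
# Back-of-envelope sketch of why SETI-style volunteer computing fails for
# numerical weather prediction. All numbers below are assumptions for
# illustration, not measurements.

def latency_overhead_s(steps, exchanges_per_step, latency_s):
    """Minimum time spent purely waiting on message latency."""
    return steps * exchanges_per_step * latency_s

STEPS = 10_000   # time steps in a multi-day global forecast (assumed)
EXCHANGES = 2    # boundary exchanges per step (assumed)

for name, lat in [("supercomputer interconnect (~2 microseconds)", 2e-6),
                  ("internet between volunteer PCs (~50 ms)", 50e-3)]:
    wait = latency_overhead_s(STEPS, EXCHANGES, lat)
    print(f"{name}: at least {wait:,.2f} s of pure waiting per forecast")
```

Under these assumptions the internet case burns on the order of a thousand seconds just waiting on latency, before any data actually moves, for a product that must be delivered on a tight operational clock.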

JeffB said...

Cry me an atmospheric river. For one, the government is bankrupt and for two, all the money is going to worthless climate fear studies. Our government does not take the practical and useful seriously over hysteria laden future predictions. So why should any of her citizens?

If you want change, then campaign to end the funding of climate hysteria. And that'll take guts because some of the biggest beneficiaries of useless climate $$$$ are your colleagues in academia who immorally take those grants to study useless minutia.

richard583 said...

From CA, I've just sent both of my state's Senators an invitation to review what you've outlined here, focusing on the idea, professor.

An online petition might also be a decent idea for this broader theme and problem.

Sysiphus said...

That a model developed and run by the EU is outdoing our own models on forecasts for our own country is pathetic and speaks volumes about our national priorities. We are rapidly becoming a country driven by superstition and second rate technology. How long until we become a second rate country?

Dan Satterfield said...

Cliff,
I'd love to share this blog post for the American Geophysical Union Blogosphere. Would include a link back prominently as well. Excellent post!

Dan
ps you can reply here or email me at dannysatterfield **at ** mac dotcom

RyanS said...

Hi Cliff,

You highlight 2 events where the ECMWF performed better than the GFS. Is there other data that indicates that this is a systemic problem and that the ECMWF consistently outperforms the GFS?

Assuming that there is, how do we know that it's computing resources that are the problem, and not the overall quality of the model, or the quality of the input data (or other factors I'm not thinking of)?

I'm a mere amateur, but would love to read more in depth writing on the topic.

Thanks,
Ryan

Cliff Mass Weather Blog said...

Ryan,
Yes...there is extensive data (check the NCEP verification web page, among others) showing that this is a consistent problem, well known in the field. We know that computer power is a big issue here. For example, ECMWF has been able to use 4DVar data assimilation because they have the computer power, and lots of research has shown the value of resolution (which also requires computer power). ...cliff
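[Editorial aside] For readers wanting a feel for what 4DVar does: it searches for the initial state that best fits both a prior ("background") forecast and observations spread across a time window, by minimizing a cost function with the forecast model run forward inside it. Here is a toy scalar sketch; the model coefficient, variances, and observations are all invented, and real systems minimize over hundreds of millions of variables, which is why the method is so computationally demanding.

```python
# Toy scalar 4DVar sketch (all numbers invented for illustration).
# Model dynamics: x_{t+1} = a * x_t.  We seek the initial condition x0
# that best fits a background guess xb and observations y_t over a window.

a = 0.9                      # assumed model dynamics coefficient
xb, b_var = 1.0, 0.5         # background (prior forecast) and its variance
obs = {2: 0.95, 5: 0.55}     # time -> observed value (made up)
r_var = 0.1                  # observation error variance

def cost(x0):
    """4DVar cost: background misfit plus misfit to each observation,
    with the model integrated forward to each observation time."""
    j = (x0 - xb) ** 2 / b_var
    for t, y in obs.items():
        j += (x0 * a ** t - y) ** 2 / r_var
    return j

# Brute-force the minimum of the (quadratic) cost over a fine grid;
# real systems use iterative minimization with adjoint models instead.
analysis = min((cost(0.001 * i), 0.001 * i) for i in range(2001))[1]
print(f"analysis x0 = {analysis:.3f}")
```

In an operational system x0 is a global field with ~10^8 components and each cost evaluation is itself a model integration, which is the sense in which 4DVar demands serious computer power.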

Henk said...

Why not merge those two into one worldwide prediction center?

It's better and cheaper.

Regards

HBP

Random Menace said...

Why can't we just use the ECMWF model? If it's produced regularly, and covers the US, why do we need to have our own forecast model?

Scott Mackaro said...

Cliff,

Thanks for continuing to put this topic into the minds of the community. I agree that computing is a big problem, especially on the data assimilation front. With that said, it isn't the only problem. Putting a bad driver in a faster car doesn't make them a better driver!

All aspects of our modeling systems need to be updated. This includes better data management, data quality, data assimilation systems, and the model itself.

I for one do not understand why weather prediction isn't a matter of national security and funded as such.

Rose Doctor said...

Cliff,
To put some dimensions on JeffB's comments above: according to the Government Accountability Office (reported in Forbes online), Federal spending on "climate change" (read: global warming) in 2010 was $8.8 billion. That's about $24 million per day.

Gpacharlie said...

I emailed my congressional representative regarding this concern. My REP is Dave Reichert. Please email your REP or call or write a letter and encourage others to do the same.

counters said...

JeffB and Rose Doctor,

To be fair, Dr. Mass has argued not that we should throw money at NWP in the United States, but that we should leverage funding which has already been allocated to tackle the problem. For instance, in a blog post a short while ago, he argued that smart investment of the money earmarked to NOAA as part of the Hurricane Sandy relief package could go a very long way toward retrofitting our computational capacity. Elsewhere, he (and the National Research Council) have pointed out that even just a reorganization of research activities supporting NWP into a consolidated, translational research-to-operations pipeline could help fix the problem as well.

The $8.8 billion figure for "climate change activities" comes from a GAO study (synopsis here - http://www.gao.gov/assets/320/318570.pdf; report here - http://www.gao.gov/assets/320/318556.pdf). You need to be careful to distinguish how that money is partitioned. "Technology" funding in the report is specifically earmarked toward "[that] which includes the research, development, and deployment of technologies and processes to reduce greenhouse gas emissions or increase energy efficiency." As you might imagine, a significant portion of this is driven into the innovation policy system in the US. Figure 1 of the report illustrates that "science" funding - the money invested in basic research - has remained stagnant at ~$2 billion/year for nearly two decades. It's not easy to identify how that money is spent, but I'd imagine it's distributed over a much larger body of research than you'd expect.

TimS said...

Hi Cliff,

I recently discovered your blog and am really enjoying it. As you mention, it is both a computer resources problem and a problem of improving numerical methods; we will need both for significant improvement to occur. I also wonder if the mission of NCEP has to be carefully defined, as I know that ECMWF is able to focus on the mid- to longer-term forecast while individual countries can use their models for the 24-72 hour forecast.
Finally, the idea that our climate research colleagues are immoral is a contemptible comment. We are not in a contest with critical climate research. Rather, Congress needs to both fully fund weather prediction AND create a climate service so we can focus our federal and academic resources on all the important questions that bear on a successful prediction capability. Tim

João da Silva said...

INPE's (National Institute For Space Research) CPTEC has the Tupã, a Cray XE6 258 Teraflops machine.
http://supercomputacao.inpe.br/recursos2
CPTEC does research and also runs global forecast and climatic models.

Ministry of Agriculture's INMET has their own SGI 55.7 Tflops machine.
http://www.inmet.gov.br/portal/index.php?r=noticia/visualizarNoticia&id=45
INMET is a Regional Meteorological Center.

Brazil isn't fully self-sufficient, though, as it relies on GFS analyses to initialize its models.

Barry Goldsmith said...

On one hand, this is an important issue that needs to be addressed. On another, anyone believing that ECMWF is the platinum standard had better take a dose of "caution" medicine. Ladies and Gentlemen...I give you the ECMWF "Hurricane" Debby forecast from June 2012. http://www.nhc.noaa.gov/archive/2012/al04/al042012.discus.005.shtml?

David Lyman said...

Cliff, does not PassageWeather.com get its information from European sources? Are the models from Japan, the UK, and other well-equipped nations available to us in the USA? If so, why were our boys not watching what the Brits were predicting when Sandy developed?
Is weather prediction competitive and copyrighted? Is it a commodity to be protected and sold, or is it free for all who need it?
Capt. David

Tom said...

To those worrying about the $8.8 billion dollar expenditure on climate research, keep in mind our budget on defense spending is larger than countries 2-9 in defense spending combined.

There are dollars that can be reallocated, wisely, while other buckets get slashed from our "bankrupt" government. Considering storm preparation and response are two key, and misunderstood by our leaders, parts of maintaining an effective national defense, one could use these dollars from defense spending without developing some high-tech weaponry that may or may not even be used.

This then leads down the inevitable path of why are we bankrupt...and that's a whole other conversation for another time...and one I would hope would be held without clinging to partisan talking points.