Wednesday, February 27, 2013

Eastern Washington Drought?

The last two months have been extraordinarily dry east of the Cascade crest, even for the sagebrush-laden, semi-arid zone extending into the Columbia Basin.  To explore this, let's start with the precipitation during the past 60 days (see image below).  Some locations on the eastern slopes have gotten less than 0.1 inches during the period.
The percentage of normal precipitation shows a similar story: many folks on the east side had less than 25% of normal.  You can see that much of the rest of the state was fairly dry...but not that dry.
 Let's plot the precipitation at Yakima for the past twelve weeks.  Essentially nothing since early January.
 Pasco was a bit wetter, but not by much.
So, how serious is this "drought?"  The answer has a lot of economic implications for eastern Washington crops, particularly the dry land farmers of the Palouse.  Last year they made out like bandits, as drought in the Midwest and Plains caused crop prices to skyrocket, while eastern Washington farmers had excellent crops.

The National Weather Service has a "drought monitor" website that summarizes the sub-surface water situation (see graphic).  No drought in eastern Washington (white color), while much of the southwest  and Plains States were dry.  It turns out that although the past two months have been dry in eastern Washington, the last year has been near normal.


To illustrate this fact, here is the observed and normal precipitation at Pasco. 
Basically, a wet fall balanced out a dry winter.
Pretty much the same story at Yakima:
Snow over the Great Plains has put a small dent in the severe drought in the nation's midsection, but at this point I suspect that eastern Washington farmers will again have a highly profitable year while their counterparts in the Great Plains and Midwest suffer from the effects of drought.  At least that is what the National Weather Service is predicting:

Reminder:  the NW Weather Workshop, which is open to all, starts on Friday.  To see the agenda and to register go to http://www.atmos.washington.edu/pnww


Monday, February 25, 2013

Second-Rate U.S. Numerical Weather Prediction: Why You Should Care


In several of my previous blogs I noted that U.S. numerical weather prediction is lagging behind the European Center and others--a diagnosis pretty much universally accepted in my field.  I listed some of the reasons:  inferior computers, poor management, lack of effective leadership, inability to tap the large U.S. weather research community, and others.

But some highly placed folks in the National Weather Service (NWS) and elsewhere have argued that U.S. inferiority in numerical weather prediction really doesn’t matter, since U.S. government forecasters have access to the superior forecasts of the models of the European Center (EC), the UK Met Office, and others.   As a prime example, they noted that EC model forecasts played a major role in NWS warnings in the week before Hurricane Sandy made landfall.  Some European Center officials have said the same thing--the current situation is fine!  Weather prediction is a global entity, so why be concerned if they are ahead?

In this blog, I will take issue with these arguments and will suggest that first-rate numerical weather prediction by the U.S. National Weather Service is crucial for the nation and of great benefit to the entire world.  In fact, it is one of the most cost-effective investments our nation could make.

Let me make a few points.

Point 1:  The U.S. has the potential to be far superior to the European Center and others.

Forget the defeatist rhetoric.  The U.S. has by far the largest meteorological research establishment in the world.  We spend more money on weather research than any other nation.   Why should anyone believe that the best we can do is to follow or equal the EC?  As a numerical modeler myself I firmly believe that our global model could far exceed the performance of the rather conservative EC effort.   To go technical on you for a second, I believe that using ensemble-based data assimilation (in space and time), far better model physics, higher resolution, and better use of observational assets, we could produce vastly improved forecasts, far superior to the EC, with huge economic and safety benefits to the nation.

To put it another way:  think of the U.S. and the European Center weather prediction efforts as two cars.  One (the U.S.) has a much bigger engine (knowledge and research base) than the other.   But the big-engine car has a very inefficient transmission (and other deficits) and ends up going slower than the small-engined competition (EC).  Yes, we can hitch a ride with the Europeans, but our car could leave them in the dust, if only we had the will to do so.
The U.S. governmental weather prediction effort looks like this...a very big research engine, without the ability to harness the U.S.'s massive research power to move our weather prediction forward rapidly.  The guy in the red pants is a NOAA bureaucrat.

Point 2:  U.S. national and regional prediction is a shadow of where it should be.

A lot of the discussion in this blog and elsewhere has been about the inferiority of the U.S. global model (the GFS) to the global models of others (EC and UKMET).   But as important as the global models are, they are only part of the story.  The lack of computer power and the poor coordination between the research and operational weather communities in the U.S. have crippled our ability to move toward the high-resolution weather prediction capability that we know represents the future:  probabilistic prediction.  And remember, the EC only does global modeling--they are not interested in high-resolution prediction over the U.S.

It is clear that the future of weather prediction will be to forecast probabilistically for all parameters, with the essential infrastructure being high-resolution ensembles (meaning we will run our forecast models MANY times--say 25-100 times--using different model starting points and model physics).  A number of National Academy study groups have recommended this approach (I have been a member of several of them!) and noted that such ensembles must have high enough resolution (2-4 km grid spacing) to resolve convection (thunderstorms).   You want to predict major convective outbreaks, like the UNFORECAST derecho (a strong convective system with powerful winds) that hit the northeast U.S. last June?  You need this capability.  But the NWS Environmental Modeling Center (EMC) does not have this critical capability because they don't have the computer resources, among other reasons.
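To make the ensemble idea concrete, here is a minimal sketch (with invented numbers, not actual NWS or EC output) of how a probabilistic forecast falls out of an ensemble: the forecast probability of an event is simply the fraction of members that predict it.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical 24-h precipitation totals (mm) at one grid point from a
# 50-member high-resolution ensemble.  These values are made up purely
# for illustration.
members = rng.gamma(shape=0.8, scale=6.0, size=50)

# A probabilistic product is just a member frequency: the fraction of
# members exceeding a threshold becomes the forecast probability.
for threshold_mm in (1.0, 10.0, 25.0):
    prob = np.mean(members > threshold_mm)
    print(f"P(precip > {threshold_mm:4.1f} mm) = {prob:.0%}")
```

With 2-4 km grid spacing and 25-100 members, the same frequency counting can be done for thunderstorm winds, snowfall amounts, or any other parameter at every grid point.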

In fact, they don't have the computer power to run even current generation weather technology.  For example, the NOAA Earth System Research Lab (ESRL) has developed a new high-resolution prediction system called HRRR (High Resolution Rapid Refresh) that was able to predict the powerful derecho hours before (see graphic).


In contrast, the NWS EMC model (NAM) failed (see graphic below).  The NWS EMC does not have the computer power today to run HRRR operationally.
The NWS NAM model missed this important event 12h before

Point 3:  U.S. modeling inferiority is costing the U.S. private sector big bucks and denying real-time access to the U.S. research community.

U.S. business and governmental interests, such as weather prediction firms and utilities, need the best forecasts, with even modest differences in skill having big financial implications.  Thus, a number of companies and U.S. entities are paying hundreds of thousands of dollars EACH to get European Center model output.  Yes, we are talking about millions of dollars being used to support the EC modeling effort.  The Europeans have a different financial model than the U.S. National Weather Service, one patterned after ancient empires:  they require financial tribute from those wishing the best meteorological "protection."  The U.S., to its credit, provides model output for free, which not only fosters commerce, but assists nations and researchers all over the world.  But what we give away is clearly not as good as the EC's global model, thus allowing the rest of the world to help support their modeling efforts.

Even the U.S. research community is required to pay for real-time access to EC grids.  A long time ago I was able to get EC grids because I was working with an EC researcher.  The EC bureaucrats caught this infraction, cut off the grids, and offered me a "deep discount" rate of $50,000 a year!  My research using the superior EC grids (which required real-time access) was over.  And my situation is repeated many times over across the rest of the U.S. weather research community.

Epilogue

This weekend I went to a talk by a Stanford political scientist who noted that great nations generally don't fail from external threats--rather, they weaken from within.  They get lazy, inefficient, lose their edge, and start making bad decisions.   U.S. operational numerical weather prediction is a prime example of a nation resting on its laurels and falling behind.  We invented numerical weather prediction.  Most scientific and technical advances in weather prediction have occurred here and STILL DO.  Our research community is still the largest and most capable.  And with all of that, we have lost leadership and have fallen well back into the pack.  The cost is not just in prestige, but also in a weakening of our nation's economic prowess and a needless jeopardizing of life and limb.

As explained in my previous blogs, the route to fixing U.S. numerical weather prediction is clear, including:

(1) Secure sufficient computer power, either by redirecting some of the huge computer resources acquired for climate prediction or by using new funds (like money already appropriated in the Hurricane Sandy relief bill).

(2)  Integrate numerical weather prediction research and operations, including replacing the ineffective division between the NWS EMC and the NOAA labs.  NOAA must dedicate enough extramural research funds to entrain the U.S. research community.

(3) Establish effective leadership in NOAA, including a clear vision of where weather prediction in the U.S. is going.

There is a real cost to inferior U.S. numerical weather prediction, and I am confident the numbers are easily in the billions of dollars.  We have let a crucial piece of the technological infrastructure of our nation weaken and atrophy.  We must take a new course.

Saturday, February 23, 2013

Radar, Wind, and Snow

The front has gone by, the winds have lessened, and heavy snow is falling on the western slopes of the Cascades.    But first, let me answer a frequently asked question:  what is the persistent echo seen just offshore of Hoquiam in many images from the Langley Hill radar (see below)?   This half-circle pattern shows up again and again in the lower radar scans.    The answer: the lower portion of the radar beam is hitting the surface and reflecting back to the radar, which is located a few miles northwest of Hoquiam.   Such a phenomenon is known as ocean or sea clutter.   The image below is from the half-degree elevation angle scan of the Langley radar, which means the center of the beam is a half-degree above the horizontal.  The width of the beam is roughly 1 degree, so the lower portion of the beam skims the surface in the vicinity of the radar.

Some of you might ask, doesn't the radar also hit the ground and terrain?  Why don't we see a return there as well?  The answer is that you would, except that the radar has ground clutter suppression software that subtracts those returns out.   Land and mountains stay put, so the radar can be trained to ignore returns from them. Not so easy with an ocean surface of waves and swell.  We knew we would get some sea clutter with the new radar, but it is a small price to pay for the capability to see farther out into the Pacific.  In fact, due to the intercession of Senator Cantwell, our radar has the capability to scan lower than any other NWS radar in the country:  0.15 degrees.  Yes, more sea clutter, but more offshore range.

Let's compare the radar imagery at 0.15, 0.5, and 1.5 degree elevation angles this morning (7:02 AM).  First, 0.15 degrees: lots of sea clutter, but you can see the shallow convective showers way offshore.



0.5 degrees--the range pulls in a bit.

1.5 degrees:  no sea clutter, but the horizontal range is much less.

With taller targets, the 0.15-degree elevation angle can see 300 km offshore or more.
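For the geometrically curious, a short sketch using the standard 4/3-earth-radius beam propagation model (radar altitude and beam path are simplified assumptions here) shows why a lower elevation angle buys extra offshore range: the beam center stays lower, so shallower echoes remain visible farther out.

```python
import math

def beam_center_height_km(range_km, elev_deg, radar_alt_km=0.0):
    """Height of the radar beam center above the surface, using the
    standard 4/3-earth-radius model for atmospheric refraction."""
    r_eff = (4.0 / 3.0) * 6371.0  # effective earth radius, km
    theta = math.radians(elev_deg)
    h = math.sqrt(range_km**2 + r_eff**2
                  + 2.0 * range_km * r_eff * math.sin(theta)) - r_eff
    return h + radar_alt_km

# Compare the lowest Langley Hill scan (0.15 deg) with a standard
# 0.5-deg scan at a few offshore ranges.
for r in (100, 200, 300):
    h_low = beam_center_height_km(r, 0.15)
    h_std = beam_center_height_km(r, 0.5)
    print(f"{r:3d} km: 0.15 deg -> {h_low:4.1f} km   0.5 deg -> {h_std:4.1f} km")
```

Even at 0.15 degrees the beam center is several kilometers up by 300 km range, which is why only taller targets (deep showers) are seen that far offshore.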

This radar image also shows weak convective showers over the ocean in the cold, unstable air and substantial enhancement of the precipitation as the air is forced to rise by the Olympics and coastal mountains.

Finally, the winds.   The vigorous cold front and the strong northwesterlies that followed produced some strong winds over the region--but nothing truly damaging or exciting.  Here are the max winds (mph) during the past 24 hr.  Gusts above 50 mph in and downwind of the Strait and some 40s over the Sound.  Notice how quickly the winds weakened over land.


 And yes, snow.  This kind of cool, unstable, northwest flow pattern is great for snow:  expect snow totals of 1-2 feet above 3500 feet.  The mountains needed some fresh snow.   Skiers will be happy.

Reminder:  if any of you want to attend the Northwest Weather Workshop on March 1-2 (next weekend) in Seattle, you can get more information and register here.

Wednesday, February 20, 2013

Weather Returns to the Northwest

I knew it was a mistake to blog about boring weather....

For those of us who have forgotten what Northwest weather can bring, we are about to get a reminder.

But before I talk about that, let me invite you to the Northwest Weather Workshop, which will be taking place in a little over a week (March 1-2).  This is the big local meeting for those interested in Northwest weather, with a wide variety of talks, most of which are accessible to those who don't have a formal meteorological background.  It starts at 1 PM on Friday, March 1 and ends Saturday around 3 PM.  We have a very nice banquet at Talaris Conference Center in Seattle, with a special speaker (Susan Joslyn, a UW psychology professor who specializes in how weather information is communicated.)   More information is found here.   You need to register if you would like to go!

 Now the coming weather...

The upper level ridge that has been over or just east of us has faded, and strong flow from the west is now headed towards our region.  To illustrate, here is the upper level (500 hPa, roughly 18,000 ft) flow (solid lines are the heights of the 500 hPa pressure surface) for Friday morning.  Winds are nearly parallel to the lines and are stronger where the lines are closer together.  A series of disturbances is lined up to move into our region.

In concert with the approaching disturbances, this strong incoming flow (also known as the jet stream) will be bringing a plume of very moist air, moisture that was injected northward over the western Pacific.  Here is the forecast moisture distribution on Friday morning.  Red indicates high values.

Expect plenty of rain and snow.  Here are the total precipitation and snowfall for the 72 hr beginning Wed. morning at 4 AM: 2-5 inches over most of the windward (western) slopes of the coastal mountains and Cascades.   Yes...not a record atmospheric river...but a good soaker.

Temperatures will be cool enough that a lot of this precipitation will be snow in the mountains.  We are talking about 2-3 feet of snow, particularly above roughly 3500 ft.

And winds?  Expect strong winds over the eastern Pacific and Northwest Washington on Friday as a strong low moves to our north.  (see map of sustained winds below).  30 kts plus sustained winds along and off the coast. Considerably stronger gusts.

And you like waves?  With a strong weather system on Friday and powerful NW winds behind it, there will be substantial waves on the ocean beaches (see graphic valid Saturday morning).  Yellow indicates waves of 8-9 meters (25-30 ft).   Good wave watching on Saturday afternoon at Westport.

So enjoy the active weather...nothing that will cause a lot of damage, but enough to remind us that Northwest weather can be fun.

Monday, February 18, 2013

Misunderstood Probability

Probability of precipitation is given on virtually every weather forecast, but many people don't really understand what it is and where it comes from.

Let's test your knowledge of this important weather concept.

A probability of precipitation of 30% means:

(1)  One expects precipitation for 30% of the time.
(2)  One expects precipitation over 30% of the area.
(3)   There is a 30% chance of precipitation at some point over a particular period.
(4)  That it will precipitate 30% of the time over 30% of the area over a particular period.

Time's up.  Write it down.  The answer is number three. Probabilities are always given for a point in space over a standard period (most frequently over 12 hr time chunks).  So at that location over the specified period for similar weather conditions, we would expect it to precipitate 3 out of 10 times.
If you got it wrong don't worry about it....a lot of people do.   Consider the study by some European scientists in the journal Risk Analysis (found here).  They questioned folks in Amsterdam, Athens, Berlin, Milan, and New York about what a “30% chance of rain tomorrow” meant.  Only in New York did the majority of the survey group supply the correct answer.  You have got to respect those New Yorkers; they walk and talk fast, but boy do they know their probabilities.  In the European cities, the preferred (and incorrect) interpretations were that it will rain tomorrow “30% of the time,” followed by “in 30% of the area.”   So much for European sophistication.

Does being precipitation soaked give Northwesterners  more insight into precipitation probabilities?   University of Washington psychology professor and expert in weather information interpretation, Dr. Susan Joslyn, has completed several studies encompassing hundreds of students to answer this very question.   As described in an article in the Bulletin of the American Meteorological Society (found here), she and her colleagues found that only roughly 50% of the sodden UW students got the right answer.  Disappointing!
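If you like seeing the definition in action, here is a tiny simulation (all numbers invented) of the correct point-frequency interpretation: over many days on which a particular location carries a 30% forecast, precipitation occurs there on about 30% of them.

```python
import random

random.seed(1)

# Simulate 10,000 days on which the forecast at one location was
# "30% chance of precipitation."  The numbers are illustrative only.
pop = 0.30
days = 10_000
rained = sum(random.random() < pop for _ in range(days))

# The correct reading: over many similar days, precipitation occurs at
# that point on about 30% of them -- not 30% of the time, and not over
# 30% of the area.
print(f"Rained on {rained} of {days} days ({rained / days:.1%})")
```

A perfectly calibrated forecaster is one for whom this frequency check works out at every probability level.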
One more interesting question.  When was the term probability first used in a weather forecast?  The amazing answer:  the very first forecast!

Following the signing by President Ulysses S. Grant of an authorization to establish a system of weather observations and warnings of approaching storms, on February 19, 1871, Cleveland Abbe issued the first “official” public Weather Synopsis and Probabilities based on observations taken at 7:35 a.m. that day: 

"Synopsis for past twenty-four hours; the barometric pressure had diminished in the southern and Gulf states this morning; it has remained nearly stationary on the Lakes. A decided diminution has appeared unannounced in Missouri accompanied with a rapid rise in the thermometer which is felt as far east as Cincinnati; the barometer in Missouri is about four-tenths of an inch lower than on Erie and on the Gulf. Fresh north and west winds are prevailing in the north; southerly winds in the south. Probabilities: it is probable that the low pressure in Missouri will make itself felt decidedly tomorrow with northerly winds and clouds on the Lakes, and brisk southerly winds on the Gulf."

For his insistence on using the term probabilities, Cleveland Abbe was given the name "Old Probs."  In a future blog, I will describe how meteorologists come up with probabilities.  Be prepared: this is the meteorological version of sausage making.

"Old Probs" Cleveland Abbe:  The first official U.S. weather forecaster!

 

Saturday, February 16, 2013

The Most Boring Winter in Seattle History

Many of you have complained about this winter.  That the weather has been entirely boring:  no snowstorms, no real windstorms, no extreme temperatures, no pineapple expresses,  no nothing.

To try to quantify our plight, I and other local weather scientists have developed a new and, I hope, authoritative measure of interesting weather here in Seattle:
The Seattle Winter Excitement Index (SWEI).  (correct pronunciation is "swee")

SWEI is calculated over the core of Seattle's winter (Nov. 15-Feb. 15) and is the sum of several components:

(1) The number of days the temperature exceeds 60F or drops below 25F.
(2) The number of days with two inches or more of precipitation.
(3) The number of days with sustained winds of 30 kt or more.
(4)  The number of months with more than 1 inch of snowfall.

All inputs are from Seattle Tacoma Airport.  I should note there are rigorous reasons for each of the above criteria.  For example, many official groups (like this National Weather Service site) consider that hard freezes occur below 25F.   Plants die.  Local meteorologists note that wind damage often begins when sustained winds hit 30 kt or more.  And local mayors confirm that even 1 inch of snow in a month brings tension, excitement, and danger to local roadways.  Folks, this is rigorous science.
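For the programmers among you, here is a rough sketch of the SWEI calculation in Python (the column names and data layout are my assumptions for illustration, not an official format):

```python
import pandas as pd

def swei(daily: pd.DataFrame) -> int:
    """Seattle Winter Excitement Index for one Nov 15 - Feb 15 winter.

    `daily` is assumed to be indexed by date, one row per day, with
    hypothetical columns (adapt to your data source):
      tmax_f, tmin_f : daily max/min temperature (F)
      precip_in      : daily precipitation (inches)
      wind_kt        : max sustained wind (knots)
      snow_in        : daily snowfall (inches)
    """
    extreme_temp_days = int(((daily["tmax_f"] > 60) | (daily["tmin_f"] < 25)).sum())
    wet_days = int((daily["precip_in"] >= 2.0).sum())
    windy_days = int((daily["wind_kt"] >= 30).sum())
    monthly_snow = daily["snow_in"].groupby(daily.index.to_period("M")).sum()
    snowy_months = int((monthly_snow > 1.0).sum())
    return extreme_temp_days + wet_days + windy_days + snowy_months

# A perfectly dull winter scores zero:
idx = pd.date_range("2012-11-15", "2013-02-15", freq="D")
dull = pd.DataFrame({"tmax_f": 48.0, "tmin_f": 38.0, "precip_in": 0.1,
                     "wind_kt": 12.0, "snow_in": 0.0}, index=idx)
print("SWEI:", swei(dull))
```

Each extra day of extreme temperature, heavy rain, or strong wind, and each snowy month, adds one point of excitement.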


I asked my department's star data analyst, Neal Johnson, to run the numbers for the entire record (1948- today) at Seattle Tacoma Airport, the main climatological site for western Washington.

The results are in.  Be prepared for sobering news.

This winter is the most boring and uneventful based on the SWEI index described above.  Actually, we are tied for most boring with 1963-1964.  Specifically, 2012-13 and 1963-1964 had the lowest values of the SWEI index.

Only old-timers who can remember back nearly FIFTY YEARS can wax nostalgic about such a boring winter.  For the quantitative among you, here are the ranks of the most boring winters (note the many ties):

19631115-19640215   1
20121115-20130215   1
20001115-20010215   2
20011115-20020215   2
20021115-20030215   2
19601115-19610215   3
19751115-19760215   3
19821115-19830215   3
19911115-19920215   3
19971115-19980215   3
19991115-20000215   3

The signs of profoundly boring weather are everywhere.  For example, the latest snow pack map shows that, well, we are averaging near 100% (see below).  Big surprise.


Or plot the cumulative precipitation at Sea Tac and compare it to climatological values over the last three months (see graphic).  We come out very slightly below normal.  And no big one-day precipitation event.  Long periods of virtually nothing.  Yawn.

The best we could do in extreme weather this year was an extended period of low-clouds/temperature-inversion and one incident of high tides, the latter mainly caused by astronomical factors and a modest low.  Not good enough. No wonder the Weather Channel has ignored us.  Jim Cantore will travel elsewhere.

Extreme Weather in Seattle
The natural question many of you will ask is: why?  What convergence of unusual events, what causative factors, could produce such an anomaly?

 I do not have an answer.  

But I have heard that some folks at the web sites Skeptical Science and 350.org have suggested that such extremely boring weather is "consistent" with what one might expect from increasing greenhouse gases in the atmosphere.  Time will tell.

I realize that there is some danger in writing this particular blog.   Like putting a red flag in front of a meteorological bull.   But I am willing to take the risk and quite honestly we need some excitement around here.


Wednesday, February 13, 2013

Island Clouds

Can an island that reaches a few hundred feet in elevation create its own clouds?   The answer is yes, and some wonderful video of southern Whidbey Island yesterday afternoon shows the action.

Image from the video at 1:12 PM Feb. 12
 The video was provided by Greg Johnson, who maintains the excellent skunkbayweather web site.  He has two cams looking northeastward towards Whidbey Island from his home in Skunk Bay (Hansville) on the Kitsap Peninsula (see map).


The temperature and dew point at Skunk Bay (see graphic) and other nearby sites (e.g., Wahl Rd on Whidbey Is, also shown) were close but not at saturation (the dew point was a few degrees less than the temperature).
Whidbey Island is about 300 ft high for the peninsula in view, with a relatively abrupt cliff facing the southeast (see blow-up of the terrain below).  The winds were from the southeast over the area, so air was forced up quickly along that slope.  As the air was forced to rise, it cooled (air moving from higher pressure to lower pressure expands and cools).  The temperature fell to the dew point and the air became saturated, producing a cloud.  As shown in the picture above, and the video below, the cloud was not only maintained over the land, but streamed downwind of the island.

How much will air cool moving up a slope?   If we assume that the air rose 100 meters (328 ft) and apply the dry adiabatic lapse rate (9.8 C per km, the rate at which unsaturated air parcels cool when they are forced to rise), we get a decrease of 0.98C, or about 1.8F--enough to bring the temperature down to the dew point.
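The arithmetic can be checked in a few lines (the temperature and dew point values below are illustrative near-saturation numbers, not the observed Skunk Bay readings):

```python
# Dry-adiabatic cooling of air forced up Whidbey Island's bluff.
DRY_LAPSE_C_PER_M = 0.0098  # dry adiabatic lapse rate: 9.8 C per km

rise_m = 100.0              # assumed lift: roughly 300 ft of terrain
cooling_c = DRY_LAPSE_C_PER_M * rise_m
cooling_f = cooling_c * 9.0 / 5.0

print(f"Cooling over {rise_m:.0f} m: {cooling_c:.2f} C ({cooling_f:.1f} F)")

# A cloud forms when the cooling closes the temperature/dew-point gap.
temp_f, dewpoint_f = 42.0, 40.5   # hypothetical near-saturation values
saturated = (temp_f - cooling_f) <= dewpoint_f
print("Cloud forms:", saturated)
```

With the air only a degree or two from saturation, even this modest bluff supplies enough lift to produce a cloud.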

Greg Johnson produced a wonderful slow-motion video of the event; click on the image to view it yourself.


If you want a real visual treat, check out his melding of TWO cams into one wide image--it doesn't get much better than that! (click on image below):

 

We may not have gotten megastorms this season like the NE U.S., but our weather has its own fascinations and subtleties.  Like the subtle flavors and accents of a fine wine compared to the overwhelming flavors of a highly sweetened soft drink.   Northwesterners have sophisticated meteorological palates.

Monday, February 11, 2013

The U.S. Weather Prediction Computer Gap

It happened again. 

A major storm hit the northeast U.S., and the U.S. global model lagged badly behind the predictions of the European Center for Medium-Range Weather Forecasts (ECMWF).  Just as with Sandy.  To illustrate, take a look at the 120-hr forecasts of sea level pressure from the ECMWF and U.S. GFS models valid at 4 PM PST Friday, Feb. 9.   First, the observed situation (colors: the winds at 850 hPa; solid lines: isobars, lines of constant pressure):

A deep low center right off the coast.  A major snow and wind threat.

And here is the 120 hr ECMWF forecast, clearly showing a major storm.


The U.S. GFS model for the same time?  It predicted only a minor trough with little weather (see graphic below).  Not good.  The U.S. model was just as bad at 108 hr out.  Disappointing.


The National Weather Service's own statistics show that the American model had a substantial drop in skill globally during the critical period in question, with inferior performance (black line) compared to the European Center, the UKMET Office, the Canadian Meteorological Center, and even the U.S. Navy (see figure; closer to one is better).


As I have described in my previous blogs (including here and here), much of the inferiority of U.S. global numerical weather prediction can be traced to the third-rate operational computer resources available to the National Weather Service (NWS)'s  Environmental Modeling Center (EMC), an inferiority that can only be characterized as a national embarrassment.   And as I shall document here, the NWS weather prediction computers are not only inferior to those of other national weather services, but also to NOAA's  computers for weather research and to U.S. climate prediction machines.  Be prepared to be shocked, angry, and disappointed.  And to take action to change this situation.

Let's begin by comparing the most powerful weather prediction computers used by various countries around the world (see graphic below).  Japan and ECMWF are the leaders with roughly 0.8-petaflop machines, followed by England (UKMET), S. Korea, and Canada.   The U.S. is at the bottom of the barrel, with about TEN PERCENT of the capacity of the leaders.
Yes, we are talking about the richest nation in the world, and one of the most vulnerable to severe weather.

What makes this even worse is that the U.S. has such a large area (including Alaska, Hawaii, the U.S. mainland, Puerto Rico and the Virgin Islands).

Got your juices going yet?  You haven't seen anything!

Let's compare the computer power availability for operational numerical weather prediction in the NWS to that available to its parent agency, NOAA, for weather research (see graphic).   The NWS operational machine is dwarfed by the NOAA computers that are available for research.  The new NOAA Fairmont machine is five times more powerful, and the NOAA Earth Systems Research Lab/Global Systems Division has TWO far bigger machines.   So operational prediction, which saves lives and promotes the economy of the nation, gets crippled by lack of computer power, while researchers get the big machines.  Folks, some administrators in NOAA are making very bad decisions.  And their bosses in the Department of Commerce are going along with it.
Not steamed yet?   Then take a look at a comparison between the U.S. operational weather prediction computer capacity and a few of the U.S. machines available for climate research (overwhelmingly used for long-term climate simulations).   I did not list every major computer available for climate.  Climate-dedicated computers absolutely dwarf operational weather prediction computer capacity, so much so that you can barely see the operational computer resource on this plot!  Just considering the machines shown in the figure, climate simulation has about FIVE HUNDRED TIMES the computer power available for operational weather prediction.
Folks, this is outrageous.   Weather prediction is critical for the U.S. economy and for public safety.  And even if you are worried about climate change, the number one thing one needs to encourage resilience in a changing climate is to have good weather prediction! What is the logic for giving climate research hundreds of times more computer power than weather prediction?  It makes no sense from a rational viewpoint.

A big part of the problem is that NOAA management has decided to put priority on the oceans and climate while they short-change weather prediction.  This has been a deliberate and long-term policy. Unfortunately, the U.S. Congress has not reined them in. The irony is that NOAA understands how important and popular the National Weather Service is and demands that NWS products have NOAA stamped all over them--while draining the critical resources the NWS requires to do a proper job.

The time to fix this self-inflicted problem is now.   There is a great deal of money in the Hurricane Sandy relief bill for improving hurricane and storm prediction and storm-related infrastructure (over 100 million dollars).  Some of this money should be used to fix the NWS computer deficiency (it would take about 50 million dollars).   There is nothing that would more effectively improve hurricane prediction than dealing with the computer gap. We are not talking about increasing NWS computer capacity by 30%: it needs to be increased by 10-100 times to properly serve the nation.  Consider that NOAA is planning to spend 44 million dollars to replace the wings on two hurricane hunter aircraft; the same money would revolutionize and greatly improve U.S. weather prediction if used on computers.  If the Sandy money is not available, other funds should be found.  Enhanced computing is absolutely core infrastructure for the U.S. and would pay for itself many, many times over.  Many other nations understand this and have committed the necessary financial resources to secure bigger computers for their weather services.

NOAA folks have proven themselves to be unable to deal with this issue, so I recommend you contact your Congressman or Senator to complain.  Only Congress has the clout to fix the situation.  And only by your complaining and making this a major issue, will Congress take it seriously.

Saturday, February 9, 2013

High Pressure Uncertainty

While the Northeast U.S. is staggering under the onslaught of heavy snow and strong winds, Northwest meteorologists are dealing with one of our most difficult winter forecasting problems:  with high pressure over us, will low clouds and fog develop and remain over most of the day?    As we will see, this is a really hard problem that plays to many of our weaknesses.


High pressure, or ridging as meteorologists often call it, should be associated with sunny, fair weather, right?   Not over the NW lowlands in winter, in many cases.

High pressure IS generally associated with a lack of heavy or moderate precipitation and the absence of storms, mainly because it is associated with sinking air.  Sinking air is poison to storms and is associated with warming aloft: sinking air is compressed as it descends (pressure increases towards the surface, of course).

High pressure in the winter often brings low level inversions, where temperature warms with height.  Why?   As shown by the diagram, the sinking associated with high pressure has to decrease near the surface for the simple reason that air can't move through the surface.  Sinking is stronger aloft and thus compressional heating is stronger aloft.   More heating aloft and less near the surface helps to build an inversion.


But there is another reason.   Sinking air aloft kills middle and upper clouds.  That allows the surface to radiate infrared heat to space, thus cooling it.   With a heater aloft and a cooling mechanism near the surface, you get an even stronger inversion.


What about heating of the surface by the sun during the day?   That would work against an inversion.  Unfortunately, our solar heating is very weak during midwinter and, of course, during our long mid-winter nights there is no solar radiation.   Inversions can thus easily form overnight.

And then we have a further detail.   Fog and low clouds can form near the surface during our nights as the air cools to the dew point.  Clouds are highly reflective of solar radiation, but emit readily in the infrared.  Thus, they are cooling machines (reflect solar energy, but emit infrared radiation) and help to protect the low-level cool layer and thus the inversion. 


Inversions are very stable zones, meaning they work against vertical mixing.  Think of a dense fluid beneath a lighter one; the dense fluid likes to stay on the bottom. Cold air is dense and warm air is lighter.

So high pressure helps produce fog and low clouds and inversions.  During the summer, nights are short enough and the sun's rays are strong enough that sufficient warming gets to the surface to heat the ground, evaporate the clouds, and destroy the inversion.  During the winter we can get stuck in inversion/low cloud conditions, sometimes for days.
Inversion over Seattle on Saturday.  You can see the cold air layer (about 300 m thick) capped by a strong inversion.
The depth of the cool/cloudy layer is often relatively shallow:  a few hundred meters is typical.    Numerical simulation of the existence and depth of such a layer is very difficult. Our models are too "mixy":  they tend to mix the layer out, and thus forecast less cloudy and warmer conditions than reality.   Such a mistake was made last Saturday.  Here are a high resolution satellite image and the UW WRF model forecast for 4 PM on Saturday.  Look at central and southern Puget Sound.  Or eastern Washington.  Oops.
WRF model cloud forecast for 4 PM
Visible Satellite Photo at 4 PM
It is difficult to get the interplay of radiation, cloud physics, and near-surface meteorology (known as boundary-layer physics) correct in the models, and it is hard to do well subjectively.  My field has a lot of work to do on this problem!