Thursday, May 16, 2013

Progress in Numerical Weather Prediction

8:38 P.M.

Cliff Mass - Retrieved from UW Atmospheric Sciences Website
Many of you are familiar with Cliff Mass, an atmospheric sciences professor at the UW who has a weather blog (cliffmass.blogspot.com) and a weekly weather discussion slot on KPLU. He is very active in the public sphere and gives many talks and lectures around the area, particularly ones that relate to math education, the need for a coastal radar (although not so much anymore since we now have one), severe weather of the Pacific Northwest (especially the Columbus Day Storm), and different types of modeling. He puts a particular emphasis on probabilistic ensemble forecasting, in which many models with slightly different initial conditions are run and the output from each ensemble "member" is analyzed. This probabilistic approach is, in my opinion, the future of forecasting: it gives forecasters an idea of the uncertainty in the forecast and the likelihood of specific outcomes, and the ensemble "mean" is usually more accurate than a single operational model run initialized at 00z or 12z (Zulu time, i.e. Greenwich Mean Time).
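To make the ensemble idea a little more concrete, here is a toy sketch in Python (my own illustration, not any real forecast model and not Cliff's method): run the same simple, made-up "model" many times from slightly perturbed starting temperatures, then summarize the members.

    # Toy ensemble forecast: one deterministic "model", many perturbed starts.
    import numpy as np

    rng = np.random.default_rng(42)

    def toy_model(temp0, days=5):
        """A made-up deterministic 'forecast rule', stepped once per day.
        It has nothing to do with real NWP; it just evolves a temperature."""
        temp = temp0
        for _ in range(days):
            temp = temp + 0.5 * np.sin(temp) - 0.1 * (temp - 15.0)
        return temp

    n_members = 50
    analysis_temp = 15.0  # best estimate of today's temperature (deg C), assumed
    perturbations = rng.normal(0.0, 0.5, n_members)  # small initial-condition uncertainty

    forecasts = np.array([toy_model(analysis_temp + dp) for dp in perturbations])

    print("ensemble mean: %.2f C" % forecasts.mean())
    print("ensemble spread (std): %.2f C" % forecasts.std())
    print("fraction of members above 16 C: %.0f%%" % (100 * (forecasts > 16.0).mean()))

The point is that the spread of the members tells you how much to trust the forecast, and the fraction of members crossing some threshold gives you a probability instead of a single yes/no answer.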

This past year, he talked a lot about the superiority of European forecasting models over U.S. ones. This was especially apparent with Hurricane Sandy, where the European models predicted a catastrophic storm well before the American ones, which initially took the storm harmlessly eastward out into the Atlantic.

I read his blog often (and you should too), and his most recent post is particularly noteworthy. I'm not going to give you all the details, as I don't want to take any credit for something he put the time into writing. Rather, I'll give you a brief summary and a link to the post.

After Hurricane Sandy, there was a lot of news in the mainstream media about how much better the European models had handled the storm than the American ones. As a result, the National Weather Service put a priority on increasing the accuracy of its medium-range weather models by obtaining more computational power. The National Weather Service expects to expand its computing power roughly 37-fold, from about 70 teraflops now to around 2,600 teraflops by 2015. A teraflop is a trillion floating-point operations per second... it sounds like it should be a trillion failures per second though. With this many teraflops, we'll be ahead of the European Center in terms of computational power, and hopefully behind them in number of forecast failures.
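If you want to check the arithmetic on those rough figures yourself, the upgrade factor and the daily throughput work out like this (a quick back-of-the-envelope Python snippet, using the numbers quoted above):

    current = 70e12      # roughly 70 teraflops today (floating-point operations per second)
    planned = 2600e12    # roughly 2,600 teraflops planned for 2015

    print("increase factor: %.0fx" % (planned / current))            # about 37x
    print("planned operations per day: %.1e" % (planned * 86400))    # about 2.2e20 per day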

In other words, we'll have less of this...

IBM PC 5150 with keyboard and green monochrome monitor (5151), running MS-DOS 5.0 - Retrieved from Wikipedia - Photo Credit: Boffy b
... and more of this...

The IBM Blue Gene/P supercomputer installation at the Argonne Leadership Computing Facility located in the Argonne National Laboratory, in Lemont, Illinois, USA - Retrieved from Wikipedia - Photo Credit: Argonne National Laboratory
Here is the link to his blog: http://cliffmass.blogspot.com/2013/05/a-new-chapter-for-us-numerical-weather.html. Please read it. Thanks.

Towelie is still stuck in the tree.
Charlie
