Forecasting models are based on physical laws governing atmospheric motion, chemical reactions and other relationships. They crunch millions of numbers that represent current weather and environmental conditions, such as temperature, pressure and wind, to predict the future state of the atmosphere. Imagine a grid that lies over the planet’s surface. Imagine another one a few hundred feet above that—and another and another, in layer after layer, all the way to the top of the stratosphere some 30 miles up. Millions of lines of code are needed to turn the data at billions of grid points into a forecast.
A typical forecast model today uses surface grids with cells about five to 30 miles on a side. The smaller the cells, the higher the model’s resolution and the better it will be at detecting small-scale atmospheric changes that could spawn storms. Processing more data points, however, requires faster supercomputers.
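The trade-off between resolution and data volume can be sketched with back-of-envelope arithmetic. The Python sketch below is illustrative only: the uniform grid, the 50 vertical layers and the 10 variables per point are assumptions for the sake of the example, not NOAA's actual model configuration.

```python
# Back-of-envelope: how grid resolution drives the size of a forecast model.
# Assumptions (illustrative, not any real model's configuration):
# a uniform grid over Earth's ~197 million square miles of surface,
# 50 vertical layers, and 10 variables per point (temperature,
# pressure, wind components and so on).

EARTH_SURFACE_SQ_MILES = 197e6
LAYERS = 50
VARIABLES = 10

def model_size(cell_miles):
    """Total number of data values the model must track."""
    cells_per_layer = EARTH_SURFACE_SQ_MILES / cell_miles**2
    return cells_per_layer * LAYERS * VARIABLES

for cell in (30, 5, 1):
    print(f"{cell:>2}-mile cells: {model_size(cell):.2e} values")
```

Even with these toy numbers, shrinking the cells from 30 miles to five miles multiplies the point count by 36, and a one-mile grid pushes it into the tens of billions. The real cost grows faster still, because finer grids also force the model to take shorter time steps.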
Advances in modeling also require talented people who can integrate all these data and interpret them. Bill Lapenta, acting director of NOAA’s Environmental Modeling Center, heads that translation effort, which churns out numerical forecasts for 12, 24, 36, 48 and 72 hours ahead and beyond. Meteorologists compare NOAA’s models with others from international modeling centers to come up with the forecasts seen on the Web or the evening news.
NOAA supercomputers in Fairmont, W.Va., can process 73.1 trillion calculations a second. But Lapenta believes faster speeds are possible, which will allow the models to run at even smaller scales. For example, grids with cells just one mile on a side would enable models to simulate the small-scale conditions that catapult a routine thunderstorm or hurricane into a monster. NOAA plans to access some of the latest supercomputers at Oak Ridge National Laboratory to begin building such models. Lapenta hopes such high-resolution models might begin to appear by 2020.
Lapenta foresees a day in the next decade when the increasing capabilities of new radars and satellites will be coupled with an evolving generation of finely detailed weather-prediction models running in real time on computers at speeds exceeding a quintillion computations a second. To make them a reality, scientists such as Lapenta are working on the mathematical, physical and biogeochemical relations that need to be encoded in a way that enables those relations to work together seamlessly.
If major NOAA investments in this “brainware” pay off, forecasters will not have to wait for a radar image to detect an actual storm before issuing a warning with only 14 to 18 minutes of lead time. Instead they will be able to issue tornado, severe thunderstorm and flash-flood warnings based on highly accurate model forecasts produced well in advance, giving the public 30 to 60 minutes to take safety precautions.
Better Science, Better Decisions
With all these improvements, meteorologists such as Gary Conte in the New York City Weather Forecast Office will be able to predict more accurately, with longer lead times, weather hazards that can shut down the city, such as storms with snow and ice. Severe weather outlooks will extend beyond five days, hurricane forecasts beyond seven days, and the threat of spring floods will be known weeks in advance. This vision for a weather-ready nation is motivated by the desire to avoid the unmitigated disasters of 2011.
The goal is that by 2021 the rebuilt and thriving city of Joplin would receive a severe tornado warning more than an hour in advance. Families would have more time to gather and get to a safe room. Nursing homes and hospitals would be able to transfer residents and patients to shelter. Retailers would have time to get employees to safety and close up shop. Cell phones would thrum with multiple messages to seek shelter while local meteorologists broadcast similar warnings on television and radio. The clarion call of tornado sirens would reinforce the urgency of these warnings. As a result, even nature’s most powerful tornado would pass through town without any loss of life.