HPC, weather prediction, and how you know it's going to rain

Weather prediction has gotten much better due to satellite imaging, global communications networks, and high-performance computing. Within our lifetimes, we can expect to predict hyper-local weather.

Veteran weather forecaster Jim Witt remembers the Great Blizzard of 1947, which arrived without warning and paralyzed the northeastern United States. “They had no idea anything was going to happen,” Witt recalls.

Even during the storm, which eventually dropped 26.4 inches of snow on New York City, weather bureaus were predicting a light snowfall. After missing their forecasts all day, the bureaus predicted a second storm—even as the skies were clearing up, says Witt, whose protégés became senior weather scientists at the National Center for Computational Sciences, the National Hurricane Center, NASA, and AccuWeather.

We’ve come a long way since then. In the past 30 years, weather forecasting has evolved from a loose set of largely subjective practices into a genuine science. Much of the progress can be credited to a fortuitous convergence of multiple technologies, including satellite imaging, global communications networks, and high-performance computing (HPC).

From wet fingers to high-performance computers

Until recently, weather forecasting was a manual process involving relatively crude instruments, hand-drawn maps, and sheets of numerical tables that threatened eyestrain. At best, forecasts were good for 12 hours. Today, we’ve become blasé about seven-day forecasts with 95 percent accuracy.

Although modern weather forecasting seems magical, it sits on a foundation of enormous computational power.

“Weather forecasting and high-performance computing go hand in hand,” says Peter Neilley, senior vice president, Global Forecasting Sciences, at The Weather Company. The partnership between weather forecasters and computers goes back to the days of ENIAC, one of the world’s earliest general-purpose electronic computers.

Because the physics of weather dynamics are expressed in complex partial differential equations, weather forecasting has become inseparable from HPC.
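
To see what that means in practice, here is a minimal sketch (ours, not any operational model's code) of the kind of kernel a weather model runs millions of times per forecast: the one-dimensional advection equation, du/dt + c du/dx = 0, stepped forward with a simple upwind finite-difference scheme. Operational models solve far richer coupled PDEs in three dimensions, but the computational pattern, a stencil update repeated over enormous grids and time steps, is the same.

```python
# A minimal sketch (illustrative only, not any operational model):
# the 1D linear advection equation du/dt + c * du/dx = 0, solved with
# a first-order upwind finite-difference scheme on a periodic domain.
import numpy as np

nx = 200                    # number of grid cells
c = 1.0                     # advection speed, assumed constant
dx = 1.0 / nx               # grid spacing
dt = 0.5 * dx / c           # time step chosen to satisfy the CFL stability condition

x = np.linspace(0.0, 1.0, nx, endpoint=False)
u = np.exp(-200.0 * (x - 0.3) ** 2)   # initial "weather feature": a Gaussian bump

for _ in range(200):
    # Upwind stencil: each cell is updated from its upwind neighbor.
    u = u - c * dt / dx * (u - np.roll(u, 1))

print(f"feature now peaks near x = {x[np.argmax(u)]:.2f}")  # ~0.8: it moved downstream
```

The upwind scheme is the simplest stable choice here; real models use higher-order schemes, but all face the same constraint: finer grids force smaller time steps, multiplying the work.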

“Weather problems are always thirsty for more computing resources,” says Neilley. “Many of the world’s largest supercomputers are dedicated to weather forecasting problems.”

Achieving greater forecasting accuracy requires ever larger amounts of brute-force computing.

“We rely on massively parallel high-performance supercomputers with tens of thousands of CPUs, large memory, and very high bandwidth for data transfers for running our operational models,” says Vijay Tallapragada of the National Oceanic and Atmospheric Administration (NOAA), which provides weather forecasts, storm warnings, navigational information, and scientific data to public, private, and academic organizations.

The computer system’s official name is the Weather and Climate Operational Supercomputing System (WCOSS), and it’s maintained by NOAA’s National Centers for Environmental Prediction. Tallapragada describes the supercomputer as “a combination of Linux clusters built by IBM and Cray with two identical systems: one for primary production and another for backup and development.”

Tallapragada heads the Modeling and Data Assimilation branch of NOAA's Environmental Modeling Center in College Park, Maryland. He’s also acting chief of NOAA’s Model Physics Group. From his perspective, what’s most impressive about the supercomputer system is its ability to deliver narrowly focused forecasts with remarkably high degrees of resolution and granularity.

“Our global model resolution has increased from 375 square kilometers (roughly 145 square miles) in 1980 to 13 square kilometers (roughly five square miles) in 2015, which is proportional to the increase in HPC resources from 0.5 teraFLOPS to 2 petaFLOPS,” says Tallapragada.
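
A back-of-the-envelope calculation (ours, not NOAA's, and reading the quoted figures as horizontal grid spacings) shows why resolution is so expensive. Refining a global grid by a factor r multiplies the number of grid columns by r², and the CFL stability condition shrinks the allowable time step by the same factor r, so:

$$\text{cost} \propto r^{2} \times r = r^{3}, \qquad r = \frac{375\ \text{km}}{13\ \text{km}} \approx 29, \qquad r^{3} \approx 2.4 \times 10^{4}$$

The roughly 4,000-fold jump from 0.5 teraFLOPS to 2 petaFLOPS covers much of that bill; improved numerical methods and parallel scaling plausibly make up the rest.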

Tallapragada and other weather scientists are looking forward to achieving even higher levels of resolution with exascale supercomputers performing quintillions of calculations every second. In case you’re counting, one exaFLOPS is the equivalent of 1,000 petaFLOPS. Exascale computers would give scientists the ability to predict weather events in neighborhoods or small sections of communities.

“The Next Generation Global Prediction System is expected to provide high resolution [about 10 square kilometers, or 4 square miles] for global forecasts with much improved forecast accuracy by 2019,” says Tallapragada.

With improved resolution, the ability of forecasters to peer into the future also improves. For example, the 1993 Storm of the Century was predicted five days before it struck the eastern United States. Superstorm Sandy, which also hit the eastern U.S., was forecast nine days before it occurred in 2012.

“If you look back over the past five or six decades, the forecasts are getting better at roughly a day per decade,” says Neilley. In other words, today’s five-day forecasts are as accurate as the four-day forecasts made in 2007 and the three-day forecasts made in 1997. “There’s been a fairly steady improvement over the recent decades, and it’s heavily attributable to the improvement in supercomputing capabilities.”

When quantum computing becomes practical, the resulting leap in supercomputer performance could make it possible to predict micro-meteorological events, such as the formation of individual clouds or wind eddies. Within our lifetimes, we’ll have the ability to predict the weather in our backyards and on our rooftops.

Until then, we’ll experience steady improvement in weather forecasting, thanks in part to new satellites such as the Geostationary Operational Environmental Satellite (GOES) 16 and the Joint Polar Satellite System (JPSS). “Those newer satellites will greatly enhance our observations and improve our forecasts far beyond what we have today,” says Tallapragada.

More observations are important because they feed more data into the sophisticated models used by organizations like NOAA to predict the weather. The more data you have, the more accurate the model.
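
Folding those observations into a running model is called data assimilation, the discipline for which Tallapragada's branch is named. The sketch below (ours, with invented numbers) shows the core idea via a scalar Kalman-style update: blend the model's forecast with an observation, weighting each by its assumed error. Operational centers use far more elaborate variational and ensemble methods built on the same principle.

```python
# A minimal, illustrative sketch of data assimilation: blend a model
# forecast with an observation, weighting each by its error variance.
# All numbers are made up; operational systems (3D-Var, 4D-Var, EnKF)
# do this for billions of values at once.

forecast = 21.0        # model's forecast temperature (deg C)
forecast_var = 4.0     # assumed error variance of the forecast
obs = 19.0             # satellite or station observation (deg C)
obs_var = 1.0          # assumed error variance of the observation

# Gain: how much to trust the observation relative to the forecast.
gain = forecast_var / (forecast_var + obs_var)

analysis = forecast + gain * (obs - forecast)          # 19.4 deg C
analysis_var = (1.0 - gain) * forecast_var             # 0.8, down from 4.0

print(f"analysis = {analysis:.1f} deg C (variance {analysis_var:.1f})")
```

More and better observations shrink that residual uncertainty, which is exactly why new satellites translate into better forecasts.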

“Nearly everything NOAA does relies on environmental modeling, including the forecasts and warnings on which the American public depends,” he says. “NOAA’s high-performance computing assets are critical to the success of our environmental modeling enterprise and to ongoing improvements in our operational forecasts and warnings.”

Bright forecasts for weather prediction

Unquestionably, weather forecasting has been propelled forward by trends such as big data and massively parallel computing. So it seems fair to ask why weather scientists aren’t warming up to artificial intelligence and machine learning. Part of the problem is data—there simply isn’t enough data on past weather to leverage AI’s predictive potential.

“We don’t have reliable records going back thousands of years. If you want to predict a 1,000-year flood, you need 1,000 years of accurate data for the AI to work through. We don’t have that kind of data,” he says.

Satellites are the best source of weather data, but the first weather satellites were launched only a few decades ago. Not enough time has elapsed for the satellites to gather the quantities of information required for a serious AI effort.

“No matter how much training you provide, the AI and machine learning techniques are falling short,” says Tallapragada. “Meteorological science as a whole is mostly about nonlinear systems interacting with each other. What happened yesterday is not necessarily an indication of what’s going to happen today or tomorrow. Data from the past is not truly reflective of our ever-changing earth atmosphere.”

Despite the hurdles, weather scientists still have high hopes for cognitive learning techniques. “There is a huge potential for the application of AI and machine learning techniques to post-processing and calibration [bias correction] of model output for better forecast guidance,” says Tallapragada. “We plan to explore the use of AI and ML for quality control of observational data, re-analysis, and re-forecasts, and for creating forecast products based on blends of various operational models.”
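
As a hint of what that post-processing looks like, here is a minimal sketch (ours, with made-up numbers) in the spirit of Model Output Statistics: fit a simple linear correction from past model forecasts to the verifying observations, then apply it to a new forecast to remove systematic bias.

```python
# A minimal sketch of statistical post-processing ("bias correction")
# of model output. We fit a linear map from past raw forecasts to what
# was actually observed, then apply it to a new forecast. The data are
# invented; operational calibration uses many predictors and richer models.
import numpy as np

# Historical pairs: raw model forecast vs. verifying observation (deg C).
model_fcst = np.array([20.0, 25.0, 15.0, 30.0, 10.0, 22.0])
observed   = np.array([18.5, 23.0, 14.0, 27.5,  9.5, 20.5])

# Least-squares fit of observed ~= a * model_fcst + b.
a, b = np.polyfit(model_fcst, observed, deg=1)

new_raw = 26.0
calibrated = a * new_raw + b
print(f"raw forecast: {new_raw:.1f} deg C -> calibrated: {calibrated:.1f} deg C")
```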

From Witt’s perspective, there’s still a need for basic human intuition in the process of weather forecasting, and he doesn’t mind second-guessing supercomputers when he thinks a prediction might be off base. But as the computer models become increasingly accurate, it really doesn’t pay to bet against them.

“People used to say the weatherman was always wrong,” he says. “The weatherman isn’t wrong that much anymore. Computers have made a huge difference. It’s really like night and day.”

HPC and the weather: Lessons for leaders

  • Big data is less of a factor in weather prediction than you may imagine, because reliable records go back only a few centuries. That's a lot, but not immense.
  • Weather prediction stresses computational resources. Today, weather scientists depend on massively parallel high-performance supercomputers using tens of thousands of CPUs, lots of memory, and high bandwidth for data transfers.
  • New satellites, AI, and machine learning may make things even better—though not in the immediate forecast.

This article/content was written by the individual writer identified and does not necessarily reflect the view of Hewlett Packard Enterprise Company.