
The increasing frequency and severity of extreme weather and climate events could claim a million lives and cost $1.7 trillion annually by 2050, according to reinsurer Munich Re.

This underscores the critical need for accurate weather forecasting, especially as severe weather events such as blizzards, hurricanes and heatwaves increase. Artificial intelligence and accelerated computing are ready to help.

More than 180 weather modeling centers rely on robust high-performance computing (HPC) infrastructure to run traditional numerical weather prediction (NWP) models. These include the European Centre for Medium-Range Weather Forecasts (ECMWF), which operates on 983,040 CPU cores, and the UK Met Office’s supercomputer, which uses more than 1.5 million CPU cores and consumes 2.7 megawatts of power.

Rethinking HPC design

The global drive towards energy efficiency requires a rethinking of HPC system design. Accelerated computing, harnessing the power of GPUs, offers a promising and energy-efficient alternative that speeds up computations.

The graph shows the benefits of accelerated computing for weather forecasting. Left: results based on 51-member ensembles of the ECMWF Integrated Forecasting System on Intel Broadwell CPUs versus 1,000-member FourCastNet ensembles on 4 NVIDIA A100 Tensor Core GPUs, assuming 10 modeling centers running the same expected workload. Right: results based on the measured performance of the ICON model. CPU: 2x AMD Milan. GPU: 4x NVIDIA H100 Tensor Core GPUs (PCIe).

NVIDIA GPUs have had a significant impact on globally adopted weather models, including those of ECMWF, the Max Planck Institute for Meteorology, the German Meteorological Service and the National Center for Atmospheric Research.

GPUs deliver up to 24x higher performance, improved energy efficiency, and lower cost and space requirements.

“To make reliable weather forecasts and climate projections a reality within energy budget constraints, we rely on algorithmic and hardware improvements in which NVIDIA GPUs are an alternative to CPUs,” said Oliver Fuhrer, head of numerical prediction at MeteoSwiss, the Swiss national office of meteorology and climatology.

The AI model increases speed and efficiency

NVIDIA’s AI-powered FourCastNet weather forecasting model delivers competitive accuracy with speed and energy efficiency orders of magnitude greater than traditional methods. FourCastNet rapidly produces week-long forecasts and enables the generation of large ensembles (groups of forecasts with slight variations in initial conditions) that make extreme-weather prediction highly reliable.
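As a rough illustration of that ensemble idea, the sketch below perturbs an initial state with small random noise and rolls each copy forward with a stand-in forecast step. The run_forecast function, noise scale and step count are hypothetical placeholders, not FourCastNet’s actual model or API.

```python
import numpy as np

def run_forecast(state: np.ndarray) -> np.ndarray:
    """Stand-in for one autoregressive step of a learned forecast model.

    A trained FourCastNet-style network would go here; this toy step
    just smooths the field so the sketch runs end to end.
    """
    return 0.5 * (state + np.roll(state, 1, axis=-1))

def ensemble_forecast(initial_state, n_members=1000, noise_std=1e-3, n_steps=28):
    """Build an ensemble by slightly varying the initial conditions."""
    rng = np.random.default_rng(0)
    members = []
    for _ in range(n_members):
        # Small perturbation of the baseline conditions.
        state = initial_state + rng.normal(0.0, noise_std, initial_state.shape)
        for _ in range(n_steps):  # e.g. 28 six-hour steps covers one week
            state = run_forecast(state)
        members.append(state)
    return np.stack(members)

# Toy 1D "temperature field" standing in for a global state.
forecasts = ensemble_forecast(np.linspace(280.0, 310.0, 64), n_members=100)
print(forecasts.shape)  # (100, 64)
```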

For example, starting from historical data, FourCastNet accurately predicted the temperatures of July 5, 2018, in Ouargla, Algeria, Africa’s hottest recorded day.

An example of the efficiency and accuracy of AI-based prediction: a visualization of observed conditions across Africa in July 2018 (center), surrounded by globes showing heat domes representing accurate forecasts produced by FourCastNet ensemble members.

Using NVIDIA GPUs, FourCastNet quickly and accurately generated 1,000 ensemble members, outperforming traditional models. A dozen of those members accurately predicted the Algerian heat wave based on data from three weeks before it occurred.

This marked the first time the FourCastNet team predicted a high-impact event weeks in advance, demonstrating the potential of artificial intelligence for reliable weather forecasting with lower power consumption than traditional weather models.
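In practice, the extreme-event signal is read off as the fraction of ensemble members exceeding a threshold. Here is a minimal sketch of that counting step, using synthetic peak temperatures in place of real FourCastNet output:

```python
import numpy as np

# Synthetic forecast peak temperatures (in degrees C) for one location;
# in practice these would come from the 1,000 FourCastNet ensemble members.
rng = np.random.default_rng(1)
peak_temps = rng.normal(44.0, 3.0, size=1000)

threshold = 51.0  # illustrative extreme-heat threshold
n_hits = int((peak_temps >= threshold).sum())
print(f"{n_hits} of {peak_temps.size} members exceed {threshold} C "
      f"(estimated probability {n_hits / peak_temps.size:.1%})")
```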

FourCastNet uses the latest advances in AI, such as transformer models, to bridge AI and physics for game-changing results. It runs about 45,000 times faster than traditional NWP models. And once trained, FourCastNet uses 12,000 times less energy to produce a forecast than ECMWF’s Integrated Forecasting System, a gold-standard NWP model.

“NVIDIA FourCastNet opens the door to the use of artificial intelligence for a wide variety of applications that will change the shape of the NWP enterprise,” said Bjorn Stevens, director of the Max Planck Institute for Meteorology.

Expanding what is possible

In an NVIDIA GTC session, Stevens described what is now possible with the ICON climate research tool. The Levante supercomputer, using 3,200 CPUs, can simulate 10 days of weather in 24 hours, Stevens said. In contrast, the JUWELS Booster supercomputer, using 1,200 NVIDIA A100 Tensor Core GPUs, can run 50 simulated days in the same amount of time.

Scientists are looking to study climate effects 300 years into the future, meaning systems must be 20 times faster, Stevens added. Embracing faster technology like NVIDIA H100 Tensor Core GPUs and simpler code could get us there, he said.
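Those figures are easy to sanity-check. Taking the quoted throughputs at face value, a quick back-of-the-envelope calculation shows the GPU run is 5x faster, and what a further 20x would mean for a 300-year study:

```python
# Back-of-the-envelope check of the throughput figures quoted above.
levante_rate = 10    # simulated days per 24 h of wall clock (3,200 CPUs)
juwels_rate = 50     # simulated days per 24 h of wall clock (1,200 A100 GPUs)
print(f"GPU speedup: {juwels_rate / levante_rate:.0f}x")

sim_days = 300 * 365.25               # a 300-year climate study
wall_years = sim_days / juwels_rate / 365.25
print(f"At {juwels_rate} days/day: {wall_years:.0f} years of wall clock")

faster = 20 * juwels_rate             # the ~20x speedup Stevens calls for
print(f"At {faster} days/day: {sim_days / faster:.0f} days of wall clock")
```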

Researchers are now faced with the challenge of finding the optimal balance between physical modeling and machine learning to produce faster and more accurate climate predictions. An ECMWF blog published last month describes this hybrid approach, which relies on machine learning for initial predictions and physical models for data generation, testing and system refinement.
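A minimal sketch of that hybrid loop, assuming a trusted physics step and a cheap learned surrogate; both functions here are stand-ins, and a linear least-squares fit takes the place of training a neural network:

```python
import numpy as np

def physics_step(state: np.ndarray) -> np.ndarray:
    """Stand-in for one step of an expensive physics-based NWP model."""
    return np.tanh(1.5 * state) + 0.05

rng = np.random.default_rng(0)

# 1. The physical model generates training data (slow but trusted).
inputs = rng.normal(size=(4096, 8))
targets = physics_step(inputs)

# 2. Fit a cheap surrogate to that data (a linear fit stands in
#    for training a neural network).
W, *_ = np.linalg.lstsq(inputs, targets, rcond=None)

# 3. The surrogate supplies the fast initial prediction...
x = rng.normal(size=(1, 8))
fast_prediction = x @ W

# 4. ...while the physics model stays in the loop for testing and refinement.
error = float(np.abs(fast_prediction - physics_step(x)).max())
print(f"max surrogate error on a test state: {error:.3f}")
```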

Such integration delivered with accelerated computing could lead to significant advances in weather forecasting and climate science, ushering in a new era of efficient, reliable and energy-conscious forecasting.

Learn more about how accelerated computing and artificial intelligence are advancing climate science.
