The secret to better weather forecasts may be a dash of AI

Climate and weather modeling has long been a staple of high-performance computing, but as meteorologists look to improve the speed and accuracy of forecasts, machine learning is increasingly finding its way into the mix.

In a paper published in Nature this week, a team from Google and the European Centre for Medium-Range Weather Forecasts (ECMWF) details a new approach that uses machine learning to overcome limitations in current climate models, with the aim of generating forecasts that are faster and more accurate than existing methods.

The model, called NeuralGCM, was developed using historical weather data collected by ECMWF, and uses neural networks to augment the traditional physics-based simulations normally run on high-performance computing (HPC) systems.

As Stephan Hoyer, a member of the NeuralGCM team, explained in a recent report, most climate models today rely on dividing the globe into cubes 50 to 100 kilometers on each side, and then simulating how air and moisture move within them based on known laws of physics.

NeuralGCM works in a similar way, but adds machine learning to track climate processes that are not necessarily well understood or that occur at a smaller scale.

“Many important climate processes, including clouds and precipitation, vary on scales much smaller (from millimeters to kilometers) than the cube dimensions used in current models, and therefore cannot be calculated directly from the physics,” Hoyer wrote.

These smaller-scale phenomena have traditionally been handled using a series of simpler submodels, called “parameterizations,” Hoyer explained, noting that “these simplified approximations inherently limit the accuracy of physics-based climate models.”

In other words, these parameterizations are not always reliable and can degrade the overall accuracy of the model.

NeuralGCM works by replacing these parameterizations with a neural network. Three models were trained on ECMWF weather data collected between 1979 and 2019, at resolutions of 0.7, 1.4, and 2.8 degrees.
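To make the hybrid idea concrete, here is a minimal, illustrative JAX sketch of a time step that combines a physics-based tendency with a learned correction. It is not NeuralGCM's actual code; the state layout, the placeholder physics_tendency function, and the tiny neural network are all assumptions made for demonstration.

```python
# Minimal sketch of a hybrid physics/ML time step (illustrative only; not
# NeuralGCM's actual code). The state layout, the physics_tendency function,
# and the tiny MLP below are hypothetical stand-ins.
import jax
import jax.numpy as jnp

def physics_tendency(state):
    """Placeholder for the resolved, physics-based dynamics on the coarse grid."""
    return -0.1 * state  # toy damping term standing in for the dynamical core

def correction_net(params, state):
    """Small neural network that emulates unresolved (sub-grid) processes."""
    hidden = jax.nn.relu(state @ params["w1"] + params["b1"])
    return hidden @ params["w2"] + params["b2"]

@jax.jit
def hybrid_step(params, state, dt=300.0):
    """One time step: physics tendency plus a learned correction."""
    total_tendency = physics_tendency(state) + correction_net(params, state)
    return state + dt * total_tendency

# Toy example: a "grid" of 64 columns with 8 variables per column.
key = jax.random.PRNGKey(0)
k1, k2, k3 = jax.random.split(key, 3)
params = {
    "w1": 0.01 * jax.random.normal(k1, (8, 32)),
    "b1": jnp.zeros(32),
    "w2": 0.01 * jax.random.normal(k2, (32, 8)),
    "b2": jnp.zeros(8),
}
state = jax.random.normal(k3, (64, 8))
next_state = hybrid_step(params, state)
```

In training, the correction network's weights would be fit so that rollouts of steps like this one track historical reanalysis data, which is the kind of setup the paper describes at a high level.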

The results are very promising, according to the study. Using Google’s WeatherBench2 framework, the team says NeuralGCM was able to achieve parity with current state-of-the-art forecast models for up to five days at 0.7 degrees, while forecasts for five to 15 days were most accurate at 1.4 degrees.

Meanwhile, the team found that at 2.8 degrees, the model was able to predict the average global temperature between 1980 and 2020 with an average error one-third that of current atmospheric models.
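For a sense of what an "average error" in global temperature means in practice, the snippet below sketches a latitude-weighted global-mean comparison between a simulated field and a reference one. It is not WeatherBench2's API; the grid shape, variable names, and stand-in values are assumptions.

```python
# Illustrative only: a latitude-weighted global-mean temperature comparison,
# roughly the kind of metric used to judge climate runs against a reference.
# Not WeatherBench2 code; shapes and values are assumed.
import jax.numpy as jnp

def global_mean(field, lats_deg):
    """Area-weighted global mean of a (lat, lon) field, using cos(latitude) weights."""
    weights = jnp.cos(jnp.deg2rad(lats_deg))[:, None]          # shape (lat, 1)
    return jnp.sum(field * weights) / (jnp.sum(weights) * field.shape[1])

def global_mean_bias(simulated, reference, lats_deg):
    """Difference in area-weighted global-mean temperature (kelvin)."""
    return global_mean(simulated, lats_deg) - global_mean(reference, lats_deg)

# Toy 2.8-degree grid: 64 latitudes x 128 longitudes.
lats = jnp.linspace(-88.6, 88.6, 64)
simulated = 288.0 + jnp.zeros((64, 128))   # stand-in model output, kelvin
reference = 287.7 + jnp.zeros((64, 128))   # stand-in reanalysis field
print(global_mean_bias(simulated, reference, lats))  # ~0.3 K
```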

NeuralGCM has also proven highly competitive against more targeted models like X-SHiELD, which Hoyer explains provides much higher accuracy at the cost of being more computationally intensive.

Pitted against X-SHiELD, the 1.4-degree NeuralGCM model was able to predict 2020's humidity and temperatures with 15 to 20 percent less error. In the same test, it predicted tropical cyclone patterns that matched the number and intensity of cyclones actually observed that year.

Accelerating predictions

Not only did the team replace these parameterizations with neural networks, they also wrote NeuralGCM entirely in Google's JAX, a machine learning framework for transforming numerical functions in Python.

According to Hoyer, the move to JAX had a number of benefits, including greater numerical stability during training and the ability to run the model on TPUs or GPUs. Weather models, by contrast, have traditionally run on CPUs, though GPUs are increasingly being used (more on that later).
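As a rough illustration of what "transforming numerical functions" buys you, the example below is generic JAX, not NeuralGCM code: it compiles a plain numerical function with jax.jit, differentiates it with jax.grad, and places data on whichever device happens to be available, so the same script runs on a laptop CPU or an accelerator.

```python
# Generic illustration of the JAX features mentioned above (not NeuralGCM code):
# compile a plain numerical Python function, differentiate it, and run it on
# whatever backend is present.
import jax
import jax.numpy as jnp

def energy(x):
    """A plain numerical function: mean squared amplitude of a field."""
    return jnp.mean(x ** 2)

fast_energy = jax.jit(energy)             # compiled via XLA for CPU/GPU/TPU
energy_grad = jax.jit(jax.grad(energy))   # automatic differentiation, also compiled

# JAX picks up whatever device is available; no code changes needed.
device = jax.devices()[0]
x = jax.device_put(jnp.ones((256, 512)), device)

print(device)                 # e.g. a CPU device on a laptop, a GPU/TPU elsewhere
print(fast_energy(x))         # 1.0
print(energy_grad(x).shape)   # (256, 512)
```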

Since NeuralGCM runs natively on accelerators, Google claims its system is much faster and cheaper to run.

“Our 1.4-degree model is more than 3,500 times faster than X-SHiELD, meaning that if researchers simulated the atmosphere for a year using X-SHiELD, it would take them 20 days, compared to just eight minutes using NeuralGCM,” Hoyer wrote.
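That speedup is easy to sanity-check: 20 days versus eight minutes works out to a factor of 3,600, in line with the "more than 3,500 times" claim.

```python
# Back-of-the-envelope check of the quoted speedup: 20 days of X-SHiELD wall
# time versus 8 minutes of NeuralGCM wall time for the same simulated year.
x_shield_minutes = 20 * 24 * 60   # 20 days expressed in minutes
neuralgcm_minutes = 8
print(x_shield_minutes / neuralgcm_minutes)  # 3600.0 -> "more than 3,500x faster"
```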

Furthermore, Hoyer claims that the simulation can be run on a single TPU instead of the 13,000 CPUs needed to run X-SHiELD, and you can even run NeuralGCM on a laptop if you want.

While this model is promising, it is important to note that it is just a starting point, and Hoyer freely admits that it is not a complete climate model. However, that appears to be the long-term goal.

“We hope to eventually be able to include other aspects of the Earth’s climate system, such as the oceans and the carbon cycle, in the model. By doing so, we will allow NeuralGCM to make predictions over longer timescales, going beyond weather forecasting over days and weeks to making predictions over climate timescales,” Hoyer wrote.

To support these efforts, the model's source code and weights have been released to the public on GitHub under a non-commercial license, for weather enthusiasts to dig into.

Machine learning gains momentum in climate modeling

This isn’t the first time we’ve seen machine learning used in climate modeling. Nvidia’s Earth-2 climate model is another example of how AI and high-performance computing can be combined to not only improve forecast accuracy, but also speed it up.

Announced at GTC this spring, Earth-2 is essentially a massive digital twin designed to use a combination of HPC models and AI to generate high-resolution simulations down to a resolution of two kilometers.

This is possible in part thanks to a model called CorrDiff, a diffusion model that Nvidia says can generate images of weather patterns at 12.5 times the resolution of, and roughly 1,000 times faster than, other numerical models. The result is a model that’s fast and accurate enough to interest Taiwan, which is looking to the platform to improve its typhoon forecasts.

Meanwhile, more climate research centers are starting to adopt GPU-accelerated systems. Climate research is one of several areas of study targeted by the 200-petaflops (FP64) Isambard-AI system being deployed at the University of Bristol.

Earlier this year, the Euro-Mediterranean Centre for Climate Change in Lecce, Italy, tapped Lenovo to build its new Cassandra supercomputer, which will be powered by Intel Xeon Max CPUs and a small complement of Nvidia H100 GPUs, and which the lab aims to use to run a variety of AI-based climate simulations.
