
Google’s new AI model means the outlook for weather forecasting is bright

A new AI-powered weather forecasting model can predict the weather with unprecedented accuracy, and do it significantly faster than current technology.

Built by Google DeepMind — the web giant’s AI-focused lab — GraphCast looks set to revolutionize the process of predicting weather.


GraphCast can forecast weather up to 10 days in advance “more accurately and much faster than the industry gold-standard weather simulation system — the High Resolution Forecast (HRES), produced by the European Centre for Medium-Range Weather Forecasts (ECMWF),” Google DeepMind said in a post on Tuesday.


Notably, the model can also offer earlier warnings of extreme weather events and predict the movement of cyclones more accurately, giving the authorities and residents more time to prepare for damaging storms, potentially saving lives in the process.

When Hurricane Lee struck eastern Canada in September, GraphCast accurately forecast that it would make landfall in Nova Scotia nine days before it did so, while traditional forecasts made the same prediction only about six days in advance.

GraphCast has been trained on four decades of weather data, enabling it to learn the cause-and-effect relationships behind Earth’s weather systems, the DeepMind team said.

Remarkably, GraphCast takes less than 60 seconds to create a 10-day forecast, making it way faster than the conventional approach used by HRES, which, according to the team, “can take hours of computation in a supercomputer with hundreds of machines.”
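To make that speed difference concrete: GraphCast works autoregressively, predicting the state of the atmosphere six hours ahead and then feeding that prediction back in as its next input, so a 10-day forecast amounts to 40 quick model steps rather than hours of numerical simulation. The snippet below is a purely illustrative sketch of that loop in Python; predict_next_state is a hypothetical stand-in for the trained network, not DeepMind's actual API.

```python
import numpy as np

# Illustrative stand-in for the trained GraphCast network. The real model maps
# the current global atmospheric state (on a 0.25-degree grid) to the state
# six hours later; here we just use persistence ("six hours from now looks
# like now") so the loop runs end to end.
def predict_next_state(state: np.ndarray) -> np.ndarray:
    return state.copy()

def rollout_forecast(initial_state: np.ndarray, days: int = 10) -> list[np.ndarray]:
    """Autoregressive rollout: each 6-hour prediction becomes the next input."""
    steps = days * 4                      # four 6-hour steps per day -> 40 steps
    states = [initial_state]
    for _ in range(steps):
        states.append(predict_next_state(states[-1]))
    return states[1:]                     # the 40 forecast states

# Toy "global state": a few variables on a coarse latitude/longitude grid.
initial = np.random.rand(6, 181, 360).astype(np.float32)
forecast = rollout_forecast(initial)
print(len(forecast), "six-hour steps =", len(forecast) / 4, "days")
```

Because each step is a single pass through a neural network rather than a physics simulation, the full rollout finishes in seconds on a single machine.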

In a head-to-head evaluation of the two systems, GraphCast produced more accurate forecasts than HRES on more than 90% of 1,380 combinations of test variables and forecast lead times.

“When we limited the evaluation to the troposphere, the 6- to 20-kilometer high region of the atmosphere nearest to Earth’s surface where accurate forecasting is most important, our model outperformed HRES on 99.7% of the test variables for future weather,” the team said.
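For context, the "90% of 1,380 targets" figure is essentially a scorecard: for each combination of weather variable, altitude level, and forecast lead time, the two systems' errors are compared and the wins are counted. Here is a minimal, hedged sketch of that bookkeeping in Python, with made-up error numbers and assumed array shapes standing in for the real verification data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical verification errors (e.g. RMSE), indexed by forecast target
# (a weather variable at a given altitude level) and lead time. The shapes
# below are assumptions chosen only so the example runs; the real evaluation
# covered 1,380 target/lead-time combinations.
n_targets, n_lead_times = 69, 20
rmse_graphcast = rng.random((n_targets, n_lead_times))
rmse_hres = rmse_graphcast + rng.normal(0.05, 0.1, size=(n_targets, n_lead_times))

# Count the cells where GraphCast's error is lower than HRES's.
wins = rmse_graphcast < rmse_hres
print(f"GraphCast more accurate on {wins.mean():.1%} of "
      f"{wins.size} target/lead-time combinations")
```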

As weather patterns evolve in Earth's ever-changing climate, GraphCast will only improve as it's fed higher-quality data.

The team is open-sourcing GraphCast’s model code to give scientists and forecasters access to the technology. This will allow them to tailor it for specific weather phenomena and optimize it for different parts of the world. The ECMWF is already trying out the model.

A study published in Science on Tuesday offers a more detailed look at GraphCast. In it, the team says the model “should not be regarded as a replacement for traditional weather forecasting methods, which have been developed for decades, rigorously tested in many real-world contexts, and offer many features we have not yet explored,” adding that GraphCast “has potential to complement and improve the current best methods.”

Still, the team believes that deploying AI for weather forecasting “will benefit billions of people in their everyday lives,” explaining that alongside weather prediction, it also wants to use the technology to gain a greater understanding of the broader patterns of our climate. “By developing new tools and accelerating research, we hope AI can empower the global community to tackle our greatest environmental challenges,” it said.

Meta and Google made AI news this week. Here were the biggest announcements

From Meta's AI-empowered AR glasses to its new Natural Voice Interactions feature to Google's AlphaChip breakthrough and ChromaLock's chatbot-on-a-graphing-calculator mod, this week has been packed with jaw-dropping developments in the AI space. Here are a few of the biggest headlines.

Google taught an AI to design computer chips
Deciding how and where all the bits and bobs go into today's leading-edge computer chips is a massive undertaking, often requiring agonizingly precise work before fabrication can even begin. Or it was, at least, before Google released its AlphaChip AI this week. Similar to AlphaFold, which generates potential protein structures for drug discovery, AlphaChip uses reinforcement learning to generate new chip designs in a matter of hours, rather than months. The company has reportedly been using the AI to design layouts for the past three generations of Google’s Tensor Processing Units (TPUs), and is now sharing the technology with companies like MediaTek, which builds chipsets for mobile phones and other handheld devices.
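As a rough flavor of the underlying idea (and emphatically not AlphaChip's actual method), reinforcement-learning placement treats layout as a sequence of decisions: put one block down at a time, score the finished layout with a proxy such as total wire length, and nudge the policy toward higher-scoring layouts. The toy sketch below does this with a simple REINFORCE loop on a 4x4 grid; every name and number in it is invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model of RL-based block placement (invented for illustration): place
# N_BLOCKS blocks on a GRID x GRID canvas one at a time, and reward layouts
# that keep connected blocks close together (a crude stand-in for wirelength).
GRID, N_BLOCKS = 4, 5
NETS = [(0, 1), (1, 2), (2, 3), (3, 4), (0, 4)]   # hypothetical connectivity
logits = np.zeros((N_BLOCKS, GRID * GRID))        # one softmax policy per step

def rollout(greedy=False):
    """Place each block in turn; return chosen cells and the probabilities used."""
    cells, probs, taken = [], [], set()
    for b in range(N_BLOCKS):
        z = logits[b].copy()
        z[list(taken)] = -1e9                     # a cell can hold only one block
        p = np.exp(z - z.max()); p /= p.sum()
        c = int(p.argmax()) if greedy else int(rng.choice(GRID * GRID, p=p))
        cells.append(c); probs.append(p); taken.add(c)
    return cells, probs

def reward(cells):
    """Negative total Manhattan distance between connected blocks."""
    xy = [divmod(c, GRID) for c in cells]
    return -sum(abs(xy[a][0] - xy[b][0]) + abs(xy[a][1] - xy[b][1]) for a, b in NETS)

# REINFORCE: nudge each step's policy toward layouts that beat a running baseline.
baseline, lr = 0.0, 0.5
for _ in range(2000):
    cells, probs = rollout()
    r = reward(cells)
    baseline += 0.05 * (r - baseline)
    for b, (c, p) in enumerate(zip(cells, probs)):
        onehot = np.zeros(GRID * GRID); onehot[c] = 1.0
        logits[b] += lr * (r - baseline) * (onehot - p)   # gradient of log p[c]

print("greedy layout reward after training:", reward(rollout(greedy=True)[0]))
```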

Google will begin labeling AI-generated images in Search

AI-generated images have become increasingly prevalent in Google search results in recent months, crowding out legitimate results and making it harder for users to find what they're actually looking for. In response, Google announced on Tuesday that it will begin labeling AI-generated and AI-edited images in search results in the coming months.

The company will flag such content through the “About this image” window, and the labeling will be applied to Search, Google Lens, and Android's Circle to Search feature. Google is also applying the technology to its ad services and is considering adding a similar flag to YouTube videos, but will “have more updates on that later in the year,” per the announcement post.

How you can try OpenAI’s new o1-preview model for yourself

Despite months of rumored development, OpenAI's release of its Project Strawberry model last week came as something of a surprise, with many analysts believing it wouldn't be ready for at least several more weeks, if not until later in the fall.

The new o1-preview model and its o1-mini counterpart are already available for use and evaluation. Here's how to get access for yourself.
