Climate researchers look into the future to project how much the planet will warm in coming decades — but they often rely on decades-old software to conduct their analyses.
This legacy software architecture is difficult to update with new methodologies that have emerged in recent years. So a consortium of researchers is starting from scratch, writing a new climate model that leverages AI, new software tools and NVIDIA GPUs.
Scientists from Caltech, MIT, the Naval Postgraduate School and NASA’s Jet Propulsion Laboratory are part of the initiative, named the Climate Modeling Alliance — or CliMA.
“Computing has advanced quite a bit since the ‘60s,” said Raffaele Ferrari, oceanography professor at MIT and principal investigator on the project. “We know much more than we did at that time, but a lot was hard-coded into climate models when they were first developed.”
Building a new climate model from the ground up allows climate researchers to better account for small-scale environmental features, including cloud cover, rainfall, sea ice and ocean turbulence.
These features are too small to be captured directly on the coarse grids of climate models, but they can be better approximated using AI. Incorporating the AI's estimates into the new climate model could reduce uncertainties by half compared with existing models.
The team is developing the new model using Julia, an MIT-developed programming language that was designed for parallelism and distributed computation, allowing the scientists to accelerate their climate model calculations using NVIDIA V100 Tensor Core GPUs onsite and on Google Cloud.
As the project progresses, the researchers plan to use supercomputers like the GPU-powered Summit system at Oak Ridge National Laboratory, as well as commercial cloud resources, to run the new climate model — which they hope to have running within the next five years.
AI Turns the Tide
Climate scientists use physics and thermodynamics equations to calculate the evolution of environmental variables like air temperature, sea level and rainfall. But it's incredibly computationally intensive to run these calculations for the entire planet. So in existing models, researchers divide the globe into a grid of sections roughly 100 kilometers on a side.
They compute each 100 km block independently, using mathematical approximations for smaller features, like turbulent eddies in the ocean and low-lying clouds in the sky, which can measure less than a kilometer across. As a result, when the grid is stitched back together into a global model, a margin of uncertainty is introduced into the output.
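A toy sketch (hypothetical, not CliMA code) makes the trade-off concrete: averaging a fine 1 km field into 100 km blocks preserves the large-scale trend, but erases exactly the sub-grid variability that the approximations must then supply.

```python
import numpy as np

# Hypothetical toy illustration, not CliMA code: a fine 1 km field is
# averaged into coarse 100 km blocks, discarding the sub-grid
# variability that real climate models approximate mathematically.

rng = np.random.default_rng(0)

FINE = 100    # 100 fine cells (1 km each) per coarse block edge
BLOCKS = 4    # a tiny 4 x 4 patch of coarse blocks

# Synthetic "temperature" field: a smooth large-scale gradient plus
# small-scale fluctuations standing in for turbulent eddies.
n = FINE * BLOCKS
x = np.linspace(0.0, 1.0, n)
large_scale = 15.0 + 10.0 * x[None, :]           # smooth trend
small_scale = rng.normal(0.0, 2.0, size=(n, n))  # sub-grid "eddies"
field = large_scale + small_scale

# Coarse-grain: collapse each 100 x 100 block to a single mean value.
blocked = field.reshape(BLOCKS, FINE, BLOCKS, FINE)
coarse = blocked.mean(axis=(1, 3))

# The block means keep the large-scale trend...
print(coarse.shape)  # (4, 4)
# ...but the variability inside each block, roughly the 2-degree
# spread we injected, is invisible to the coarse grid.
sub_grid_std = blocked.std(axis=(1, 3))
print(round(float(sub_grid_std.mean()), 1))
```

The coarse grid still sees the west-to-east warming trend, but the per-block standard deviation, which drives processes like heat uptake, has to be reintroduced by a parameterization.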
Small uncertainties can make a significant difference, especially when climate scientists are estimating for policymakers how many years it will take for average global temperature to rise by more than two degrees Celsius. Due to the current levels of uncertainty, researchers project that, at current emission levels, this threshold could be crossed as soon as 2040 — or as late as 2100.
“That’s a huge margin of uncertainty,” said Ferrari. “Anything to reduce that margin can provide a societal benefit estimated in trillions of dollars. If one knows better the likelihood of changes in rainfall patterns, for example, then everyone from civil engineers to farmers can decide what infrastructure and practices they may need to plan for.”
A Deep Dive into Ocean Data
The MIT researchers are focusing on building the ocean elements of CliMA’s new climate model. Covering around 70 percent of the planet’s surface, oceans are a major heat and carbon dioxide reservoir. To make ocean-related climate projections, scientists look at such variables as water temperature, salinity and velocity of ocean currents.
One such dynamic is turbulent streams of water that flow around in the ocean like “a lot of little storms,” Ferrari said. “If you don’t account for all that swirling motion, you strongly underestimate how the ocean is absorbing heat and carbon.”
Using GPUs, researchers can sharpen the resolution of their simulations from 100 kilometers down to one kilometer, dramatically reducing uncertainties. But such simulations are too expensive to incorporate directly into a climate model that projects decades into the future.
That’s where an AI model that learns from fine-resolution ocean and cloud simulations can help.
“Our goal is to run thousands of high-resolution simulations, one for each 100-by-100 kilometer block, that will resolve the small-scale physics presently not captured by climate models,” said Chris Hill, principal research engineer in MIT’s Department of Earth, Atmospheric and Planetary Sciences.
These high-resolution simulations produce abundant synthetic data. That data can be combined with sparser real-world measurements, creating a robust training dataset for an AI model that estimates the impact of small-scale physics like ocean turbulence and cloud patterns on large-scale climate variables.
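As a minimal sketch of that surrogate idea (hypothetical, not CliMA's actual model), one can fit a cheap regression that predicts a sub-grid quantity, such as an eddy heat flux, from coarse-grid variables, using input-output pairs that high-resolution runs would supply:

```python
import numpy as np

# Hypothetical sketch, not CliMA code: learn a cheap surrogate that
# predicts a sub-grid quantity (an "eddy flux") from coarse-grid
# variables. In practice the training targets would come from 1 km
# simulations; here a made-up rule generates them so the example is
# self-contained.

rng = np.random.default_rng(1)

# Each row stands in for one 100 km block: a coarse temperature
# gradient and current speed (inputs), and the sub-grid flux a
# high-resolution run of that block would report (target).
n_blocks = 500
grad_T = rng.uniform(0.0, 5.0, n_blocks)
speed = rng.uniform(0.0, 2.0, n_blocks)
flux = 0.8 * grad_T * speed + rng.normal(0.0, 0.05, n_blocks)

# Fit a linear surrogate on the product feature via least squares.
X = np.column_stack([grad_T * speed, np.ones(n_blocks)])
coef, *_ = np.linalg.lstsq(X, flux, rcond=None)

# The surrogate recovers the underlying coefficient (about 0.8) and
# can now stand in for an expensive high-resolution run per block.
def predict(g, s):
    return coef[0] * g * s + coef[1]

print(round(float(coef[0]), 2))
```

CliMA's actual learned components are far richer than this linear fit, but the workflow is the same shape: expensive fine-scale runs generate training data once, and a fast learned function is evaluated inside every grid block thereafter.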
CliMA researchers can then plug these AI tools into the new climate model software, improving the accuracy of long-term projections.
“We’re betting a lot on GPU technology to provide a boost in compute performance,” Hill said.
In June, MIT hosted a weeklong GPU hackathon, where developers, including Hill’s team as well as research groups from other universities, used the CUDA parallel computing platform and the Julia programming language for projects such as ocean modeling, plasma fusion and astrophysics.
Image by Tiago Fioreze, licensed from Wikimedia Commons under Creative Commons 3.0 license.