Shallow clouds formed by fine-scale eddies, as observed in nature. Researchers are using advanced computing to add higher-resolution cloud dynamics to global simulations. Credit: Creative Commons
We hear a lot about how climate change will change the earth, the sea and the ice. But how will it affect the clouds?
“Low clouds could dry out and shrink like ice sheets,” said Michael Pritchard, a professor of Earth system science at UC Irvine. “Or they could thicken and become more reflective.”
These two scenarios would result in a very different future climate. And that, says Pritchard, is part of the problem.
“If you ask two different climate models what the future will be like when we add a lot more CO2, you get two very different answers. And the key reason for that is the way clouds are included in climate models.”
No one disputes that clouds and aerosols – bits of soot and dust that serve as nuclei for cloud droplets – are an important part of the climate equation. The problem is that these phenomena occur at length and time scales that today’s models cannot come close to resolving. They are therefore included in the models through a variety of approximations.
Analyses of global climate models consistently show that clouds are the biggest source of uncertainty and instability.
Remodeling community codes
While the most advanced global climate model in the United States is struggling to approach a global resolution of 4 kilometers, Pritchard estimates that models need a resolution of at least 100 meters to capture the fine-scale turbulent eddies that form shallow cloud systems – 40 times finer resolution in each direction. By Moore’s law, it could take until 2060 before computing power can capture that level of detail.
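As a rough back-of-envelope check on that timeline (the assumptions here, cost growing as the fourth power of the refinement factor and compute doubling every two years, are illustrative, not Pritchard’s published calculation):

```python
import math

# Refining a ~4 km global grid to ~100 m multiplies resolution 40x in each
# horizontal direction; with a comparably refined time step, cost grows
# roughly as 40^4 (an illustrative scaling assumption).
refinement = 4000 / 100              # 40x per direction
cost_factor = refinement ** 4        # ~2.6 million times more compute

# Moore's law proxy: available compute doubles roughly every 2 years.
doublings = math.log2(cost_factor)   # ~21 doublings needed
years = 2 * doublings                # ~43 years
print(f"{cost_factor:.2e}x compute, ~{years:.0f} years, ~{2022 + years:.0f}")
```

With a slightly faster assumed doubling time of about 1.8 years, the same arithmetic lands near the article’s 2060 estimate.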
Pritchard is working to close this glaring gap by splitting the climate modeling problem into two parts: a coarse-grained, lower-resolution (100 km) planetary model and many small patches with a resolution of 100 to 200 meters. The two simulations run independently and then exchange data every 30 minutes to make sure that neither drifts off track or becomes unrealistic.
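In outline, that coupling cadence looks like the toy sketch below. The stand-in arithmetic is pure illustration, not E3SM/CESM code; only the control flow mirrors the description above:

```python
EXCHANGE_SECONDS = 30 * 60  # the article's 30-minute exchange cadence

def step_planet(planet, seconds):
    """Stand-in for advancing the coarse (~100 km) planetary model."""
    return planet + 0.001 * seconds / 60.0

def step_patch(patch, forcing, seconds):
    """Stand-in for advancing one embedded 100-200 m cloud patch."""
    return 0.9 * patch + 0.1 * forcing

def simulate(planet, patches, n_exchanges):
    for _ in range(n_exchanges):
        planet = step_planet(planet, EXCHANGE_SECONDS)       # runs independently
        patches = [step_patch(p, planet, EXCHANGE_SECONDS)   # runs independently
                   for p in patches]
        # Exchange: relax the coarse state toward the patches' mean feedback
        # so that neither side drifts off track or becomes unrealistic.
        planet = 0.5 * planet + 0.5 * (sum(patches) / len(patches))
    return planet, patches

print(simulate(planet=280.0, patches=[279.0, 281.0], n_exchanges=4))
```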
His team reported the results of this effort in the Journal of Advances in Modeling Earth Systems in April 2022.
This climate simulation method, called the Multiscale Modeling Framework (MMF), has existed since 2000 and has long been an option within the Community Earth System Model (CESM) developed at the National Center for Atmospheric Research. The idea has recently enjoyed a renaissance at the Department of Energy, where researchers from the Energy Exascale Earth System Model (E3SM) are pushing it to new computing frontiers as part of the Exascale Computing Project. Pritchard’s co-author Walter Hannah of Lawrence Livermore National Laboratory is at the forefront of this effort.
“The model handles the hardest problem – simulating the entire planet,” Pritchard explained. “It has thousands of small micromodels that capture things like realistic shallow cloud formation, which only emerges at very high resolution.”
“The Multiscale Modeling Framework approach is also ideal for DOE’s upcoming GPU-based exascale computers,” said Mark Taylor, chief computational scientist for DOE’s Energy Exascale Earth System Model (E3SM) project and a researcher at Sandia National Laboratories. “Each GPU has the horsepower to run hundreds of micromodels while still matching the throughput of the lower-resolution coarse-grained planetary model.”
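The reason MMF maps well to GPUs is that the micromodels are independent, so hundreds of them can be advanced as one batched array operation. Here is a toy NumPy illustration of that batching pattern (vectorized CPU code standing in for a GPU kernel; the diffusion update is a placeholder for real cloud physics):

```python
import numpy as np

# Hundreds of independent column micromodels, batched along the first axis.
n_patches, n_levels = 512, 64
state = np.random.rand(n_patches, n_levels)

def step_all_patches(state, dt=0.1):
    # One vectorized update advances every patch at once; on a GPU this whole
    # batch runs in parallel, which is why MMF suits exascale hardware.
    diffusion = (np.roll(state, 1, axis=1) - 2 * state
                 + np.roll(state, -1, axis=1))
    return state + dt * diffusion

state = step_all_patches(state)
print(state.shape)   # (512, 64): all micromodels advanced together
```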
Pritchard’s research and new approach are made possible in part by the NSF-funded Frontera supercomputer at the Texas Advanced Computing Center (TACC). The world’s fastest university supercomputer, Frontera lets Pritchard run his models at time and length scales accessible on only a handful of systems in the United States and test their potential for cloud modeling.
“We developed a way for a supercomputer to best split up the work of simulating the cloud physics across different parts of the world that deserve different resolutions – so that it runs much faster,” the team wrote.
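One way to picture that division of labor is a greedy load balancer that spreads expensive high-resolution patches and cheap low-resolution ones across processors so none sits idle. The sketch below uses the generic longest-processing-time heuristic with made-up costs; it is not the scheme from the team’s paper:

```python
import heapq

def balance(patch_costs, n_procs):
    """Assign patches to processors, always giving the next-costliest patch
    to the least-loaded processor (classic LPT heuristic)."""
    heap = [(0.0, p) for p in range(n_procs)]   # (accumulated cost, proc id)
    heapq.heapify(heap)
    assignment = {p: [] for p in range(n_procs)}
    for patch, cost in sorted(patch_costs.items(), key=lambda kv: -kv[1]):
        load, proc = heapq.heappop(heap)
        assignment[proc].append(patch)
        heapq.heappush(heap, (load + cost, proc))
    return assignment

# Hypothetical costs: finer-grid tropical patches cost more than polar ones.
costs = {"tropics_1": 8.0, "tropics_2": 8.0, "midlat_1": 2.0,
         "midlat_2": 2.0, "polar_1": 1.0, "polar_2": 1.0}
print(balance(costs, n_procs=2))
```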
Simulating the atmosphere this way gives Pritchard the resolution needed to capture the physical processes and turbulent eddies involved in cloud formation. The researchers showed that the multi-model approach did not produce unwanted side effects, even where patches using different grid structures for cloud resolution meet.
“We were happy to see that the differences were small,” he said. “This will give new flexibility to all climate model users who want to focus high resolution in different places.”
Disconnecting and reconnecting the various scales of the CESM model was one challenge that Pritchard’s team overcame. Another involved reprogramming the model to take advantage of the ever-growing number of processors available on modern supercomputing systems.
Pritchard and his team – UCI postdoctoral scholar Liran Peng and University of Washington researcher Peter Blossey – tackled this by breaking the internal domains of CESM’s embedded cloud models into smaller parts that can be solved in parallel using the Message Passing Interface (MPI) – a standard for exchanging messages between multiple computers running a parallel program across distributed memory – and orchestrating these calculations to use many more processors.
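A minimal mpi4py sketch of that decomposition idea, assuming mpi4py is installed: each rank advances its own slice of an embedded domain, with halo exchanges and real physics omitted. None of this is the team’s actual code:

```python
# Run with: mpiexec -n 4 python decompose.py
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

n_columns = 1024                         # columns in one embedded cloud model
local = np.zeros(n_columns // size)      # this rank's share of the domain

for _ in range(10):
    local += np.random.rand(local.size)  # stand-in for the physics update

# Collect a diagnostic on rank 0, as a host model might gather patch feedback.
totals = comm.gather(local.sum(), root=0)
if rank == 0:
    print("domain-wide sum:", sum(totals))
```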
“It already seems to give a fourfold speedup with great efficiency. That means I can be four times as ambitious with my cloud models,” he said. “I’m really optimistic that this dream of regionalization, and MPI decomposition, is leading to a totally different landscape of what’s possible.”
Machine learning clouds
Pritchard sees another promising approach in machine learning, which his team has been exploring since 2017. “I was very provoked by how efficiently a dumb sheet of neurons can reproduce these partial differential equations,” Pritchard said.
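As a self-contained toy of that observation, the snippet below fits a tiny NumPy neural network to mimic one step of a diffusion equation, a stand-in for the far richer physics a real emulator would learn (illustrative only):

```python
import numpy as np

rng = np.random.default_rng(0)

def diffusion_step(u, nu=0.2):
    """Target process: one explicit step of 1-D diffusion (toy physics)."""
    return u + nu * (np.roll(u, 1) - 2 * u + np.roll(u, -1))

# Training data: random profiles and their one-step evolution.
n, width = 4096, 32
X = rng.normal(size=(n, width))
Y = np.array([diffusion_step(x) for x in X])

# One-hidden-layer network trained with plain full-batch gradient descent.
W1 = rng.normal(scale=0.1, size=(width, 64)); b1 = np.zeros(64)
W2 = rng.normal(scale=0.1, size=(64, width)); b2 = np.zeros(width)
lr = 1e-2
for epoch in range(200):
    H = np.tanh(X @ W1 + b1)              # hidden layer
    P = H @ W2 + b2                       # network's predicted next state
    G = 2 * (P - Y) / n                   # gradient of mean squared error
    W2 -= lr * H.T @ G; b2 -= lr * G.sum(0)
    GH = (G @ W2.T) * (1 - H**2)          # backprop through tanh
    W1 -= lr * X.T @ GH; b1 -= lr * GH.sum(0)

print("final mse:", float(np.mean((P - Y) ** 2)))
```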
In a paper submitted last fall, Pritchard, lead author Tom Beucler of UCI, and others describe a machine learning approach that successfully predicts atmospheric conditions even in climate regimes it was not trained on, where others have struggled to do so.
This “climate-invariant” model incorporates physical knowledge of climate processes into the machine learning algorithms. Their study – which used Stampede2 at TACC, Cheyenne at the National Center for Atmospheric Research, and Expanse at the San Diego Supercomputer Center – showed that the machine learning method can maintain high accuracy across a wide range of climates and geographies.
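One way to picture the climate-invariant idea is as a transformation of raw inputs into quantities whose distributions shift little between climates, for example converting specific humidity into relative humidity before it reaches the network. The sketch below uses the standard Tetens formula; it follows the study’s spirit, not its code:

```python
import numpy as np

def saturation_vapor_pressure(T):
    """Tetens approximation (Pa); T in kelvin. Standard meteorology,
    accurate enough for an illustration."""
    return 610.78 * np.exp(17.27 * (T - 273.15) / (T - 35.85))

def specific_to_relative_humidity(q, T, p):
    """Map specific humidity q (kg/kg) at temperature T (K) and pressure
    p (Pa) to relative humidity, a more climate-invariant network input."""
    e_s = saturation_vapor_pressure(T)
    q_sat = 0.622 * e_s / (p - 0.378 * e_s)
    return q / q_sat

# The same q means different things in a warmer climate; expressing it as
# relative humidity puts both climates on a shared, bounded scale.
T_cold, T_warm, q, p = 288.0, 292.0, 0.008, 101325.0
print(specific_to_relative_humidity(q, T_cold, p),
      specific_to_relative_humidity(q, T_warm, p))
```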
“If machine learning high-resolution cloud physics ever succeeded, it would change everything about how we do climate simulations,” Pritchard said. “I’m interested in how a machine learning approach can succeed in complex settings reproducibly and reliably.”
Pritchard is well positioned to do so. He is on the executive committee of the NSF Center for Learning the Earth with Artificial Intelligence and Physics, or LEAP – a new science and technology center funded by NSF in 2021 and led by his longtime collaborator, Professor Pierre Gentine. LEAP brings together climate and data scientists to narrow the range of uncertainty in climate modeling and deliver more accurate and actionable climate projections with immediate societal impact.
“All the research I’ve done before is what I would call ‘throughput-limited,’” Pritchard said. “My job was to produce 10- to 100-year simulations. That constrained all my grid choices. But if the goal is to produce short simulations to train machine learning models, that’s a different landscape.”
Pritchard hopes to soon use the results of his 50-meter embedded models to begin building a large training library. “It’s a really nice machine learning dataset.”
But will AI mature fast enough? Time is of the essence in determining the fate of the clouds.
“If those clouds shrink away, as ice sheets do, and expose darker surfaces, that will amplify global warming and all the hazards that come with it. But if they do the opposite of ice sheets and thicken, which they could, that’s less dangerous. Some have estimated this to be a trillion-dollar issue for society. And it has been in question for a long time,” Pritchard said.
Simulation after simulation, federally funded supercomputers help Pritchard and others find the answer to this critical question.
“I’m torn between genuine gratitude for the U.S. national computing infrastructure, which is so incredible at helping us develop and run climate models,” Pritchard said, “and feeling that we need a Manhattan Project level of new federal funding and interagency coordination to actually solve this problem.”
Liran Peng et al., Load-Balancing Intense Physics Calculations to Embed Regionalized High-Resolution Cloud Resolving Models in the E3SM and CESM Climate Models, Journal of Advances in Modeling Earth Systems (2022). DOI: 10.1029/2021MS002841
Provided by Texas Advanced Computing Center
Citation: A cloudless future? A mystery at the heart of climate forecasts (2022, May 31), retrieved May 31, 2022 from https://phys.org/news/2022-05-cloudless-future-mystery-heart-climate.html
This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without written permission. The content is provided for information purposes only.