The Insane Amount of Computing Power Needed to Create Earth's Digital Twin

If the current weather in Texas is any indication, future climate change effects will be catastrophic. From 2010 to 2019, extreme weather events across the globe cost the economy nearly $3 trillion. To counteract this, the European Union has set the goal of becoming climate neutral by 2050.

The initiative, Destination Earth, will begin in a few months and is expected to last for a decade. A key component of the program is the creation of a “digital twin” of Earth, which will see climate scientists and computer scientists collaborate at an unprecedented level. The digital twin is also expected to raise the R&D stakes for software, hardware and supercomputing.

Digital twins are typically created for industrial production and space technology. Their goal is to optimize the design and operation of complex processes through a highly interconnected workflow that combines a digital replica of the process with real-time observations. The Earth digital twin, for example, will combine weather and climate forecasting simulations with all relevant human activities, including human influence on water, food and energy management.

The concept is easier to explain than to execute, though: scientists acknowledge it will take an extraordinary amount of computing power and an upgrade of existing computational infrastructure.

That, however, is not to say it cannot be accomplished. A team of scientists has recently taken the first step by authoring a roadmap of sorts, published in the February issue of Nature Computational Science.

Their preliminary experiments showed that very high-resolution simulations are not yet possible even on the fastest supercomputers in the world. The researchers therefore propose a computing infrastructure that addresses three main questions: (1) what are the digital twin's requirements, (2) what is the most effective and sustainable software ecosystem, and (3) what technology and machine size could run digital twins in the near future?

Application-wise, the scientists say the digital twin would use a very high-resolution, coupled Earth-system model ensemble boosted by machine learning. For example, machine learning can speed up simulations and extract the most important information from large amounts of data, increasing efficiency and reducing processing time and power.
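To make that concrete, here is a minimal sketch, written in Python with scikit-learn, of the emulation idea: a small neural network is trained to stand in for an expensive model component and is then called in its place during the simulation. The expensive_parameterization routine and its inputs are toy placeholders, not anything taken from the Destination Earth design.

```python
# Minimal sketch of a machine-learning surrogate (emulator) for an expensive
# model component. All names and sizes here are hypothetical toy values.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

def expensive_parameterization(x):
    # Placeholder for a costly physics routine (e.g., a radiation or cloud scheme).
    return np.sin(3 * x[:, 0]) * np.exp(-x[:, 1] ** 2) + 0.1 * x[:, 2]

# Generate training data by running the expensive routine offline.
X_train = rng.uniform(-1, 1, size=(5000, 3))
y_train = expensive_parameterization(X_train)

# Fit a small emulator; at scale this would be a much larger network trained on GPUs.
emulator = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=500, random_state=0)
emulator.fit(X_train, y_train)

# Inside the simulation loop, the cheap emulator replaces the expensive call.
X_new = rng.uniform(-1, 1, size=(4, 3))
print("emulated: ", emulator.predict(X_new))
print("reference:", expensive_parameterization(X_new))
```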

The software ecosystem must be flexible and built on algorithms co-developed with the associated hardware. The authors emphasize this point, as co-developing these tools will allow the infrastructure to adapt to future advances.

“Computing hardware and software advance on vastly different time scales. The lifetime of software can be decades while high-performance hardware is usually used for less than five years. The proposed algorithmic and software investments should therefore provide utmost flexibility and openness to new, fast evolving technology,” the authors explain in their paper.

The ideal machine for Earth’s digital twin will use the latest, smallest silicon fabrication processes in order to be competitive in terms of energy consumption and performance. It will also run on GPUs (graphics processing units) rather than traditional CPUs (central processing units), because GPUs can perform mathematically intensive computations on very large data sets. GPU-CPU combinations are currently the standard for supercomputers, but the researchers say the digital twin's architecture will need to lean far more heavily on GPUs.
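As a rough illustration of why GPUs suit this kind of workload, the sketch below times the same dense matrix multiplication on a CPU with NumPy and on a GPU with CuPy. It assumes an NVIDIA GPU and the cupy package are available; the matrix size is an arbitrary toy value, not a figure from the paper.

```python
# Illustrative only: the same dense linear algebra on CPU (NumPy) and GPU (CuPy).
import time
import numpy as np
import cupy as cp

n = 4096  # arbitrary toy size
a_cpu = np.random.rand(n, n).astype(np.float32)
b_cpu = np.random.rand(n, n).astype(np.float32)

t0 = time.perf_counter()
c_cpu = a_cpu @ b_cpu
cpu_time = time.perf_counter() - t0

# Copy the data to the GPU and repeat the multiplication there.
a_gpu = cp.asarray(a_cpu)
b_gpu = cp.asarray(b_cpu)
cp.cuda.Stream.null.synchronize()

t0 = time.perf_counter()
c_gpu = a_gpu @ b_gpu
cp.cuda.Stream.null.synchronize()  # wait for the GPU kernel to finish before timing
gpu_time = time.perf_counter() - t0

print(f"CPU matmul: {cpu_time:.3f} s, GPU matmul: {gpu_time:.3f} s")
```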

Running one-year-per-day simulations will require a system with about 20,000 GPUs consuming a total of 20 megawatts. For comparison's sake, the Piz Daint supercomputer, the most powerful in Europe, is equipped with a 2,496-core GPU and consumes 2 megawatts. For both economic and environmental reasons, the researchers say such a computer will need to be operated at a location with an adequate supply of CO2-neutral electricity.
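As a back-of-envelope check on those figures (using only the numbers quoted above, not anything further from the paper), the power budget works out to roughly one kilowatt per GPU and on the order of 175 gigawatt-hours per year of continuous operation:

```python
# Back-of-envelope arithmetic from the quoted figures; not from the paper itself.
gpus = 20_000
total_power_mw = 20

power_per_gpu_kw = total_power_mw * 1_000 / gpus      # 20,000 kW / 20,000 GPUs ≈ 1 kW per GPU
annual_energy_gwh = total_power_mw * 8_760 / 1_000    # MW × hours per year → GWh

print(f"~{power_per_gpu_kw:.1f} kW per GPU, ~{annual_energy_gwh:.0f} GWh per year of continuous operation")
```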

While the Destination Earth initiative is a 10-year program, the authors encourage action now, before it becomes too late to reach the necessary level of innovation.

“The societal challenges arising from climate change require a step-change in predictive skill that will not be reachable with incremental enhancements,” they write. “The time is ripe for making substantial investments at the interface between Earth-system and computational science.”