This has been quite the year for climate science. Extreme weather events made headlines year-round, including exceptional heatwaves, floods, and fires driven by droughts. Two leading climate scientists, Syukuro Manabe and Klaus Hasselmann, shared the 2021 Nobel Prize in Physics. World leaders finally came together in Glasgow for the COP26 meeting. There is little doubt that human-induced climate change is an imminent threat, and the pressure to act is mounting on government and industry.
And yet all our plans to mitigate and adapt have one glaring omission: they are not based on the best possible scientific predictions. This is not because we don’t understand the physics of climate change. Instead, the major obstacle is, rather embarrassingly, a lack of human and computing resources. Today’s climate models are developed and maintained by scientists at national institutions that do not have the budgets to develop these models to their full potential, nor computers powerful enough to run the models at high resolution. As a result, current-generation climate models suffer from significant shortcomings and uncertainties.
For example, a key question for mitigation policy is how quickly emissions need to be cut. Relying on negative emissions later in the century will be completely ineffective if we have passed one or more climatic tipping points, for example, if major ocean circulations have shut down irreversibly. Yet we cannot currently tell how high the risk of passing such tipping points is, because current climate models do not properly represent them.
It is similar for climate adaptation: current models just aren’t good enough. None of the extreme events of 2021 can be simulated in current climate models, because the events were simply too extreme for the models. How can a country prioritize its spending without knowing which is the more pressing threat: increased flooding and storms, or increased heat waves and drought? And yet, for most of the world, current climate models do not even agree on whether rainfall will increase or decrease as a result of climate change, never mind how large the change in rainfall will be.
To overcome these limitations, we advocate here the establishment of a federated international institute for climate prediction, much like CERN, the multinational collaborative particle physics laboratory. The institute would comprise several hubs in different countries, each with dedicated exaflop (one billion billion calculations per second) supercomputing facilities.
The centerpiece of the institute would be the creation of a small number of ultrahigh-resolution climate models. In this way, scientists around the world could work collaboratively to develop a multimodel ensemble prediction system with unprecedented spatial and temporal resolution. With more accurate predictions, we would know better how to adapt to and mitigate the effects of climate change. (We would also be able to make much more effective use of observational climate data.) Indeed, another exceptionally successful international institute is the European Centre for Medium-Range Weather Forecasts, which produces the most skillful weather predictions in the world, out to about two weeks ahead.
A federated international institute for climate change would develop global climate models with a horizontal resolution of about one kilometer (similar to that now used for weather forecasts for one or two days ahead), compared to the current capability of around 100 km. A federated institute comprising, say, six hubs around the world may cost in total little more than $1.5 billion to $2 billion a year, shared amongst nations of the world. That’s a small amount for predictions that are necessary to guide the investment of sums that will easily go into the trillions in the coming decades.
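Why does a hundredfold increase in resolution demand exascale computing? A rough back-of-envelope sketch (our simplification, not a figure from any modeling center) follows from two assumptions: compute cost grows with the number of horizontal grid columns, and numerical stability forces the time step to shrink in proportion to the grid spacing:

```python
# Back-of-envelope estimate of how a climate model's computational cost
# grows as horizontal grid spacing shrinks. Assumptions (illustrative):
#  * cost scales with the number of grid columns, i.e. inversely with
#    the square of the grid spacing (a 2-D horizontal grid), and
#  * the CFL stability condition forces the time step to shrink in
#    proportion to the grid spacing, multiplying the number of steps.

def relative_cost(coarse_km: float, fine_km: float) -> float:
    """Cost multiplier when refining from coarse_km to fine_km spacing."""
    refinement = coarse_km / fine_km
    columns = refinement ** 2   # more, smaller grid cells
    timesteps = refinement      # shorter time steps for stability
    return columns * timesteps

# Going from today's ~100 km grids to a 1 km grid:
print(f"{relative_cost(100.0, 1.0):.0e}")  # 1e+06
```

Under these assumptions, kilometer-scale resolution costs roughly a million times more computation than a 100-km model, which is why dedicated exaflop machines, not incremental upgrades to national facilities, are needed.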
One of us (T.P.) has been pressing this issue of improving our predictive power for more than a decade now, yet we are still stuck with models that underperform simply due to computational limits. Why is this?
Some scientists feel that the diversity of national models is both necessary and sufficient to adequately quantify climate risk and that by focusing on a small number of high-resolution models we risk diluting this diversity.
However, current state-of-the-art climate models all have approximately the same resolution, a grid spacing around 100 km. Anything below that scale—be that ocean eddies, clouds or the effects of individual mountains—is described by highly simplified deterministic formulas known as “subgrid parametrizations.”
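To make the idea of a subgrid parametrization concrete, here is a toy sketch in the spirit of classic relative-humidity cloud schemes; the threshold and functional form are illustrative assumptions, not any operational model's actual code:

```python
# Toy deterministic subgrid parametrization: diagnose the cloud cover
# of a ~100 km grid cell from its grid-mean relative humidity.
# The critical threshold and linear ramp below are illustrative only.

def cloud_fraction(relative_humidity: float, rh_crit: float = 0.8) -> float:
    """Fraction of the grid cell covered by cloud, in [0, 1]."""
    if relative_humidity <= rh_crit:
        return 0.0
    # Ramp linearly from no cloud at rh_crit to full cover at saturation.
    frac = (relative_humidity - rh_crit) / (1.0 - rh_crit)
    return min(frac, 1.0)

# A deterministic formula: the same grid-mean input always yields
# exactly the same subgrid cloud cover, with no representation of the
# many different cloud fields that could share that mean humidity.
print(cloud_fraction(0.9))
```

The point of the sketch is the limitation, not the formula: whatever clouds actually form inside the cell, the model sees only one number derived from the cell mean.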
But the theory of our climate—a nonlinear, multiscale turbulent system—tells us that the very concept of parametrization is itself a source of error. The diversity of current models therefore does not—cannot—represent uncertainty, given the structural error associated with the assumption that subgrid processes can be parametrized in the first place.
A model with a resolution on the kilometer scale, as could be developed at an international institute, would not entirely eliminate this problem of structural error, but it would significantly alleviate it. This is because at the kilometer scale, key processes such as convective clouds can be represented more directly by the laws of physics. On top of this, neither the effects of Earth’s topography nor the mesoscale eddies that help shape ocean circulations such as the Gulf Stream would need to be parametrized. For processes that still need to be parametrized at one-kilometer resolution, such as cloud microphysics and boundary-layer turbulence, stochastic parametrizations, which have proved to work well in operational weather prediction, can be used.
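The essence of a stochastic parametrization can be sketched in a few lines: rather than one fixed formula, each ensemble member applies a random perturbation to the parametrized tendency, so ensemble spread reflects uncertainty in the unresolved process. All names, numbers, and the toy relaxation scheme below are illustrative assumptions, not any operational scheme:

```python
import random

# Minimal illustration of a stochastically perturbed parametrization.
# A deterministic subgrid tendency is multiplied by (1 + r), with r
# drawn at random for each ensemble member, so the ensemble samples
# the uncertainty in the unresolved process. Everything here is a toy.

def deterministic_tendency(temperature_k: float) -> float:
    """A toy subgrid heating rate (K/hour) relaxing toward 288 K."""
    return 0.1 * (288.0 - temperature_k)

def perturbed_tendency(temperature_k: float, rng: random.Random,
                       amplitude: float = 0.5) -> float:
    """Multiply the deterministic tendency by (1 + r), r ~ U(-a, a)."""
    r = rng.uniform(-amplitude, amplitude)
    return (1.0 + r) * deterministic_tendency(temperature_k)

rng = random.Random(0)  # seeded so the toy example is reproducible
ensemble = [perturbed_tendency(280.0, rng) for _ in range(5)]
print(ensemble)  # five distinct tendencies scattered around 0.8
```

Each member's tendency differs, and the spread of the resulting forecasts is itself a measure of how much the unresolved physics matters, which is precisely the information a deterministic formula cannot provide.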
The Destination Earth project, funded by the E.U. Green Deal, will shortly be undertaking important work in developing a prototype kilometer-scale climate model. Other projects to develop kilometer-scale models are beginning around the world. It is now time to plan for a federated international institute for climate-change prediction that could put these developments to best use. Just as CERN allows high-energy physicists to do experiments that no single nation is able or willing to do, the international climate institute would allow us to make the type of coordinated ensemble prediction that is currently impossible as a result of human and computing constraints.
With output from this ensemble linked to economic, agronomic, health, hydrologic and other impact models, this institute could become the go-to place for scientists of other disciplines, government, industry, science communicators and the general public to look for the best and most up-to-date information. It would not make national climate models and institutions obsolete; CERN has not eradicated national particle physics institutes. Rather, it would merge the insights of national institutes and allow them to have maximum impact.
Climate change is a global problem that requires a global solution. It is time that climate scientists from around the world join forces and come together in an international initiative for climate modeling.