Articles | Volume 18, issue 20
https://doi.org/10.5194/gmd-18-7735-2025
Development and technical paper | Highlight paper | 23 Oct 2025

nextGEMS: entering the era of kilometer-scale Earth system modeling

Hans Segura, Xabier Pedruzo-Bagazgoitia, Philipp Weiss, Sebastian K. Müller, Thomas Rackow, Junhong Lee, Edgar Dolores-Tesillos, Imme Benedict, Matthias Aengenheyster, Razvan Aguridan, Gabriele Arduini, Alexander J. Baker, Jiawei Bao, Swantje Bastin, Eulàlia Baulenas, Tobias Becker, Sebastian Beyer, Hendryk Bockelmann, Nils Brüggemann, Lukas Brunner, Suvarchal K. Cheedela, Sushant Das, Jasper Denissen, Ian Dragaud, Piotr Dziekan, Madeleine Ekblom, Jan Frederik Engels, Monika Esch, Richard Forbes, Claudia Frauen, Lilli Freischem, Diego García-Maroto, Philipp Geier, Paul Gierz, Álvaro González-Cervera, Katherine Grayson, Matthew Griffith, Oliver Gutjahr, Helmuth Haak, Ioan Hadade, Kerstin Haslehner, Shabeh ul Hasson, Jan Hegewald, Lukas Kluft, Aleksei Koldunov, Nikolay Koldunov, Tobias Kölling, Shunya Koseki, Sergey Kosukhin, Josh Kousal, Peter Kuma, Arjun U. Kumar, Rumeng Li, Nicolas Maury, Maximilian Meindl, Sebastian Milinski, Kristian Mogensen, Bimochan Niraula, Jakub Nowak, Divya Sri Praturi, Ulrike Proske, Dian Putrasahan, René Redler, David Santuy, Domokos Sármány, Reiner Schnur, Patrick Scholz, Dmitry Sidorenko, Dorian Spät, Birgit Sützl, Daisuke Takasuka, Adrian Tompkins, Alejandro Uribe, Mirco Valentini, Menno Veerman, Aiko Voigt, Sarah Warnau, Fabian Wachsmann, Marta Wacławczyk, Nils Wedi, Karl-Hermann Wieners, Jonathan Wille, Marius Winkler, Yuting Wu, Florian Ziemen, Janos Zimmermann, Frida A.-M. Bender, Dragana Bojovic, Sandrine Bony, Simona Bordoni, Patrice Brehmer, Marcus Dengler, Emanuel Dutra, Saliou Faye, Erich Fischer, Chiel van Heerwaarden, Cathy Hohenegger, Heikki Järvinen, Markus Jochum, Thomas Jung, Johann H. Jungclaus, Noel S. Keenlyside, Daniel Klocke, Heike Konow, Martina Klose, Szymon Malinowski, Olivia Martius, Thorsten Mauritsen, Juan Pedro Mellado, Theresa Mieslinger, Elsa Mohino, Hanna Pawłowska, Karsten Peters-von Gehlen, Abdoulaye Sarré, Pajam Sobhani, Philip Stier, Lauri Tuppi, Pier Luigi Vidale, Irina Sandu, and Bjorn Stevens
Abstract

The Next Generation of Earth Modeling Systems (nextGEMS) project aimed to produce multidecadal climate simulations, for the first time, with resolved kilometer-scale (km-scale) processes in the ocean, land, and atmosphere. In only 3 years, nextGEMS achieved this milestone with the two km-scale Earth system models, ICOsahedral Non-hydrostatic model (ICON) and Integrated Forecasting System coupled to the Finite-volumE Sea ice-Ocean Model (IFS-FESOM). nextGEMS was based on three cornerstones: (1) developing km-scale Earth system models with small errors in the energy and water balance, (2) performing km-scale climate simulations with a throughput greater than 1 simulated year per day, and (3) facilitating new workflows for an efficient analysis of the large simulations with common data structures and output variables. These cornerstones shaped the timeline of nextGEMS, divided into four cycles. Each cycle marked the release of a new configuration of ICON and IFS-FESOM, which were evaluated at hackathons. The hackathon participants included experts from climate science, software engineering, and high-performance computing as well as users from the energy and agricultural sectors. The continuous efforts over the four cycles allowed us to produce 30-year simulations with ICON and IFS-FESOM, spanning the period 2020–2049 under the SSP3-7.0 scenario. The throughput was about 500 simulated days per day on the Levante supercomputer of the German Climate Computing Center (DKRZ). The simulations employed a horizontal grid of about 5 km resolution in the ocean and 10 km resolution in the atmosphere and land. Aside from this technical achievement, the simulations allowed us to gain new insights into the realism of ICON and IFS-FESOM. Beyond its time frame, nextGEMS builds the foundation of the Climate Change Adaptation Digital Twin developed in the Destination Earth initiative and paves the way for future European research on climate change.

1 Introduction

The advent of exascale supercomputers and progress in numerical modeling open the door to a new way of simulating our Earth system (Schär et al.2020; Slingo et al.2022). Several international initiatives aim to represent kilometer-scale (km-scale) processes explicitly using horizontal grid spacings equal to or finer than 10 km globally in the atmosphere (e.g., Satoh et al.2008), land (e.g., Kollet and Maxwell2006), and ocean (e.g., Maltrud and McClean2005). Such models or simulations are referred to as “km-scale models” or “km-scale simulations” in this paper. Representing km-scale processes explicitly makes it possible to simulate more accurately the horizontal and vertical transfer of mass and energy and the circulation that it entails. Naturally, the finer the horizontal grid spacing, the better km-scale processes are represented. In practice, km-scale models bring climate simulations to a level of granularity that has long been proven necessary to understand regional climate impacts and support climate adaptation and mitigation.

Deep convective motions in the atmosphere, for example, redistribute moisture and energy, influencing the tropical vertical temperature profile and the global hydrological cycle (Kuang2010; Prein et al.2017; Tian and Kuang2019; Bao et al.2024). Km-scale atmospheric simulations, also referred to as storm-resolving simulations, represent deep convection, capturing most of the characteristics of mesoscale convective systems (Peters et al.2019; Prein et al.2020; Becker et al.2021), convectively coupled equatorial waves (Miura et al.2007; Holloway et al.2013; Senf et al.2018), and tropical cyclones (Gentry and Lackmann2010; Judt et al.2021; Baker et al.2024). Km-scale simulations also provide a more detailed representation of the land in terms of surface topography and heterogeneity and its impact on local-, regional-, and large-scale circulations (Sandu et al.2019). Previous studies showed that such simulations improve the representation of atmospheric blockings (Woollings et al.2018; Schiemann et al.2020), weaken the soil moisture–precipitation feedback (Lee and Hohenegger2024), and impact the circulations generated by surface–radiation interactions such as land–sea breezes (Birch et al.2015). Km-scale oceanic simulations, also referred to as eddy-resolving simulations, represent mesoscale eddies (Hewitt et al.2022), increasing the global kinetic energy (Chassignet et al.2020) and vertical transport of heat (Griffies et al.2015). Previous studies showed that the higher resolution also impacts the internal variability of the ocean (Penduff et al.2010), the timing of Antarctic sea ice decreases, and the magnitude of the projected sea level rises (van Westen and Dijkstra2021; Rackow et al.2022).

The early insights gained from storm- and eddy-resolving simulations were a strong motivation for the Next Generation of Earth Modeling Systems (nextGEMS) project funded by the European Horizon 2020 program. nextGEMS aimed to build the next generation of km-scale Earth system models, namely the ICOsahedral Non-hydrostatic model (ICON; Hohenegger et al.2023) and the Integrated Forecasting System (IFS; Rackow et al.2025b). The latter can be run together with either the Finite-VolumE Sea ice–Ocean Model (FESOM; Scholz et al.2019), used in this project, or the Nucleus for European Modelling of the Ocean model (NEMO; Madec and the NEMO System Team2023).

While ICON and IFS-FESOM both represent km-scale processes by using horizontal grid spacings of 10 km or finer, their history and philosophy differ. ICON relies on a simple framework and aims to minimize the use of parameterization schemes. In the atmosphere, it parameterizes only small-scale processes related to turbulence, cloud microphysics, and radiation and does not employ any scheme for convection (Hohenegger et al.2023). The atmospheric component of IFS-FESOM, on the other hand, operates as a weather model and incorporates years of model tuning to obtain accurate forecasts. In the atmosphere, it employs sophisticated parameterization schemes for various processes including convection (ECMWF2023b; Rackow et al.2025b). The two approaches of ICON and IFS-FESOM are complementary and allow us to examine whether explicitly representing km-scale processes with horizontal grid spacings of 10 km or finer is enough to capture the main features of our climate and to what extent the simulation quality depends on the ability to fine-tune the remaining small scales.

nextGEMS had the visionary goal to produce and analyze, for the first time, multidecadal km-scale climate simulations with coupled atmosphere, land, and ocean under a Shared Socioeconomic Pathway (SSP) scenario. To facilitate the model development, nextGEMS explored new ways to foster the collaboration of project participants with expertise in software engineering, climate modeling, and climate physics. The timeline of nextGEMS was divided into four cycles. Each cycle marked the release of new simulations with ICON and IFS-FESOM, which were prepared for and analyzed at large hackathons with over 100 participants. Two main problems had to be solved to produce multidecadal km-scale simulations. First, we had to improve the computational throughput and simplify the analysis of simulation data. Second, we had to achieve simulations with an energetically consistent climate, i.e., a near-stationary climate with a top-of-atmosphere (TOA) energy balance close to zero, global conservation of mass and energy, and no near-surface temperature drift.

nextGEMS achieved this visionary goal in only 3 years. In Cycle 4, we produced km-scale simulations of our climate system over multiple decades with a competitive throughput of about 500 simulated days per day (SDPD). Here, we document these simulations and review the evolution of both models, ICON and IFS-FESOM. Throughout the project, we gained knowledge of technical and scientific aspects and encountered positive and negative surprises when analyzing the simulation data. To create a storyline, we translated that learning process into four questions:

  1. Radiation balance (Sect. 4.1). Can we adjust the radiative properties and formation mechanisms of clouds in order to correct the global radiation balance and limit the drifts in the global surface temperature?

  2. Key features of mean climate (Sect. 4.2). Are key features of the mean climate correctly represented in a km-scale simulation with an energetically consistent climate?

  3. Local- to synoptic-scale phenomena (Sect. 4.3). Does an energetically consistent climate constrain the patterns of local-, meso-, and synoptic-scale phenomena?

  4. Timescales of regional patterns (Sect. 4.4). Over what simulation times do regional patterns emerge?

While climate simulations with a horizontal grid spacing of 10 km have been analyzed for this paper, nextGEMS also performed simulations with horizontal grid spacings of 2.8 and 5 km integrated over shorter time periods. The lessons learned in nextGEMS are being transferred to other projects that conduct climate simulations with horizontal grid spacings as fine as 1.25 km. In that sense, nextGEMS is only the first step in a new era of km-scale climate simulations.

This paper is structured as follows. In Sect. 2, we provide an overview of the project concept and models, ICON and IFS-FESOM. In Sect. 3, we report the main changes made in each model over the four cycles. In Sect. 4, we discuss the four questions stated above and examine the realism of our km-scale simulations. In Sects. 5 and 6, we summarize the development of the past 3 years and provide an outlook for the future of nextGEMS.

2 nextGEMS concept and models

2.1 Concept

nextGEMS followed an innovative approach to create an integrated community of experts from different domains, including climate science, climate modeling, and high-performance computing (HPC). The model development was structured into four cycles from October 2021 to March 2024. The cycles had an average length of about 8 months and facilitated the fast-paced evolution of the two models, ICON and IFS-FESOM. Each cycle was marked by a coordinated set of simulations with both models and by a hackathon, where more than 100 project participants came together for a week of interactive coding, debugging, plotting, and technical and scientific discussions. The hackathons turned the model development into a collaborative endeavor, with close interaction between the modeling centers and their partner institutions in 14 countries. The hackathons were also used as opportunities to engage with industry stakeholders and show the potential of km-scale climate simulations for applications such as wind and solar energy or agriculture and fisheries. The interaction patterns between participants were analyzed across hackathons, revealing an increase in inter-institutional cooperation and cross-disciplinary exchange over the 3 years. The outcome and feedback from the hackathons were also critical in the preparation of the next cycle, always aiming to produce more scientifically sound and computationally efficient simulations over longer periods. Figure 1 shows the timeline of the whole project from Cycles 1 to 4. The simulation times evolved from a few weeks with throughputs of about 10 SDPD to 3 decades with throughputs of about 500 SDPD.


Figure 1. Timeline of nextGEMS from Cycles 1 to 4. The vertical axis shows the development cycles, including hackathons and milestones, and the horizontal axis shows the simulated years, including throughputs in simulated days per day (SDPD).


The simulations were produced and stored on the HPC system Levante (HLRE-4 Levante2024). The participants of hackathons worked directly on Levante instead of copying large amounts of simulation data to individual systems. We took several steps to make the data analysis user-friendly and model-agnostic. Most simulation data were provided via catalogs of Intake, a Python package for searching and loading data (https://github.com/intake/intake, last access: 2 October 2025). From Cycle 2 onwards, the simulations were published on the World Data Center for Climate (WDCC). From Cycle 3 onwards, the simulations were unified into a single Intake catalog (https://data.nextgems-h2020.eu/catalog.yaml, last access: 2 October 2025). The output of ICON was stored as Zarr datasets (https://github.com/zarr-developers/zarr-python, last access: 2 October 2025), and the output of IFS-FESOM was indexed using gribscan (Kölling et al.2024), a tool to scan GRIdded Binary (GRIB) files and create Zarr-compatible indices such that users can access the underlying GRIB files as Zarr datasets. In addition, example notebooks for an initial analysis were provided via the easy.gems website (https://easy.gems.dkrz.de, last access: 2 October 2025, and https://github.com/nextGEMS, last access: 2 October 2025).
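
For illustration, the following minimal Python sketch opens the unified Intake catalog named above and lists its entries; the catalog URL is taken from the text, whereas the entry name in the commented line is purely illustrative and may differ from the actual catalog layout.

    # Minimal sketch of the data-access workflow described above (illustrative).
    import intake

    # Unified catalog introduced in Cycle 3 (URL as given in the text)
    cat = intake.open_catalog("https://data.nextgems-h2020.eu/catalog.yaml")
    print(list(cat))  # browse the available entries

    # Hypothetical entry name; actual names depend on the catalog layout
    # ds = cat["ICON"].to_dask()  # would return an xarray.Dataset backed by Zarr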

The two models developed in nextGEMS, ICON and IFS-FESOM, can simulate the atmosphere, ocean and sea ice, and land and their interactions at km scales. Moreover, they can include additional components of the Earth system, such as the carbon cycle (and aerosols). For this reason, we refer to ICON and IFS-FESOM as km-scale Earth system models, even though the simulations presented here were conducted without any carbon cycle (or aerosol) module. Figure 2 shows an overview of how these components and their interactions are represented in each model. The two models are complementary. While both models simulate the Earth system, ICON can be characterized as a research model with a minimal set of parameterization schemes, whereas IFS-FESOM can be characterized as an operational model with elaborate parameterization schemes and tuning methods. The key features of both models are described in the next two sections and the developments over the four cycles are summarized in Sect. 3 and Appendix A.


Figure 2. Overview of the Earth system models ICON and IFS-FESOM. Differences between the two models are highlighted in red.


2.2 Models

2.2.1 ICON

The ICOsahedral Non-hydrostatic model (ICON) is developed by the ICON Partnership. A detailed first description of the km-scale version was presented by Hohenegger et al. (2023). In ICON, the Earth system is divided into three coupled components: atmosphere, land, and ocean. All three components are discretized with an icosahedral-triangular C grid (Zängl et al.2015).

The atmosphere is modeled with non-hydrostatic equations (Zängl et al.2015). It includes tendencies from parameterized processes: turbulence (Dipankar et al.2015; Lee et al.2022), radiation (Pincus et al.2019), and cloud microphysics (Baldauf et al.2011). It is discretized with terrain-following levels in the vertical (Leuenberger et al.2010). In addition, it includes the one-moment aerosol module HAM-lite developed in nextGEMS (Weiss et al.2025). HAM-lite represents aerosols as an ensemble of log-normal modes with prescribed sizes and compositions to reduce the computational costs. The aerosol modes are transported as prognostic tracers and are coupled with the parameterized processes mentioned above. The land is represented with the land surface model Jena Scheme for Biosphere-Atmosphere Coupling in Hamburg (JSBACH; Reick et al.2021) and interacts with the atmosphere via surface fluxes and with the ocean via hydrological discharge. It is discretized with multiple soil layers (Reick et al.2021). The ocean is modeled with hydrostatic Boussinesq equations (Korn et al.2022) and interacts with the atmosphere via the Yet Another Coupler (YAC; Hanke et al.2016). It is discretized with variable levels in the vertical that follow the movement of the ocean surface (Korn et al.2022). The ocean dynamics are spun up as described in Hohenegger et al. (2023) for Cycle 1 and as described in Appendix A for later cycles. Lastly, the sea ice dynamics are modeled based on elastic–viscous–plastic rheology (Danilov et al.2015).

2.2.2 IFS-FESOM

The Integrated Forecasting System (IFS) is developed and maintained by the European Centre for Medium-Range Weather Forecasts (ECMWF) in collaboration with institutions in its member and cooperating states. A detailed description of the model can be found in the official documentation (ECMWF2023a, b). The IFS is centered around the atmosphere coupled with other components of the Earth system such as land, ocean, and sea ice. The atmosphere and land are discretized with an octahedral-reduced Gaussian (TCo) grid (Malardel and Wedi2016). The ocean is discretized for nextGEMS with either a triangulated 0.25° grid (tORCA025) or an eddy-resolving (NG5) grid (Rackow et al.2025b).

The atmosphere is modeled with hydrostatic equations and includes parameterization schemes for radiation, turbulence, turbulent orographic form drag, subgrid-scale orographic drag, non-orographic wave drag, convection, and cloud microphysics including a prognostic cloud cover (ECMWF2023b). It is discretized with hybrid levels in the vertical, transitioning from terrain-following coordinates at lower levels to pressure-level coordinates at higher levels (Simmons and Strüfing1983). The land is represented with the land surface model ECLand (Boussetta et al.2021) and is discretized with four soil layers. In its current configuration, ECLand computes local runoff, but it does not route the runoff nor provide it as freshwater input into the ocean. Instead, the hydrological discharge is prescribed for all coastal points. In Cycle 1, the ocean is represented with the FESOM2.1 model (Wang et al.2014; Scholz et al.2019), developed and maintained by the Alfred Wegener Institute, Helmholtz Centre for Polar and Marine Research (AWI). It is discretized with the tORCA025 grid using a parameterization for mesoscale eddies (Gent and Mcwilliams1990; Gent et al.1995). In the later cycles, the ocean is updated to FESOM2.5 and is discretized with the NG5 grid using an eddy-resolving resolution of 5 km (Rackow et al.2025b). In all of the cycles, the vertical dimension is discretized with arbitrary Lagrangian–Eulerian coordinates. The ocean dynamics are spun up in stand-alone mode for 5 years with atmospheric forcing from the ERA5 reanalysis (Hersbach et al.2020) before the coupled IFS-FESOM simulations are started. Lastly, waves are represented with the ecWAM model (Janssen2004), and sea ice is represented with the FESIM model (Danilov et al.2015, 2017).

2.2.3 Output convergence

A large effort was made to harmonize the model output and to facilitate an easy comparison between the two models. As an important step towards this end, ICON and IFS-FESOM both provided their output on a HEALPix grid in Cycle 4 (Górski et al.2005). HEALPix stands for Hierarchical Equal Area isoLatitude Pixelation of a sphere. The pixels of such a grid all have the same area and are located on lines of constant latitude. In addition, the hierarchical tessellation of such a grid makes it possible to visualize and process the data on different resolutions (https://healpix.sourceforge.io, last access: 2 October 2025). In ICON, the HEALPix grid was incorporated with an improved version of the YAC coupler (Hanke et al.2016) and the newly developed HIOPY module (https://gitlab.gwdg.de/ican/hiopy/, last access: 2 October 2025). In IFS-FESOM, the HEALPix grid was incorporated with the multIO framework (Sarmany et al.2024) and the MIR package (Maciel et al.2017). This achievement was made possible thanks to synergistic work with the Climate Adaptation Digital Twin of Destination Earth. Apart from the HEALPix grid, developers of ICON and IFS-FESOM worked closely together to provide a minimum set of common variables, to output atmospheric variables on a common set of pressure levels, and to output variables at a common, height-dependent frequency. From Cycle 2 onwards, IFS-FESOM provided some post-processed output of general interest, such as monthly means and coarser output on a regular 0.25° grid (Rackow et al.2025b). From Cycle 3 onwards, the output was computed on the fly by multIO (Sarmany et al.2024) and MIR (Maciel et al.2017) before writing it to disk.
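
As a brief illustration of this hierarchical property, the Python sketch below uses the healpy library to degrade a field from a fine HEALPix grid to a coarser refinement level; the refinement levels are illustrative and not necessarily those used for the model output.

    # Minimal sketch of hierarchical coarsening on a HEALPix grid (illustrative).
    import numpy as np
    import healpy as hp

    nside = 1024                    # illustrative refinement level
    npix = hp.nside2npix(nside)     # 12 * nside**2 equal-area pixels
    field = np.random.rand(npix)    # placeholder data on the fine grid

    # Degrade to a coarser refinement level for quick-look analysis
    coarse = hp.ud_grade(field, nside_out=64)
    print(npix, coarse.size)        # 12582912 49152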

3 Simulations

In this section, we describe the important characteristics of each cycle, focusing on the simulation period, computational throughput, and model biases in terms of mass and energy conservation. To refer to the simulations, we use the following conventions: ICON-CX and IFS_F-CX, in which X indicates the cycle and IFS_F stands for IFS-FESOM. In the case of two simulations from the same cycle, we added the letters A and B, e.g., ICON-C2-A and ICON-C2-B or IFS_F-C2-A and IFS_F-C2-B. Table 1 provides an overview of the simulations from Cycles 1 to 4. All simulations started on 20 January 2020. The last column shows whether the simulation was energetically consistent, as defined in Sect. 1. The simulation data were published by the World Data Center for Climate (Wieners et al.2023; Koldunov et al.2023; Wieners et al.2024).

Table 1. Overview of all simulations from nextGEMS Cycles 1 to 4 performed with ICON and IFS-FESOM (IFS_F). Δx_A and Δx_O indicate the horizontal resolution of the atmosphere and ocean, respectively. Period indicates the integration time. Nodes refer to the number of CPU nodes on Levante (128 cores and 256 GB main memory per node) or an equivalent number of nodes (indicated with *). Throughput refers to the simulated days per day (SDPD), normalized by the number of nodes (SDPD node−1). The last two columns show the atmospheric radiative forcing and whether the simulation is energetically consistent. The simulation data of Cycles 2 to 4 were published by the World Data Center for Climate (Wieners et al.2023; Koldunov et al.2023; Wieners et al.2024).


While there is a clear increase in throughput over all of the cycles, the input/output (I/O) operations in IFS_F-C2-A and IFS_F-C2-B were extremely expensive as there were no dedicated I/O servers. The computational throughput was, therefore, less than expected for 864 nodes. This was solved with the introduction of multIO (Sarmany et al.2024) in IFS and FESOM in Cycles 3 and 4, as explained in Sect. 3.3 and 3.4, respectively.

3.1 Cycle 1

The first cycle started with the ICON and IFS-FESOM versions presented by Hohenegger et al. (2023) and Rackow et al. (2025b), respectively. The simulation with ICON, referred to as ICON-C1, used a horizontal grid of 5 km in the atmosphere and ocean. In addition, there were 90 vertical levels in the atmosphere, 128 vertical levels in the ocean, and five soil layers in the land. The time step was 40 s for the atmosphere (and land) and 80 s for the ocean. Note that the same simulation was presented by Hohenegger et al. (2023) under the name G_AO_5 km. ICON-C1 was integrated over 1 year on the Mistral supercomputer of DKRZ. The simulation with IFS-FESOM, referred to as IFS_F-C1, was based on IFS CY46R1 (ECMWF2019) and used a horizontal grid spacing of 4.4 km in the atmosphere and 25 km in the ocean (a triangulated version of NEMO's ORCA025 grid for use with FESOM coined “tORCA025”). IFS_F-C1 was integrated over 40 days on the Cray supercomputer of ECMWF. Both Mistral and Cray supercomputers were decommissioned in 2021 and replaced with Levante (HLRE-4 Levante2024) and Atos (ECMWF2024), respectively.

ICON-C1 and IFS_F-C1 both showed large imbalances in the energy budget. In ICON-C1, the imbalance in the atmospheric energy budget of about −4 W m−2 caused a drift in the global mean surface temperature of about 2 K (Mauritsen et al.2022; Hohenegger et al.2023). In IFS_F-C1, the imbalance in the energy budget of 6.4 W m−2 (and smaller at coarser resolutions) was related to a leak in the water budget in the atmosphere. This leak caused a spurious increase in the total precipitation of about 4.6 % (Rackow et al.2025b). Bugs related to the energy and water imbalance in ICON and IFS were already present in previous simulations at coarser horizontal resolutions (and, in the case of ICON, in another configuration) without causing any evident problems. In other words, the impacts of such bugs were negligible at coarse spatiotemporal scales and only became evident at the much finer spatiotemporal scales simulated in nextGEMS – an important lesson learned. A more detailed discussion on the identification and resolution of bugs in nextGEMS was presented by Proske et al. (2024). In the case of IFS, the proposed solutions were first tested at coarser resolutions before being implemented in its km-scale version. The coarsest resolution for scientific tests was 28 km. In the case of ICON, the proposed solutions were directly tested at finer resolutions of 10 km. Only particular technical problems were addressed with tests at coarser resolutions of 140 km. In addition, simplified idealized cases were used to isolate and debug those parts of the code that caused the energy and water leak.

On the technical side, the throughput in ICON-C1 was 17 SDPD on 400 nodes of the Mistral supercomputer, roughly corresponding to 100 nodes of Levante, whereas the throughput in IFS_F-C1 was 40 SDPD using 60 nodes of the Cray supercomputer, roughly corresponding to 20 nodes of Levante. With these numbers, ICON could simulate 0.17 SDPD per node, roughly 10-fold less than IFS-FESOM (2 SDPD per node). A possible explanation is the coarser horizontal grid spacing in the ocean component of IFS-FESOM. The different machines used to run ICON-C1 and IFS_F-C1 could also contribute to the higher SDPD per node of IFS-FESOM. Thus, in addition to the large energy imbalance, the small computational throughputs were major obstacles to the upcoming decadal simulations.
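
The normalization used here and in Table 1 is simply the throughput divided by the number of Levante-equivalent nodes, as the following short sketch illustrates with the Cycle 1 numbers quoted above.

    # Worked example of the throughput normalization (numbers from the text).
    def sdpd_per_node(sdpd, levante_equivalent_nodes):
        """Throughput in simulated days per day (SDPD), normalized per Levante node."""
        return sdpd / levante_equivalent_nodes

    print(sdpd_per_node(17, 100))  # ICON-C1  -> 0.17 SDPD per node
    print(sdpd_per_node(40, 20))   # IFS_F-C1 -> 2.0 SDPD per node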

3.2 Cycle 2

In the second cycle of IFS-FESOM, the IFS version was upgraded to CY47R3 (ECMWF2021). Among other changes, this included a five-layer snow scheme replacing the previously used bulk scheme. In addition, the atmospheric imbalance of water and, therefore, energy was significantly reduced to less than 1 W m−2. This was done by activating global tracer mass fixers (Rackow et al.2025b). These modifications also improved the NWP configuration of IFS-NEMO at ECMWF (Rackow et al.2025b) and were, therefore, incorporated into its operational version CY48r1. Two IFS-FESOM simulations, IFS_F-C2-A and IFS_F-C2-B, were provided. Both simulations shared a common horizontal grid spacing in the ocean (Δx=5 km) but used different grid spacings in the atmosphere. IFS_F-C2-A used a horizontal grid spacing of 2.8 km and was integrated for 8 months, whereas IFS_F-C2-B used a horizontal grid spacing of 4.4 km and was integrated for 1 year. Both simulations were subject to a warming trend on the order of 1 K yr−1, as shown in Fig. 3.


Figure 3. Annual cycles of the monthly mean 2 m temperature and top-of-atmosphere radiation balance from ICON-C2-B, ICON-C3, IFS_F-C2-B, and IFS_F-C3. Note that the radiation balance is computed as the difference of the downwelling and upwelling radiation, i.e., F_toa,net = F_toa,sw↓ − F_toa,sw↑ − F_toa,lw↑ (Mauritsen et al.2022). The widths and opacities of the lines increase with time. Gray line shows the observational reference averaged over 2001 to 2020, i.e., 2 m temperature from HadCRUT (Morice et al.2021) and radiation balance from CERES-EBAF (Loeb et al.2018). The months are labeled with numbers. The corresponding annual means are shown in the top-right inset. Like for the annual cycles, the widths and opacities of the lines increase with time.


In the second cycle of ICON, the energy leak in Cycle 1 was traced back to bugs in the dynamical core, cloud microphysics, and surface fluxes within the turbulence scheme. Bugs in the microphysics and turbulence were resolved, and the energy imbalance was reduced. Besides these fixes, the new radiation scheme RTE-RRTMGP (Pincus et al.2019) was implemented, and a cloud inhomogeneity factor was introduced to tune the radiation balance at the top of the atmosphere, as discussed further in Sect. 4.1. In the ocean module, a new vertical coordinate system with thinner surface levels was used. Further details on the ICON development in this cycle are summarized in Appendix A. After eight test runs, two simulations were provided. ICON-C2-A used a horizontal grid spacing of 5 km and was integrated for 2 years, whereas ICON-C2-B used a horizontal grid spacing of 10 km and was integrated for 10 years. Both simulations were subject to a cold drift, as shown in Fig. 3.

IFS_F-C2-A and IFS_F-C2-B were both performed on the Atos supercomputer of ECMWF, where one node is roughly equivalent to one node in Levante. The throughput in IFS_F-C2-A (2.8 km) was 50 SDPD on 864 nodes, corresponding to a normalized throughput of 0.057 SDPD per node. Coarsening the resolution by roughly a factor of 2 in IFS_F-C2-B (4.4 km) increased the throughput by a factor of 2 on the same number of nodes, resulting in a normalized throughput of 0.116 SDPD per node.

The 17-fold reduction in the normalized throughput in IFS_F-C2-B compared to IFS_F-C1 can be explained by several factors. In Cycle 2, IFS-FESOM introduced an eddy-resolving ocean. This means an 8-fold increase in grid points and a 5-fold smaller time step, resulting in 40-fold higher costs for the ocean. More output variables were written with a higher spatial and temporal resolution, and the I/O configuration was suboptimal as it did not make use of any I/O servers yet. In addition, full hybrid MPI-OpenMP parallelization of the ocean only became available with the release of FESOM version 2.5 (used from Cycle 3 onwards). We also decided to make use of a large number of available nodes on the Atos supercomputer at the time, despite the lack of an optimized I/O configuration. Thus, even with a considerably greater number of resources in Cycle 2 compared to Cycle 1, which sped up the computing part of the model, the total throughput including I/O did not scale accordingly, and the larger number of used nodes reduced the normalized throughput considerably.

ICON-C2-A and ICON-C2-B were both performed on the Levante supercomputer. The throughput in ICON-C2-A (5 km) was 80 SDPD on 400 nodes, meaning 0.2 SDPD per node. Coarsening the resolution by a factor of 2 in ICON-C2-B (10 km) increased the throughput by a factor of 7 on the same number of nodes, increasing the normalized throughput to 1.375 SDPD per node. With the same resolution in ICON, simulations in Cycles 1 and 2 have the same throughput per node, meaning that the increase in throughput was due to the use of more nodes.

3.3 Cycle 3

In the third cycle of IFS-FESOM, the base version of IFS was upgraded to CY48R1 (ECMWF2023a, b; Rackow et al.2025b). On top of that, the representation of uncertain cloud and microphysical processes was optimized to yield improved shortwave and longwave TOA fluxes and, as a result, a top-of-atmosphere radiation budget within observational uncertainty. The reduced cloud base mass-flux approach was introduced to increase the convective organization in simulations at 4.4 km resolution. In the land module, land use/land cover information was revised to describe km scales better (Boussetta et al.2021), and a parameterization for urban processes was added (McNorton et al.2021, 2023) in line with developments for CY49R1. Furthermore, river hydrology was enabled for selected simulations by one-way coupling precipitation and runoff output of IFS-FESOM to the Catchment-based Macro-scale Floodplain (CaMa-Flood) model v4.1 (Yamazaki et al.2014, 2011), generating river discharge and flooded fraction. FESOM was upgraded to version 2.5 with significant changes in the atmosphere-ocean coupling, such as the coupling of ocean surface velocities and accounting for the enthalpy of snow falling into the ocean (Rackow et al.2025b). With all these modifications, IFS-FESOM was integrated for 5 years using a horizontal grid spacing of 4.4 km in the atmosphere and 5 km on average in the ocean (IFS_F-C3). In contrast to previous cycles, IFS_F-C3 was not subject to a drift in the surface temperature compared to observations, indicating an energetically consistent climate (Fig. 3). Moreover, the radiation fluxes at the top of the atmosphere were close to observations (Rackow et al.2025b).

In the third cycle of ICON, the energy imbalance was further reduced by fixing bugs in the cloud microphysics and turbulence scheme. In the calculation of surface fluxes, the heat capacity at constant pressure was replaced by the heat capacity at constant volume. ICON computes those fluxes under the assumption of constant volume (not pressure). This change reduced the amount of energy transferred from the surface to the atmosphere by about 29 %. Like in the previous cycle, the cloud inhomogeneity factor was used to tune the radiation balance. Further changes in the ocean and land modules are summarized in Appendix A. After 27 test runs, ICON was integrated for 5 years using a horizontal grid spacing of 5 km in the ocean and atmosphere. ICON-C3 exhibited a cooling trend of about 0.33 K yr−1, as shown in Fig. 3. This cold drift could be traced back to remaining energy leaks in the dynamical core.

All simulations with IFS-FESOM were performed on the Levante supercomputer from Cycle 3 onwards. The throughput per node showed a 3-fold increase in Cycle 3 (0.372 SDPD per node) compared to Cycle 2 (0.116 SDPD per node), explaining the similar throughput (100 SDPD) even though IFS_F-C3 used a third of the resources of IFS_F-C2-B. The resource efficiency was increased substantially by optimizing the I/O operations, and from Cycle 3 onwards, I/O servers via multIO were introduced in IFS-FESOM. Moreover, less 3D ocean output was written in Cycle 3 compared to Cycle 2 (only the upper 300 m at 3-hourly frequency), which dramatically reduced the time spent on I/O in IFS-FESOM. In ICON, the throughput per node in Cycle 3 (0.185 SDPD per node) was roughly similar to Cycle 2 (0.2 SDPD per node). Thus, the use of slightly more resources in Cycle 3 (530 nodes) compared to Cycle 2 (400 nodes) increased the throughput from 80 to 98 SDPD.

3.4 Cycle 4

In the fourth cycle of IFS-FESOM, given the success in providing energetically consistent simulations in Cycle 3, the last changes needed were mostly related to preparing input for climate projections. In addition, only daily output was written for ocean fields, in line with IFS-FESOM simulations for the Climate Change Adaptation Digital Twin in Destination Earth. IFS_F-C4 used a horizontal grid spacing of 9 km in the atmosphere and 5 km in the ocean and was integrated for 30 years. Additionally, CaMa-Flood was run one-way coupled to generate river hydrology over the same period.

In Cycle 4, the big challenge in ICON was to simulate an energetically consistent climate. To this end, the following changes in the atmosphere module were made. The diffusivity in the momentum equation of the dynamical core was modified to account for the persistent but small leak in internal energy in the dynamical core. The old turbulent mixing scheme VDIFF was upgraded to enable both explicit and implicit numerical solvers, to replace the implicit atmosphere–surface coupling with an explicit coupling, and to shift the diffused thermal quantity from dry static energy to internal energy. This new upgrade is called TMX, and it contributed to a further reduction of the energy leak at the surface. Lastly, the cloud inhomogeneity factor was linked with the lower tropospheric stability to better account for the cloud type, as discussed in Sect. 4.1. Further details on the development of ICON-C4 are summarized in Appendix A. Notably, the implementations to reduce the energy leak in ICON are now taken as a base for future simulations. After numerous short test runs, ICON-C4 was integrated for 30 years using a horizontal grid spacing of 10 km in the atmosphere and 5 km in the ocean.

ICON-C4 and IFS_F-C4 both used greenhouse gas concentrations, including ozone, following the Coupled Model Intercomparison Project Phase 6 (CMIP6) for the forcing time series of a SSP3-7.0 scenario (O'Neill et al.2016). In addition, ICON-C4 used the time-varying Max Planck Institute Aerosol Climatology (MACv2-SP; Stevens et al.2017) for anthropogenic aerosols together with the Stenchikov climatology from 1999 for volcanic aerosols (Stenchikov et al.1998). IFS_F-C4 used the time-varying tropospheric climatology provided by the CONsistent representation of temporal variations of boundary Forcings in reanalysES and Seasonal forecasts (CONFESS) project (Stockdale et al.2024), in which volcanic aerosols from 1850 with no large eruptions were applied to all years after 2014.

In view of the project timeline and less-than-desired access to computational resources, a compromise had to be made between the model resolution and simulation length for the production simulations. This led to 30-year simulations at resolutions of about 10 km in the atmosphere and 5 km in the ocean. In ICON-C4, the 10 km grid spacing of the atmosphere allowed us to reach a throughput of 0.892 SDPD per node, and using 464 nodes (18 % of Levante's nodes), the total throughput was 414 SDPD. In IFS_F-C4, the 9 km grid spacing in the atmosphere, together with further improvements in I/O operations by using multIO for both IFS and FESOM, gave a throughput of 2.23 SDPD per node. Using 269 nodes (9 % of Levante's nodes), IFS_F-C4 simulated 600 SDPD.

Thus, in less than 3 years of model development, nextGEMS produced two models with an energetically consistent climate, something that remains a persistent issue in established climate models (e.g., Sanderson and Rugenstein2022), and with a competitive throughput. Taken together, multidecadal and global km-scale climate simulations, in which land, atmosphere, and ocean are coupled, have now become a reality. This opens new ways to analyze regional atmospheric, oceanic, and land processes on global scales and their changes in a global warming context.

4 Insights into the realism of km-scale simulations

In this part, we examine the realism of the Cycle 4 simulations outlined in Sect. 3.4. To create a storyline, we discuss four central questions on the radiation balance, key features of mean climate, local- to synoptic-scale phenomena, and timescales of regional patterns, as introduced in Sect. 1. The reader should note that robust and definite answers to these questions will form gradually as separate studies from the different scientific groups of nextGEMS emerge.

4.1 Radiation balance

As outlined in Sect. 3, it was possible to improve the conservation of mass and energy and eliminate the large drifts in the near-surface temperature observed in the early simulations. This does not necessarily imply, however, that the energy balance at the top of the atmosphere agrees with observations. In both ICON and IFS-FESOM, cloud radiative properties and formation mechanisms were adjusted to reduce the remaining biases. However, the tuning strategies in ICON and IFS-FESOM differed. The low cloud cover, for example, was too high in ICON but too low in IFS-FESOM.

In ICON, the cloud cover was adjusted in two steps targeting the turbulent mixing and cloud brightness. First, we adjusted the mixing in the Smagorinsky scheme, which depends on the Richardson number, following the formulation of Louis (1979). This adjustment allows some mixing or entrainment in situations where the traditional Smagorinsky scheme would yield no mixing. Mixing at large Richardson numbers controls stratiform boundary layer clouds but has only small effects on trade wind clouds. Second, we accounted for the lack of a cloud fraction scheme through a cloud inhomogeneity factor (ζ), which depends on the lower-tropospheric stability (LTS). LTS is defined as the difference in potential temperature between the free troposphere and the surface and is strongly correlated with the stratiform low cloud cover (Wood and Bretherton2006). In theory, the inhomogeneity factor is equal to 1 for fully resolved, homogeneous clouds and is less than 1 for partially resolved, inhomogeneous clouds, as discussed by Cahalan et al. (1994) and Mauritsen et al. (2012). It scales down the cloud water and ice before the shortwave fluxes are calculated in the radiation scheme. In ICON, the inhomogeneity factor acts only on the liquid clouds, whereas the ice clouds remain unchanged. It increases nonlinearly from about 0.4 (ζmin) at a lower-tropospheric stability of 0 K to about 0.8 (ζmax) at a lower-tropospheric stability of 30 K, i.e.,

ζ = ζ_min + (ζ_max − ζ_min) [1 − arctan2(c_1, LTS − c_2) / π], (1)

where ζ_min = 0.4, ζ_max = 0.8, c_1 = 2 K, and c_2 = 20 K are tuning parameters.
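
The short sketch below evaluates Eq. (1) numerically, assuming the two-argument arctangent (atan2) reading of the formula; the endpoint values reproduce the approximate limits of 0.4 at a lower-tropospheric stability of 0 K and 0.8 at 30 K quoted above.

    # Numerical check of Eq. (1) for the cloud inhomogeneity factor (sketch).
    import math

    def zeta(lts, zeta_min=0.4, zeta_max=0.8, c1=2.0, c2=20.0):
        """Cloud inhomogeneity factor as a function of lower-tropospheric stability (K)."""
        return zeta_min + (zeta_max - zeta_min) * (1.0 - math.atan2(c1, lts - c2) / math.pi)

    print(round(zeta(0.0), 2))   # about 0.41
    print(round(zeta(30.0), 2))  # about 0.77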

In IFS-FESOM, the cloud cover is a prognostic variable of the cloud microphysics scheme (ECMWF2023b). It shows a resolution and time step dependence. Overall, it is smaller at higher resolution with shorter time steps than at coarser resolutions with longer time steps. To find a configuration that is close to observations of both longwave and shortwave radiative fluxes from CERES-EBAF (Loeb et al.2018), the cloud cover was modified as documented in Rackow et al. (2025b) by

  - reducing the inhomogeneity enhancement factor for accretion from 3 to 2,

  - reducing the cloud edge erosion from 6×10−6 to 4×10−6, and

  - assuming a constant effective cloud spacing, following recommendations by Fielding et al. (2020).

Overall, these changes led to an increase in the low cloud cover. In addition, the high cloud cover was increased in areas with strong deep convective activity by

  - decreasing a threshold that limits the minimum size of the ice effective radius from 60 to 40, in agreement with observational evidence, and

  - changing from cubic to linear departure point interpolation in the semi-Lagrangian advection scheme for all water species except vapour.

In combination, these changes in the cloud cover in IFS-FESOM increased the outgoing shortwave radiation by about 5 W m−2, while decreasing outgoing longwave radiation by about 3 W m−2. This led to a well-balanced radiation budget at the TOA with both shortwave and longwave fluxes in realistic ranges.

Figure 4 shows the time series of the monthly mean 2 m temperature (Fig. 4a) and annual cycles of the monthly mean 2 m temperature and top-of-atmosphere radiation balance (Fig. 4b, c) over the whole simulation period. The annual means of the near-surface temperature and radiation balance are shown in the insets of Fig. 4b, c. The corresponding observations from CERES (Loeb et al.2018) and HadCRUT (Morice et al.2021) are shown as well. As discussed by Mauritsen et al. (2012, 2022), the annual cycles are shaped like a figure eight. In the first half of the year, the radiation balance decreases while the near-surface temperature increases. In the second half, the radiation balance increases while the near-surface temperature decreases. The near-surface temperature lags behind the radiation balance, as would be expected from the large heat capacity of the Earth system. Overall, the values of IFS-FESOM agree better with observations than the values of ICON. The near-surface temperatures are 13.83 °C for ICON, 14.94 °C for IFS-FESOM, and 14.91 °C for HadCRUT, averaged over the first 4 years from 2020 to 2023. And the top-of-atmosphere radiation balances are 0.25 W m−2 for ICON, 0.51 W m−2 for IFS-FESOM, and 1.46 W m−2 for CERES. Moreover, the near-surface temperature of ICON cools in contrast to IFS-FESOM. We assume that this initial cooling is related to the adjustment of the atmosphere to the ocean, which is spun up with ERA5 forcing. In the subsequent years, the radiation balance is positive, and the near-surface temperature increases in line with the SSP3-7.0 scenario in both ICON and IFS-FESOM. A detailed analysis of the shortwave reflectivity of stratocumulus clouds, which are only partially resolved in km-scale Earth System models, is presented in Sect. 4.3.2.


Figure 4. Time series of the monthly mean 2 m temperature from ICON-C4 and IFS_F-C4 (a). Annual cycles of the monthly mean 2 m temperature and top-of-atmosphere radiation balance from ICON-C4 (b) and IFS_F-C4 (c). Thick lines indicate averages over 2020 to 2023 (light colors) and 2046 to 2049 (dark colors), where months are labeled with numbers. Thin lines indicate individual cycles from 2020 to 2049, where widths and opacities increase with time. Gray lines show the observational reference, i.e., 2 m temperature from HadCRUT (Morice et al.2021) and radiation balance from CERES-EBAF (Loeb et al.2018). The corresponding annual means are shown in the top-right insets. As for the annual cycles, the widths and opacities increase with time.


4.2 Key features of mean climate

Our next question is whether a simulation with an energetically consistent climate, together with solving the oceanic and atmospheric flows on the km scale, can adequately represent Earth's mean climatic features. To address this, we select two key large-scale features in the energetics of the climate system: (i) the tropical rainbelt and (ii) the pattern of sea surface temperature (SST). The role played by km-scale processes in convection gives an additional reason to analyze the representation of the tropical rainbelt, which has been notoriously difficult to reproduce in coarse-resolution Earth system models (Mechoso et al.1995; Lin2007). The analysis of the SST pattern, which coarse-resolution Earth system models also fall short in reproducing, is important because the ability to capture it has emerged as a key factor in determining climate sensitivity. The tropical rainbelt is here defined as the 80th quantile of the yearly precipitation mean in the tropics; i.e., it corresponds to the 20 % wettest region in the tropics (30° S–30° N; see Segura et al.2022). For ICON-C4 and IFS_F-C4, the yearly precipitation mean is computed from the first 5 complete years of integration, 2021–2025. This pragmatic choice allows nearly 1 year of spinup, from 20 January to 31 December 2020, and avoids the influence of the SSP3-7.0 forcing that a yearly mean over the entire 30-year period would include. In these 5 years, the Niño 3.4 index is in near-neutral conditions in both simulations, with the exception of 1 year in IFS_F-C4. This means that the biases and the representation of large-scale features are not affected by ENSO variability.
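
The rainbelt definition above can be written compactly; the Python sketch below is a minimal illustration in which the file name, variable name, and coordinate ordering are assumptions rather than the actual model output conventions.

    # Minimal sketch of the tropical-rainbelt diagnostic (illustrative names).
    import xarray as xr

    pr = xr.open_dataset("precip_monthly.nc")["pr"]            # hypothetical file and variable
    pr_mean = pr.sel(time=slice("2021", "2025")).mean("time")  # yearly precipitation mean, 2021-2025
    tropics = pr_mean.sel(lat=slice(-30, 30))                  # 30° S-30° N, assuming ascending latitudes
    threshold = tropics.quantile(0.8)                          # 80th quantile
    rainbelt = tropics.where(tropics >= threshold)             # the 20 % wettest region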

The structure of the terrestrial tropical rainbelt is encouragingly similar between both simulations and observations from the precipitation satellite product Integrated Multi-satellitE Retrievals for GPM (IMERG; Huffman et al.2019) (Fig. 5). The location is well reproduced, albeit with a reduced area in both simulations over South America. Regarding the tropical ocean, ICON-C4 and IFS_F-C4 show a similar structure of the tropical rainbelt in the eastern Pacific that matches IMERG. However, a less consistent picture between both simulations appears over other tropical oceans. While the oceanic tropical rainbelt is relatively well reproduced in IFS_F-C4 in terms of pattern, it is less well reproduced in ICON-C4. The westward extension of the tropical rainbelt in the Atlantic Ocean is too small in ICON, whereas in IFS it extends westward similarly to observations. The meridional extent is, however, underestimated. In the Indo-Pacific region, ICON-C4 underestimates precipitation in the equatorial region, causing the well-known double-ITCZ bias observed in traditional climate models. This bias is evident in the zonal mean precipitation over the western Pacific (Fig. 5c). In the equatorial western Pacific, ICON-C4 simulates 7 mm d−1 less precipitation than expected from observations. In the case of IFS_F-C4, the dry bias at the equator over the western Pacific is smaller than in ICON-C4; precipitation is 3 mm d−1 lower than IMERG in the equatorial western Pacific. The bias in IFS_F-C4 vanishes in the equatorial Pacific when IFS is coupled with the NEMO ocean model (Rackow et al.2025b), pointing to different ocean surface representations and coupling choices as potential drivers of the development of this bias in IFS. IFS_F-C4 also presents a reduced area of the tropical rainbelt in the Bay of Bengal and over Southeast Asia (Fig. 5b).

https://gmd.copernicus.org/articles/18/7735/2025/gmd-18-7735-2025-f05

Figure 5. On the left, tropical rainbelt in ICON-C4 (a) and IFS_F-C4 (b) averaged over 2021 to 2025. The tropical rainbelt from IMERG (Huffman et al.2019) averaged over 2001 to 2020 is outlined by black contour lines. On the right, zonal mean precipitation corresponding to the rainbelts on the left averaged over the western Pacific (c).

Overall, precipitation over tropical oceans appears to be less constrained than over land. The remaining biases in precipitation over the tropical Pacific are linked to deviations in the tropical SST patterns, as observed in traditional climate models (Lin2007). Figure 6 shows the yearly-mean pattern of SST for the 2021–2025 simulation period in IFS_F-C4 and ICON-C4 and the pattern of SST for the HadISST climatology (2001–2020). In the Pacific, ICON-C4 and IFS_F-C4 represent the cold–warm gradient between the eastern and western Pacific similarly to observations. However, in ICON-C4, there is a westward extension of cold waters reaching the western Pacific, constraining the development of the western Pacific warm pool. In IFS_F-C4, this bias is not as prominent, with less westward extension of the cold tongue. In both simulations (Fig. 6), the warmest regions in the western Pacific and the western Atlantic are weaker than in observations. The western Pacific warm pool in IFS_F-C4 is 3–4 K warmer than the tropical SST mean, while in observations the warm pool can be 4–5 K warmer than the tropical mean. The same 1 K difference is also observed in the western Atlantic. In ICON-C4, the western Pacific warm pool is only 2 K warmer than the tropical mean. A similar value is observed in the western Atlantic.

https://gmd.copernicus.org/articles/18/7735/2025/gmd-18-7735-2025-f06

Figure 6. Annual mean sea surface temperature (SST) from HadISST (Rayner et al.2003) averaged over 2001 to 2020 (a) and from IFS_F-C4 (b) and ICON-C4 (c) averaged over 2021 to 2025. The spatial mean, denoted as μ in the subplot titles, is subtracted from each product. The 80th percentile is outlined by black contour lines.

The comparison between ICON-C4 and IFS_F-C4 allows some preliminary conclusions, pointing to the positive and negative surprises in developing the two models. The fact that ICON-C4 can represent the tropical rainbelt over land and the eastern Pacific indicates that a horizontal grid spacing of the order of 10 km is sufficient to reproduce the structure of precipitation in those regions and that this is possible with a minimum set of parameterizations. This is supported by the fact that, across the nextGEMS cycles, the structure of the tropical rainbelt over the eastern Pacific and over land changes negligibly in ICON simulations with horizontal grid spacings finer than 10 km. Indeed, the pattern of the tropical rainbelt over those regions is similar to the one presented by Segura et al. (2022).

On the other hand, using a grid spacing of 10 km is not sufficient to represent the tropical rainbelt in the western Pacific. To address this bias, IFS and ICON take different pathways, but both involve model tuning. IFS addresses this issue by using a convective parameterization, building on the long history of IFS in model tuning to match the observed precipitation pattern. ICON, using a simple framework regarding parameterizations, aims to address the warm pool precipitation bias by fine-tuning subgrid-scale processes (microphysics and turbulence). The results of Takasuka et al. (2024) and Segura et al. (2025a) show that obtaining a single tropical rainbelt in the western Pacific is possible with this simple framework. While Takasuka et al. (2024) and Segura et al. (2025a) used SST-prescribed simulations, the next step in the development of ICON is to include the changes proposed by these authors in coupled simulations. Moreover, the difference between ICON-C4 and IFS_F-C4 shows that a better representation of the equatorial SST pattern and of the tropical rainbelt are linked, suggesting that once the tropical rainbelt is well reproduced, the SST pattern might follow.

4.3 Local- to synoptic-scale phenomena

In Sect. 4.1 we demonstrated that the nextGEMS simulations result in an energetically consistent global climate. As shown in Sect. 4.2, some of the typical spatial precipitation and SST patterns in the Pacific are acceptably reproduced, but some long-standing issues remain. In the following subsections, we investigate to what extent an energetically consistent climate translates into a constraint for local-, meso-, and synoptic-scale phenomena.

4.3.1 Local-scale phenomena

Here we take the soil moisture–precipitation feedback as an example of how the two models represent a local-scale phenomenon. The reasoning behind this is that local convection, on scales below 100 km, plays a dominant role in this coupling. The soil moisture–precipitation feedback is already apparent in the first year (Fig. 7). Both the ICON-C4 and IFS_F-C4 simulations generally reproduce similar patterns of where strong and weak soil moisture–precipitation feedback occurs within the first 5 years. This is quantified by the correlation coefficient between total-column soil moisture and precipitation during the boreal summer. The two models agree with each other on the regions with relatively strong feedback, such as Mexico, the southern United States, and the Sahel. These strong feedback regions align well with the hotspots of land–atmosphere coupling where evapotranspiration is strongly controlled by soil moisture (Koster et al.2004). The models do not share all features, however. ICON-C4 generally shows weaker feedback strength globally than IFS_F-C4. The reason for the weaker feedback can be attributed, in part, to how each model represents convection (Lee and Hohenegger2024). Convective parameterization causes precipitation to be more sensitive to surface evapotranspiration, leading to stronger feedback compared to the results with explicit convection (Hohenegger et al.2009). The time series of land–atmosphere feedback over Europe, the Sahel, the United States, and Mexico show that the correlation coefficients established in the first year remain relatively stable over time, with only minor fluctuations due to interannual variability.
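
The feedback metric in Fig. 7 is a plain temporal correlation; the Python sketch below illustrates it under assumed file, variable, and dimension names.

    # Minimal sketch of the soil moisture-precipitation feedback metric (illustrative names).
    import xarray as xr

    ds = xr.open_dataset("daily_land_fields.nc")              # hypothetical file with sm and pr
    jja = ds.where(ds["time"].dt.season == "JJA", drop=True)  # boreal summer days only
    corr = xr.corr(jja["sm"], jja["pr"], dim="time")          # map of correlation coefficients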

https://gmd.copernicus.org/articles/18/7735/2025/gmd-18-7735-2025-f07

Figure 7. Soil moisture–precipitation feedback quantified with the correlation between daily mean total-column soil moisture (SM) and precipitation (P) during JJA of 2021 to 2025 for ICON-C4 (a) and IFS_F-C4 (b). Time series of the correlation coefficient (c) in ICON-C4 (full lines) and IFS_F-C4 (dashed lines) averaged over Europe [black box in panels a and b], the Sahel [orange box in panels a and b], and the United States and Mexico [blue box in panels a and b]. Areas where the precipitation is smaller than 0.1 mm d−1 in both simulations are grayed out.

4.3.2 Meso-scale phenomena

As an example of a meso-scale phenomenon, we look into maritime stratocumulus cloud fields, typically occurring over cool currents off the western side of continents.

Of relevance for the representation of stratocumulus clouds in the nextGEMS models are the following specifications. First of all, ICON-C4 does not employ any parameterization of convective clouds, while IFS_F-C4 employs a shallow convection scheme as well as a modified deep convection scheme. As discussed in Sect. 4.1, for both ICON-C4 and IFS_F-C4 the cloud inhomogeneity factor, which accounts for subgrid-scale inhomogeneities of clouds, was adjusted under the assumption that, on the finer grids, greater inhomogeneity is explicitly represented. For IFS_F-C4, a parameter controlling cloud-edge erosion was also reduced in order to increase the cover of low clouds. Both of these parameters affect cloud–radiation interaction and, as such, impact the radiation balance at the top of the atmosphere, making stratocumulus clouds a good example of how microscale processes impact global scales.

In Fig. 8a, we show the annual cycle of the cloud water path for IFS_F-C4 and ICON-C4 and of the cloud area fraction for IFS_F-C4 and CERES satellite observations, averaged over the ocean near the coast of California, one of the main stratocumulus regions (Klein and Hartmann, 1993). Figure 8b shows the albedo, given as the ratio of the top-of-atmosphere outgoing shortwave radiative flux (RSUT) to the incoming shortwave radiative flux, as well as RSUT itself. Note that the simulations are averaged over the years 2021 to 2025 and the observations over 2000 to 2023, so the periods hardly overlap. The cloud properties and the albedo show two maxima in the CERES observations, one in summer and one in winter. The annual cycle in albedo is well represented in ICON-C4, with agreement in both the summer and winter peaks. Disregarding the systematic differences in cloud water path, we find that particularly the summer peak is well represented in ICON-C4. IFS_F-C4, in turn, simulates a too-early peak in May and one in winter. Looking at RSUT, we find that the summer peak dominates the radiative effects of the Californian stratocumulus clouds. ICON-C4 simulates the annual cycle well, while IFS_F-C4 does not reach the summer peak amplitude in RSUT, in line with the absent summer peak in its cloud water path. The annual cycle of Californian stratocumulus presented here agrees well with the one discussed in a separate study by Nowak et al. (2025) on stratocumulus in km-scale Earth system models; these authors also examined the representation of shallow cumulus.
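As a minimal illustration of the Fig. 8b diagnostics, the albedo and its annual cycle over the Californian box can be computed as follows. The sketch assumes monthly mean RSUT and RSDT fields on a regular latitude–longitude grid with longitudes in [-180°, 180°]; the file and variable names are placeholders.

import xarray as xr

# Minimal sketch of Fig. 8b: TOA albedo as the ratio of outgoing (rsut) to
# incoming (rsdt) shortwave flux, averaged over the ocean box west of
# California (115-127.5 deg W, 27.5-38 deg N) and composited into an annual cycle.
ds = xr.open_dataset("toa_fluxes_monthly.nc")  # hypothetical file
box = dict(lat=slice(27.5, 38.0), lon=slice(-127.5, -115.0))

# Box means (unweighted here; a proper area mean would weight by cos(latitude)).
rsut_box = ds["rsut"].sel(**box).mean(["lat", "lon"])
rsdt_box = ds["rsdt"].sel(**box).mean(["lat", "lon"])
albedo_box = rsut_box / rsdt_box

# Annual cycle over 2021-2025: average all Januaries, all Februaries, and so on.
years = ds.time.dt.year.isin(list(range(2021, 2026)))
rsut_cycle = rsut_box.sel(time=years).groupby("time.month").mean("time")
albedo_cycle = albedo_box.sel(time=years).groupby("time.month").mean("time")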

From the annual cycles shown in Fig. 8, we identify summer (JJA) as the radiatively most relevant season of marine stratocumulus off the coast of California. In Fig. 9 we show the associated spatial maps of RSUT for JJA along with the cloud properties and the RSUT biases, here averaged over the 2021–2025 period for both simulations and the 2016–2020 period for CERES. For ICON-C4, and particularly for IFS_F-C4, RSUT is underestimated. In CERES, the stratocumulus cloud field off the coast of California, shown in Fig. 9c, is visible in the cloud fraction and relates clearly to RSUT. In ICON-C4 and IFS_F-C4, the clouds have too small a liquid water path or sit too far off the coast. As a consequence, the associated biases in RSUT are substantial.

https://gmd.copernicus.org/articles/18/7735/2025/gmd-18-7735-2025-f08

Figure 8. Annual cycle of stratocumulus cloud and radiative properties, averaged over the years 2021 to 2025 and over the ocean west of the Californian coast (longitude [115, 127.5° W] and latitude [27.5, 38° N], blue rectangle in Fig. 9). Panel (a) shows the column cloud water (kg m−2, solid lines) and the cloud area fraction (%, dashed lines). Panel (b) shows the albedo (dimensionless, solid lines) and the top-of-atmosphere upward shortwave radiative flux (RSUT, W m−2, dashed lines). Note that cloud fraction is not available for ICON-C4, while column cloud water is not available for CERES.


https://gmd.copernicus.org/articles/18/7735/2025/gmd-18-7735-2025-f09

Figure 9. Top-of-atmosphere upwelling shortwave flux (RSUT) over the ocean next to the Californian coast in JJA for ICON-C4 (a), IFS_F-C4 (b), and CERES (Loeb et al., 2018) (c), along with the cloud (liquid) water path (kg m−2) for the nextGEMS Cycle 4 models [2021, 2025] and the cloud area fraction (%) for CERES (2016, 2020). Panels (d) and (e) show the biases in RSUT of ICON-C4 and IFS_F-C4 relative to CERES, respectively. The blue rectangle denotes the area considered for computing the RSUT statistics in Fig. 8.

In summary, the energetically consistent climate of the nextGEMS simulations comes along with a stratocumulus cloud field off the coast of California whose radiative effects (RSUT) are well in phase with those of the CERES satellite observations. In ICON-C4, both the bi-modality of the annual cycle of cloudiness and albedo and the peak amplitude of RSUT match the observations well. In IFS_F-C4, just like in CMIP5 and CMIP6 (Jian et al., 2021), the peak amplitude of RSUT is underestimated, particularly in summer, when stratocumulus clouds have their strongest radiative effects. The reasons for this are the absence or ocean-ward displacement of stratocumulus, potentially inaccurate cloud radiative properties, and under-resolved mixing processes of stratocumulus. In the future, it would be of great interest to investigate the implementation of aerosol effects on cloud formation and brightening in km-scale Earth system models with regard to stratocumulus clouds and their radiative effects.

Whether a realistic representation of the global climate constrains patterns of regional phenomena cannot be fully answered, because a proof of a causal relationship between the global climate and regional circulations is not feasible. Still, the representation of the exemplary regional circulation, the stratocumulus cloud field, is found to be satisfactory. In particular, for ICON-C4, despite not employing a parameterization of shallow convection, not only the phase but also the amplitude of the stratocumulus cloud radiative effects agrees very well with observations (Nowak et al., 2021). This is an encouraging result that demonstrates the capabilities of km-scale Earth system models and shows that, rather than a convective parameterization, the fine-tuning of cloud microphysics is key to representing stratocumulus clouds.

4.3.3 Synoptic-scale phenomena

Atmospheric blocking is a key feature of the mid-latitude synoptic-scale circulation, often linked to weather extremes such as heatwaves and cold spells. Here, we use atmospheric blocking as an example of a synoptic-scale phenomenon. Figure 10 illustrates the climatology of blocking events (Fig. 10a, b) and the 2021–2025 time series of blocking events for the “North Pacific” and “Central Europe” regions (Fig. 10c). Overall, there is good agreement in the geographical distribution of atmospheric blocking between ICON-C4 and IFS_F-C4. However, IFS_F-C4 exhibits a higher frequency of blocking events than ICON-C4.

https://gmd.copernicus.org/articles/18/7735/2025/gmd-18-7735-2025-f10

Figure 10. Annual blocking frequency in IFS_F-C4 (a) and ICON-C4 (b) during 2021–2025. Units are the percentage of blocked days relative to the total number of days per year. The ERA5 climatology (2015–2019) is shown in green contours (2 % intervals starting at 2 %). Evolution of the number of blocks (c) in ICON-C4 (full lines) and IFS_F-C4 (dashed lines) passing over central Europe [blue box in panels a and b] and the North Pacific [purple box in panels a and b]. Blocking is identified as a persistent and quasi-stationary mid-level (500 hPa) geopotential height anomaly of the flow, following the Schwierz index (Schwierz et al., 2004).

We hypothesize that the agreement between the two models in blocking location is influenced by their shared use of kilometer-scale orographic information. As demonstrated by Davini et al. (2022) using coarser-resolution simulations, higher-resolution orography significantly enhances the representation of atmospheric blocking. The better performance of IFS-FESOM in capturing the blocking frequency, in turn, can be attributed to its implementation of turbulent orographic form drag and subgrid-scale orography parameterizations, both of which are known to enhance the representation of atmospheric circulation features (e.g., Woollings et al., 2018).

The time series of annual blocking frequencies for the North Pacific and Central Europe regions reveal substantial interannual variability in both IFS_F-C4 and ICON-C4. Note that recent studies indicate no significant trend in blocking frequency over the past decades (Wazneh et al., 2021). Traditional climate models tend to underestimate the atmospheric blocking frequency (e.g., Dolores-Tesillos et al., 2025); thus, whether blocking will increase or decrease is still an open question (Berckmans et al., 2013; Davini and D’Andrea, 2016; Schiemann et al., 2017; Woollings et al., 2018).
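To make the blocking metric more tangible, the sketch below outlines a heavily simplified anomaly-and-persistence detection in the spirit of the index described in the caption of Fig. 10. The threshold and persistence values, as well as the file and variable names, are purely illustrative assumptions and do not correspond to the Schwierz et al. (2004) implementation used here.

import xarray as xr

# Heavily simplified blocking proxy: flag grid points where the daily 500 hPa
# geopotential height anomaly exceeds a threshold for several consecutive days.
# Threshold and persistence below are illustrative, not the published values.
z500 = xr.open_dataset("z500_daily.nc")["z500"]  # hypothetical field (time, lat, lon), in m

# Anomalies with respect to a day-of-year climatology.
clim = z500.groupby("time.dayofyear").mean("time")
anom = z500.groupby("time.dayofyear") - clim

THRESHOLD = 150.0  # m, illustrative
MIN_DAYS = 5       # days of persistence, illustrative

exceed = (anom > THRESHOLD).astype(int)

# True on days that close a run of at least MIN_DAYS consecutive exceedances.
persistent = exceed.rolling(time=MIN_DAYS).sum() == MIN_DAYS

# Crude proxy for the blocking frequency maps in Fig. 10a, b (percent of days).
blocked_percent = persistent.mean("time") * 100.0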

4.4 Timescales of regional patterns

The analysis in Sect. 4.1 brought another positive surprise: the TOA energy balance responded quickly to changes in the parameterization schemes. The tuning of the TOA energy balance shown in Sect. 4.1 was achieved in parallel in ICON and IFS-FESOM by performing a number of short integrations. In the process, selected parameter values were modified within observational uncertainty to obtain a simulated TOA energy balance similar to the observed one. Indeed, the integration time was no longer than 12 days. This strategy was also used in IFS for testing across different resolutions. This means that, with only a few days of km-scale simulation, one can build enough confidence in whether changes in parameter values converge toward an improved response. Such a finding reinforces the potential for these km-scale models to be used more routinely and in an operational context.
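The diagnostic monitored in such short tuning runs is essentially the global, area-weighted mean of the net TOA flux. A minimal sketch, assuming global RSDT, RSUT, and RLUT fields from a roughly 12-day test integration (file and variable names are placeholders):

import numpy as np
import xarray as xr

# Global-mean net TOA radiation balance, N = RSDT - RSUT - RLUT, area-weighted
# by the cosine of latitude. Positive values indicate a net energy gain.
ds = xr.open_dataset("test_run_12days_toa.nc")  # hypothetical short test run

net_toa = ds["rsdt"] - ds["rsut"] - ds["rlut"]  # W m-2

weights = np.cos(np.deg2rad(ds["lat"]))
global_mean = net_toa.weighted(weights).mean(("lat", "lon"))

# Compare the time-mean imbalance of this parameter setting with the observed
# value of roughly +1 W m-2 to judge whether the change moves in the right direction.
print(float(global_mean.mean("time")))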

Along the same lines, the robust regional circulation patterns in Sect. 4.2, 4.3.1, and 4.3.2 emerge within relatively short time spans, e.g., 1-year periods. The double band of precipitation in the western Pacific in ICON-C4 is present in every year of the simulation and is already observed in the first 3 months (not shown). The annual cycle of stratocumulus clouds is already established in the first year (Fig. 8). Likewise, the soil moisture–precipitation feedback emerges during the first year of the simulations (Fig. 7). The lesson we want to communicate here is that it is unnecessary to run lengthy simulations to understand how parameterization changes will affect the TOA energy balance, the land–atmosphere coupling, the double ITCZ, or stratocumulus clouds; this lesson could be helpful for communities working on those problems. On the other hand, the large interannual variability of atmospheric blocking in both ICON and IFS-FESOM, combined with the few occurrences of blocking in a given year (Fig. 10), suggests that multidecadal simulations are required to properly assess the performance and evolution of blocking frequencies.

5 Conclusions

In 2021, nextGEMS started with the goal of producing, for the first time, multidecadal climate simulations in which the governing equations are solved at a horizontal resolution on the order of 10 km or finer in the ocean, land, and atmosphere. With the resources available, nextGEMS provided 30-year climate simulations (from 2020 to 2050) under the SSP3-7.0 scenario with two different Earth system models, ICON and IFS-FESOM, with a horizontal grid spacing of 10 km in the atmosphere and 5 km in the ocean. The 30-year climate simulations were run on the supercomputer Levante, using 18 % of its capacity in the case of ICON and 9 % in the case of IFS-FESOM. While the limited computing resources forced the final horizontal grid spacing to be 10 km, nextGEMS prepared ICON and IFS-FESOM to run on supercomputers more powerful than Levante and thus to produce climate simulations with a horizontal grid spacing of 5 km or finer. The achievement of the 30-year climate simulations inspired this overview. Here, we presented the concept and progress of the project, structured into four cycles and hackathons. In addition, we discussed the surprises we encountered and the lessons we learned when developing the models and analyzing the simulations. To create a storyline, we translated our learning process into the four questions defined in Sect. 1.

Over the four cycles, the simulations with ICON and IFS-FESOM evolved from year-long simulations in Cycle 1, with significant mass and energy leaks, to multidecadal simulations in Cycle 4 with an energetically consistent climate, i.e., a response of the surface temperature to the radiative forcing comparable to observations. The two 30-year simulations of Cycle 4, with ICON and IFS-FESOM, had a horizontal grid spacing of about 10 km in the atmosphere and land and 5 km in the ocean. Notably, both simulations achieved a competitive throughput: 414 simulated days per day on 464 nodes for ICON and 600 simulated days per day on 269 nodes for IFS-FESOM.

In both models, ICON and IFS-FESOM, it was relatively easy to tune the cloud properties and bring the TOA radiation balance close to observations. The main reason was the quick response of the TOA radiation balance to changes in the cloud properties, allowing us to find optimal parameters with short simulations of about 12 days. The cloud properties were adjusted in different ways in ICON and IFS-FESOM. While ICON targeted shallow and low-level stratocumulus clouds, IFS-FESOM targeted all types of clouds. Despite these different approaches, the bias in the TOA radiation balance could be reduced in both models, with a closer fit in IFS-FESOM.

The comparison between ICON and IFS-FESOM also gave insights into the representation of the mean climate. A km-scale model with no convection parameterization, like ICON, can reproduce the observed structure of the tropical rainbelt over land and in the eastern Pacific, similar to IFS-FESOM, which uses a convection parameterization. However, ICON cannot reproduce the pattern of the tropical rainbelt (still showing a double ITCZ) or the SST pattern in the western Pacific; here, IFS-FESOM behaves more similarly to observations. Thus, explicitly resolving km-scale processes with a grid spacing of 10 km is not sufficient to remove biases in the western Pacific, but fine-tuning in km-scale Earth system models can reduce the double-ITCZ bias (Takasuka et al., 2024; Segura et al., 2025a).

We also observed that in ICON and IFS-FESOM the seasonal cycle of stratocumulus clouds and the pattern of atmospheric blocking are well represented, i.e., similar to observations. This means that these regional phenomena are well constrained in km-scale climate simulations. However, there are some differences. ICON reproduces the seasonal cycle of the stratocumulus clouds better than IFS-FESOM; the different pathways for fine-tuning the radiative properties of low-level clouds explain this discrepancy. We also observe that a convective parameterization is not necessary to represent this regional phenomenon correctly. On the other hand, the frequency of atmospheric blocking is better represented by IFS-FESOM, signaling the importance of subgrid-scale processes for representing the frequency of this synoptic circulation.

Our analysis also showed that short simulations of 1 year or even less are sufficient to develop and test many aspects of km-scale models, such as biases in the tropical rainbelt and in the top-of-atmosphere radiation balance, the seasonal cycle of stratocumulus clouds, or the soil moisture–precipitation feedback. This finding is particularly encouraging for other groups working on resolving these biases or on understanding changes in regional circulation patterns under different climate scenarios. The resources for such short simulations are available on any of the EuroHPC systems, which brings us one step closer to the vision of using km-scale models in both research and operational settings.

https://gmd.copernicus.org/articles/18/7735/2025/gmd-18-7735-2025-f11

Figure 11. Scene of aerosols on 5 September 2020 at 00:00 UTC. Shown are column burdens of dust (du) in red, sea salt (ss) in blue, carbonaceous aerosol (ca) in green, and sulfuric aerosol (su) in yellow. The colormaps have a variable transparency, which decreases from fully transparent at the minima to fully opaque at the maxima.

6 Outlook

The development of ICON and IFS-FESOM continues within and beyond nextGEMS. To further increase resolution and throughput, modeling centers are increasingly making use of exascale supercomputers such as the pan-European Large Unified Modern Infrastructure (LUMI, 2024). ICON, and to a lesser extent IFS-FESOM, have been adapted to perform well on such machines, which are primarily based on graphics processing units (Adamidis et al., 2025; Bishnoi et al., 2024). nextGEMS collaborates with other projects, such as WarmWorld (https://warmworld.de, last access: 2 October 2025) on the optimization and modularization of the ICON code base, and EERIE (https://eerie-project.eu, last access: 2 October 2025) on the role of mesoscale processes in the ocean.

To improve the representation of the Earth system, ICON is currently incorporating more components, such as the HAMOCC module for ocean biogeochemistry (Ilyina et al., 2013) and the HAM-lite module for interactive aerosols (Weiss et al., 2025). HAMOCC and HAM-lite have been evaluated with complementary simulations of up to 1 year. In preliminary simulations with ICON-HAMOCC at 10 km resolution, regional-scale biogeochemistry patterns appeared well reproduced in the case of western African waters. In parallel, sea surface chlorophyll a is beginning to be reconstructed (Roussillon et al., 2023), which can be used to forecast marine productivity and its effect on shifts of exploited fish populations (Sarre et al., 2024) as well as trends in pelagic biomass (Diogoul et al., 2021). In simulations with ICON-HAM-lite at 5 km resolution, key aerosol processes are captured, including, for example, the formation of dust storms in the Sahara, wind-driven emissions of sea salt aerosols, and wildfire-driven emissions of carbonaceous aerosols. Figure 11 shows a scene of aerosol burdens on 5 September 2020 at 00:00 UTC. These examples show how the simulations produced so far, and those to come, can be of interest for industry sectors such as solar and wind energy or agriculture and fisheries.

The two models developed in nextGEMS are also prototypes for the Climate Change Adaptation Digital Twin of the Destination Earth initiative (https://destination-earth.eu, last access: 2 October 2025). The initiative aims to operationalize km-scale, multidecadal climate simulations, assess the impacts of climate change, and evaluate adaptation strategies at local and regional scales (Hoffmann et al., 2023; Sandu, 2024). For that initiative, the first projections from 2020 to 2040 were produced with ICON, IFS-NEMO, and IFS-FESOM at resolutions of 5 and 4.4 km in the atmosphere and 5–10 km in the ocean, respectively. Moreover, the work on output harmonization in nextGEMS fed back into Destination Earth: the output of the models underpinning the Digital Twin is converted into a Generic State Vector, a set of selected variables in consistent units and frequencies and on a common grid, which is then streamed to the different applications. nextGEMS also participated in the Global KM-scale Hackathon of the World Climate Research Programme by providing several simulations with ICON and IFS-FESOM, including some at 2.5 km resolution over 14 months. These simulations are publicly available on GitHub (https://digital-earths-global-hackathon.github.io/catalog/, last access: 2 October 2025) together with those of many other participating models.

nextGEMS is the first step towards a European ecosystem of km-scale climate research and operational Earth system models, to which WarmWorld, EERIE, Destination Earth, and other projects contribute. These projects embody a cross-European effort to push forward the boundaries of climate science by reducing model uncertainties and providing climate information on the local and regional scales where the impacts of climate change are felt (Prein et al., 2015; Gettelman et al., 2023). In this context, the Earth Virtualization Engine initiative (https://eve4climate.org, last access: 2 October 2025) recognized the necessity of providing fair and free access to climate information with local granularity, globally, using the best technology at hand (Stevens et al., 2024). Thus, the km-scale Earth system models developed in nextGEMS will not only serve scientific purposes but will also be tools to address critical questions for society and ecosystems regarding climate adaptation, mitigation, and risk management.

Appendix A: Summary of model developments

Here, we summarize all developments in ICON over Cycles 2 to 4.

A1 Cycle 2

In the atmosphere module, several bugs in the dynamical core, cloud microphysics, and surface fluxes were removed, which reduced the energy imbalance significantly. The old PSrad radiation scheme was replaced with the new RTE-RRTMGP scheme (Pincus et al., 2019). A cloud inhomogeneity factor was introduced to tune the radiation balance at the top of the atmosphere. In the land module, the GSWP3 input data of 2014 (Dirmeyer et al., 2006) were used to spin up the land reservoir for the hydrological discharge; equilibrium was reached after 5 years using a time step of 40 s. The distribution area of freshwater entering the ocean was increased to a radius of 30 km. In the ocean module, a new vertical coordinate system with thinner surface levels was introduced, and a new surface flux scheme was implemented. Moreover, the spinup procedure was revised: the spinup simulation was initialized with ORAS5 (Copernicus Climate Change Service, 2025) and forced with ERA5 (Hersbach et al., 2020) from 2010 to 2020 to establish a stationary eddy field, and the sea surface temperature was nudged to ERA5 in the last year. After eight test runs, two simulations were provided. ICON-C2-A used a horizontal grid spacing of 5 km and was integrated for 2 years, whereas ICON-C2-B used a horizontal grid spacing of 10 km and was integrated for 10 years. ICON-C2-A and ICON-C2-B were both performed on 400 nodes. The throughput was 80 SDPD for ICON-C2-A and 550 SDPD for ICON-C2-B; coarsening the resolution by a factor of 2 thus increased the throughput by a factor of 7.
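These throughput numbers translate directly into a compute cost per simulated year. The short worked example below uses only the figures given in this paragraph (400 nodes, 80 SDPD for ICON-C2-A and 550 SDPD for ICON-C2-B); the helper function is ours and purely illustrative.

# Compute cost implied by the Cycle 2 throughput numbers: simulated days per
# day (SDPD) converted into node-hours per simulated year.
def node_hours_per_simulated_year(nodes: int, sdpd: float) -> float:
    """Node-hours needed to advance the model by one simulated year of 365 days."""
    wallclock_days = 365.0 / sdpd
    return nodes * wallclock_days * 24.0

for name, nodes, sdpd in [("ICON-C2-A (5 km)", 400, 80.0), ("ICON-C2-B (10 km)", 400, 550.0)]:
    print(f"{name}: {node_hours_per_simulated_year(nodes, sdpd):,.0f} node-hours per simulated year")

# Throughput ratio when coarsening from 5 km to 10 km: 550 / 80 ~ 6.9, i.e. the
# factor of about 7 quoted in the text.
print(550.0 / 80.0)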

A2 Cycle 3

In the atmosphere module, the energy imbalance was further reduced by removing bugs in the cloud microphysics and turbulence schemes. The surface fluxes were calculated with the heat capacity at constant pressure instead of constant volume, which reduced the amount of energy transferred from the surface to the atmosphere by about 29 %. In the land module, the heat capacity and conductivity maps were revised, and the soil texture was accounted for in the hydrology scheme. In the ocean module, vertical mixing was computed including Langmuir turbulence in the turbulent kinetic energy scheme, the turbulent diffusion coefficient was decreased from 0.2 to 0.1, and tracer advection was treated with a new second-order scheme. The spinup of the ocean was conducted in the same manner as in Cycle 2, except that the year 2019 was repeated. The dynamical core was solved with mixed precision, and the sea level pressure was coupled to the atmosphere. In contrast to the previous cycles, the output was written on the HEALPix grid (Górski et al., 2005). After 27 test runs, ICON-C3 was integrated over 5 years using a horizontal grid spacing of 5 km in the ocean and atmosphere. The throughput of ICON-C3 was 98 SDPD on 530 nodes, compared to 80 SDPD on 400 nodes for ICON-C2-A.
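Since the output is written on the HEALPix grid, spatial subsetting works via pixel indices rather than latitude–longitude slices. The sketch below shows how a geographic location can be mapped to a pixel index with the healpy package; the nside value, the nested ordering, and the file and variable names are assumptions for illustration and must be checked against the dataset metadata.

import healpy as hp
import xarray as xr

# Map a geographic location to a HEALPix pixel index and extract a time series
# from the flat "cell" dimension of the output. nside, ordering, and names are
# illustrative assumptions, not the actual nextGEMS settings.
nside = 1024
lon, lat = -122.0, 33.0  # a point inside the Californian stratocumulus box

ipix = hp.ang2pix(nside, lon, lat, nest=True, lonlat=True)

ds = xr.open_dataset("icon_c3_healpix.nc")   # hypothetical file
timeseries = ds["ts"].isel(cell=ipix)        # e.g., surface temperature at that pixel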

A3 Cycle 4

In the atmosphere module, the diffusivity in the momentum equation was modified to account for the persistent leak of internal energy. The turbulent mixing scheme VDIFF was updated and renamed TMX. This update came with several key improvements: refactoring the code into an object-oriented structure, enabling both explicit and implicit numerical solvers, replacing the implicit atmosphere–surface coupling with an explicit coupling, and replacing the dry static energy with the internal energy in the diffusion scheme. Lastly, the cloud inhomogeneity factor was linked to the lower-tropospheric stability to better account for the cloud type. In the ocean module, the number of vertical levels was decreased from 128 to 72, while keeping thin layers in the upper ocean, and the turbulent mixing under sea ice was reduced. The ocean state spinup was conducted in the same manner as in Cycle 2 but without nudging of the sea surface temperature. ICON-C4 used ozone and greenhouse gas concentrations following CMIP6 for the forcing time series of an SSP3-7.0 scenario (O'Neill et al., 2016). In addition, ICON-C4 used the time-varying MACv2-SP climatology (Stevens et al., 2017) for anthropogenic aerosols together with the Stenchikov climatology from 1999 for volcanic aerosols (Stenchikov et al., 1998). After numerous long test runs, ICON-C4 was integrated over 30 years using a horizontal grid spacing of 10 km in the atmosphere and 5 km in the ocean. The 10 km grid spacing of the atmosphere allowed us to reach a throughput of 414 SDPD on 464 nodes.
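For reference, lower-tropospheric stability in the common sense of Klein and Hartmann (1993) is the potential temperature difference between 700 hPa and the surface. The sketch below is an illustrative implementation of that textbook definition; whether ICON-C4 uses exactly this formulation to modulate the cloud inhomogeneity factor is not specified here.

# Lower-tropospheric stability, LTS = theta(700 hPa) - theta(1000 hPa),
# following the common Klein and Hartmann (1993) definition. Illustrative only.
def potential_temperature(t_kelvin: float, p_hpa: float, p0_hpa: float = 1000.0) -> float:
    """Potential temperature (K) from temperature (K) and pressure (hPa)."""
    kappa = 0.2854  # R_d / c_p for dry air
    return t_kelvin * (p0_hpa / p_hpa) ** kappa

def lower_tropospheric_stability(t700: float, t1000: float) -> float:
    """LTS (K) from temperatures at 700 and 1000 hPa."""
    return potential_temperature(t700, 700.0) - potential_temperature(t1000, 1000.0)

# Example: a typical subtropical marine boundary layer capped by a strong inversion.
print(lower_tropospheric_stability(t700=283.0, t1000=290.0))  # ~ 23 K, stratocumulus-favorable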

Code availability

The ICON model is available on the WDCC at https://doi.org/10.35089/WDCC/IconRelease01 (ICON partnership (DWD, MPI-M, DKRZ, KIT, C2SM), 2024) under a permissive open-source licence (https://opensource.org/license/BSD-3-clause, last access: 2 October 2025). The FESOM2.5 model is free software and is available on GitHub (https://github.com/FESOM/fesom2, last access: 2 October 2025); the latest version 2.5, including all developments used in nextGEMS Cycle 3, is archived on Zenodo at https://doi.org/10.5281/zenodo.10225420 (Rackow et al., 2023b). MultIO is free software and is available on the ECMWF GitHub (https://github.com/ecmwf). The IFS model is available subject to a licence agreement with ECMWF. ECMWF member-state weather services and approved partners will be granted access. The IFS code without modules for data assimilation is available for educational and academic purposes via an OpenIFS licence (http://www.ecmwf.int/en/research/projects/openifs, last access: 2 October 2025). The IFS code modifications for nextGEMS are archived separately on Zenodo at https://doi.org/10.5281/zenodo.10223576 (Rackow et al., 2023a). The data and scripts that we used to generate the figures are available in the Open Research Data Repository of the Max Planck Society at https://doi.org/10.17617/3.QZHXMC (Segura et al., 2025b).

Data availability

The simulation data are openly accessible and archived in the World Data Center for Climate at https://doi.org/10.26050/WDCC/nextGEMS_cyc2 (Wieners et al., 2023), https://doi.org/10.26050/WDCC/nextGEMS_cyc3 (Koldunov et al., 2023), and https://doi.org/10.35095/WDCC/nextGEMS_prod_addinfov1 (Wieners et al., 2024). Namelist files and settings for the 30-year Cycle 4 production simulation with IFS-FESOM are archived on Zenodo at https://doi.org/10.5281/zenodo.14725225 (Rackow et al., 2025a). The IMERG data (Huffman et al., 2019) were downloaded from the Integrated Climate Data Center website at https://www.cen.uni-hamburg.de/en/icdc/data/atmosphere/imerg-precipitation-amount.html (last access: 2 October 2025).

Author contributions

HS, XP, PW, SM, TR, JL, ED, and IB analyzed the data, prepared the figures, and drafted the paper. MA, RA, GA, AB, JB, SB, EB, TB, SB, HB, NB, LB, SC, SD, JD, ID, PD, ME, JE, ME, RF, CF, LF, DG, PG, PG, AG, KG, MG, OG, HH, IH, KH, SH, JH, LK, AK, NK, TK, SK, SK, JK, PK, AK, RL, NM, MM, SM, KM, BN, JN, DP, UP, DP, RR, DS, DS, RS, PS, DS, DS, BS, DT, AT, AU, MV, MV, AV, SW, FW, MW, NW, KW, JW, MW, YW, FZ, and JZ contributed to the project and revised the paper. FB, DB, SB, SB, SB, PB, MD, ED, SF, EF, CH, CH, HJ, MJ, TJ, JJ, NK, DK, HK, MK, SM, OM, TM, JM, TM, EM, HP, KP, AS, PS, PS, LT, PV, IS, and BS conceptualized and contributed to the project and revised the paper.

Competing interests

At least one of the (co-)authors is a member of the editorial board of Geoscientific Model Development. The peer-review process was guided by an independent editor, and the authors also have no other competing interests to declare.

Disclaimer

Publisher’s note: Copernicus Publications remains neutral with regard to jurisdictional claims made in the text, published maps, institutional affiliations, or any other geographical representation in this paper. While Copernicus Publications makes every effort to include appropriate place names, the final responsibility lies with the authors. Also, please note that this paper has not received English language copy-editing. Views expressed in the text are those of the authors and do not necessarily reflect the views of the publisher.

Acknowledgements

This research was supported by the Horizon 2020 project nextGEMS under grant agreement no. 101003470. Most simulations were performed and analyzed on facilities of the DKRZ (HLRE-4 Levante, 2024) with resources granted under project bm1235. We would like to thank the DKRZ staff for their continued support in running the simulations and hosting and handling the data, in particular Jan Frederik Engels, Hendryk Bockelmann, Fabian Wachsmann, Irina Fast, and Carsten Beyer. We also want to thank the two anonymous reviewers for their insightful comments.

Financial support

This research has been supported by the EU Horizon 2020 (grant no. 101003470).

The article processing charges for this open-access publication were covered by the Max Planck Society.

Review statement

This paper was edited by Lele Shu and reviewed by two anonymous referees.

References

Adamidis, P., Pfister, E., Bockelmann, H., Zobel, D., Beismann, J.-O., and Jacob, M.: The real challenges for climate and weather modelling on its way to sustained exascale performance: a case study using ICON (v2.6.6), Geosci. Model Dev., 18, 905–919, https://doi.org/10.5194/gmd-18-905-2025, 2025. a

Baker, A. J., Vannière, B., and Vidale, P. L.: On the Realism of Tropical Cyclone Intensification in Global Storm-Resolving Climate Models, Geophysical Research Letters, 51, e2024GL109841, https://doi.org/10.1029/2024GL109841, 2024. a

Baldauf, M., Seifert, A., Förstner, J., Majewski, D., Raschendorfer, M., and Reinhardt, T.: Operational Convective-Scale Numerical Weather Prediction with the COSMO Model: Description and Sensitivities, Monthly Weather Review, 139, 3887–3905, https://doi.org/10.1175/MWR-D-10-05013.1, 2011. a

Bao, J., Stevens, B., Kluft, L., and Muller, C.: Intensification of daily tropical precipitation extremes from more organized convection, Science Advances, 10, eadj6801, https://doi.org/10.1126/sciadv.adj6801, 2024. a

Becker, T., Bechtold, P., and Sandu, I.: Characteristics of convective precipitation over tropical Africa in storm-resolving global simulations, Quarterly Journal of the Royal Meteorological Society, 147, 4388–4407, https://doi.org/10.1002/qj.4185, 2021. a

Berckmans, J., Woollings, T., Demory, M.-E., Vidale, P.-L., and Roberts, M.: Atmospheric blocking in a high resolution climate model: influences of mean state, orography and eddy forcing, Atmospheric Science Letters, 14, 34–40, https://doi.org/10.1002/asl2.412, 2013. a

Birch, C. E., Roberts, M. J., Garcia-Carreras, L., Ackerley, D., Reeder, M. J., Lock, A. P., and Schiemann, R.: Sea-Breeze Dynamics and Convection Initiation: The Influence of Convective Parameterization in Weather and Climate Model Biases, Journal of Climate, 28, 8093–8108, https://doi.org/10.1175/JCLI-D-14-00850.1, 2015. a

Bishnoi, A., Stein, O., Meyer, C. I., Redler, R., Eicker, N., Haak, H., Hoffmann, L., Klocke, D., Kornblueh, L., and Suarez, E.: Earth system modeling on modular supercomputing architecture: coupled atmosphere–ocean simulations with ICON 2.6.6-rc, Geosci. Model Dev., 17, 261–273, https://doi.org/10.5194/gmd-17-261-2024, 2024. a

Boussetta, S., Balsamo, G., Arduini, G., Dutra, E., McNorton, J., Choulga, M., Agustí-Panareda, A., Beljaars, A., Wedi, N., Munõz-Sabater, J., de Rosnay, P., Sandu, I., Hadade, I., Carver, G., Mazzetti, C., Prudhomme, C., Yamazaki, D., and Zsoter, E.: ECLand: The ECMWF Land Surface Modelling System, Atmosphere, 12, https://doi.org/10.3390/atmos12060723, 2021. a, b

Cahalan, R. F., Ridgway, W., Wiscombe, W. J., Bell, T. L., and Snider, J. B.: The Albedo of Fractal Stratocumulus Clouds, Journal of Atmospheric Sciences, 51, 2434–2455, https://doi.org/10.1175/1520-0469(1994)051<2434:TAOFSC>2.0.CO;2, 1994. a

Chassignet, E. P., Yeager, S. G., Fox-Kemper, B., Bozec, A., Castruccio, F., Danabasoglu, G., Horvat, C., Kim, W. M., Koldunov, N., Li, Y., Lin, P., Liu, H., Sein, D. V., Sidorenko, D., Wang, Q., and Xu, X.: Impact of horizontal resolution on global ocean–sea ice model simulations based on the experimental protocols of the Ocean Model Intercomparison Project phase 2 (OMIP-2), Geosci. Model Dev., 13, 4595–4637, https://doi.org/10.5194/gmd-13-4595-2020, 2020. a

Copernicus Climate Change Service: ORAS5 global ocean reanalysis monthly data from 1958 to present, Climate Data Store [data set], https://doi.org/10.24381/cds.67e8eeb7, 2025. a

Danilov, S., Wang, Q., Timmermann, R., Iakovlev, N., Sidorenko, D., Kimmritz, M., Jung, T., and Schröter, J.: Finite-Element Sea Ice Model (FESIM), version 2, Geosci. Model Dev., 8, 1747–1761, https://doi.org/10.5194/gmd-8-1747-2015, 2015. a, b

Danilov, S., Sidorenko, D., Wang, Q., and Jung, T.: The Finite-volumE Sea ice–Ocean Model (FESOM2), Geosci. Model Dev., 10, 765–789, https://doi.org/10.5194/gmd-10-765-2017, 2017. a

Davini, P. and D’Andrea, F.: Northern Hemisphere atmospheric blocking representation in global climate models: twenty years of improvements?, Journal of Climate, 29, 8823–8840, https://doi.org/10.1175/JCLI-D-16-0242.1, 2016. a

Davini, P., Fabiano, F., and Sandu, I.: Orographic resolution driving the improvements associated with horizontal resolution increase in the Northern Hemisphere winter mid-latitudes, Weather Clim. Dynam., 3, 535–553, https://doi.org/10.5194/wcd-3-535-2022, 2022. a

Diogoul, N., Brehmer, P., Demarcq, H., El Ayoubi, S., Thiam, A., Sarre, A., Mouget, A., and Perrot, Y.: On the robustness of an eastern boundary upwelling ecosystem exposed to multiple stressors, Scientific Reports, 11, 1908, https://doi.org/10.1038/s41598-021-81549-1, 2021. a

Dipankar, A., Stevens, B., Heinze, R., Moseley, C., Zängl, G., Giorgetta, M., and Brdar, S.: Large eddy simulation using the general circulation model ICON, Journal of Advances in Modeling Earth Systems, 7, 963–986, https://doi.org/10.1002/2015MS000431, 2015. a

Dirmeyer, P. A., Gao, X., Zhao, M., Guo, Z., Oki, T., and Hanasaki, N.: GSWP-2: Multimodel Analysis and Implications for Our Perception of the Land Surface, Bulletin of the American Meteorological Society, 87, 1381–1398, https://doi.org/10.1175/BAMS-87-10-1381, 2006. a

Dolores-Tesillos, E., Martius, O., and Quinting, J.: On the role of moist and dry processes in atmospheric blocking biases in the Euro-Atlantic region in CMIP6, Weather Clim. Dynam., 6, 471–487, https://doi.org/10.5194/wcd-6-471-2025, 2025. a

ECMWF: IFS Documentation CY46R1 – Part IV: Physical Processes, 4, ECMWF, https://doi.org/10.21957/xphfxep8c, 2019. a

ECMWF: IFS Documentation CY47R3 – Part IV Physical processes, 4, ECMWF, https://doi.org/10.21957/eyrpir4vj, 2021. a

ECMWF: IFS Documentation CY48R1 – Part III: Dynamics and Numerical Procedures, 3, ECMWF, https://doi.org/10.21957/26f0ad3473, 2023a. a, b

ECMWF: IFS Documentation CY48R1 – Part IV: Physical Processes, 4, ECMWF, https://doi.org/10.21957/02054f0fbf, 2023b. a, b, c, d, e

ECMWF: Atos Sequana XH2000, https://www.ecmwf.int/en/computing/our-facilities/supercomputer-facility (last access: 2 October 2025), 2024. a

Fielding, M., Schäfer, S., Hogan, R., and Forbes, R.: Parametrizing cloud geometry and its application in a subgrid cloud-edge erosion scheme, Q. J. Roy. Meteor. Soc., 146, 1651–1667, https://doi.org/10.1002/qj.3758, 2020. a

Gent, P. R. and Mcwilliams, J. C.: Isopycnal Mixing in Ocean Circulation Models, Journal of Physical Oceanography, 20, 150–155, https://doi.org/10.1175/1520-0485(1990)020<0150:IMIOCM>2.0.CO;2, 1990. a

Gent, P. R., Willebrand, J., McDougall, T. J., and McWilliams, J. C.: Parameterizing Eddy-Induced Tracer Transports in Ocean Circulation Models, Journal of Physical Oceanography, 25, 463–474, https://doi.org/10.1175/1520-0485(1995)025<0463:PEITTI>2.0.CO;2, 1995. a

Gentry, M. S. and Lackmann, G. M.: Sensitivity of Simulated Tropical Cyclone Structure and Intensity to Horizontal Resolution, Monthly Weather Review, 138, 688–704, https://doi.org/10.1175/2009MWR2976.1, 2010. a

Gettelman, A., Fox-Kemper, B., Flato, G., Klocke, D., Stamer, D., Stevens, B., and Vidale, P. L.: Kilometre-Scale Modelling of the Earth System: A New Paradigm for Climate Prediction, World Meteorological Organization (WMO), https://wmo.int/media/magazine-article/kilometre-scale-modelling-of-earth-system-new-paradigm-climate-prediction (last access: 2 October 2025), 2023. a

Górski, K. M., Hivon, E., Banday, A. J., Wandelt, B. D., Hansen, F. K., Reinecke, M., and Bartelmann, M.: HEALPix: A Framework for High-Resolution Discretization and Fast Analysis of Data Distributed on the Sphere, The Astrophysical Journal, 622, 759, https://doi.org/10.1086/427976, 2005. a, b

Griffies, S. M., Winton, M., Anderson, W. G., Benson, R., Delworth, T. L., Dufour, C. O., Dunne, J. P., Goddard, P., Morrison, A. K., Rosati, A., Wittenberg, A. T., Yin, J., and Zhang, R.: Impacts on Ocean Heat from Transient Mesoscale Eddies in a Hierarchy of Climate Models, Journal of Climate, 28, 952–977, https://doi.org/10.1175/JCLI-D-14-00353.1, 2015. a

Hanke, M., Redler, R., Holfeld, T., and Yastremsky, M.: YAC 1.2.0: new aspects for coupling software in Earth system modelling, Geosci. Model Dev., 9, 2755–2769, https://doi.org/10.5194/gmd-9-2755-2016, 2016. a, b

Hersbach, H., Bell, B., Berrisford, P., Hirahara, S., Horányi, A., Muñoz-Sabater, J., Nicolas, J., Peubey, C., Radu, R., Schepers, D., Simmons, A., Soci, C., Abdalla, S., Abellan, X., Balsamo, G., Bechtold, P., Biavati, G., Bidlot, J., Bonavita, M., De Chiara, G., Dahlgren, P., Dee, D., Diamantakis, M., Dragani, R., Flemming, J., Forbes, R., Fuentes, M., Geer, A., Haimberger, L., Healy, S., Hogan, R. J., Hólm, E., Janisková, M., Keeley, S., Laloyaux, P., Lopez, P., Lupu, C., Radnoti, G., de Rosnay, P., Rozum, I., Vamborg, F., Villaume, S., and Thépaut, J.-N.: The ERA5 global reanalysis, Quarterly Journal of the Royal Meteorological Society, 146, 1999–2049, https://doi.org/10.1002/qj.3803, 2020. a, b

Hewitt, H., Fox-Kemper, B., Pearson, B., Roberts, M., and Klocke, D.: The small scales of the ocean may hold the key to surprises, Nature Climate Change, 12, 496–499, https://doi.org/10.1038/s41558-022-01386-6, 2022. a

HLRE-4 Levante: Levante HPC System, Deutsches Klimarechenzentrum (DKRZ), https://www.dkrz.de/en/systems/hpc/hlre-4-levante (last access: 2 October 2025), 2024. a, b, c

Hoffmann, J., Bauer, P., Sandu, I., Wedi, N., Geenen, T., and Thiemert, D.: Destination Earth – A digital twin in support of climate services, Climate Services, 30, 100394, https://doi.org/10.1016/j.cliser.2023.100394, 2023. a

Hohenegger, C., Brockhaus, P., Bretherton, C. S., and Schär, C.: The Soil Moisture – Precipitation Feedback in Simulations with Explicit and Parameterized Convection, Journal of Climate, 22, 5003–5020, https://doi.org/10.1175/2009JCLI2604.1, 2009. a

Hohenegger, C., Korn, P., Linardakis, L., Redler, R., Schnur, R., Adamidis, P., Bao, J., Bastin, S., Behravesh, M., Bergemann, M., Biercamp, J., Bockelmann, H., Brokopf, R., Brüggemann, N., Casaroli, L., Chegini, F., Datseris, G., Esch, M., George, G., Giorgetta, M., Gutjahr, O., Haak, H., Hanke, M., Ilyina, T., Jahns, T., Jungclaus, J., Kern, M., Klocke, D., Kluft, L., Kölling, T., Kornblueh, L., Kosukhin, S., Kroll, C., Lee, J., Mauritsen, T., Mehlmann, C., Mieslinger, T., Naumann, A. K., Paccini, L., Peinado, A., Praturi, D. S., Putrasahan, D., Rast, S., Riddick, T., Roeber, N., Schmidt, H., Schulzweida, U., Schütte, F., Segura, H., Shevchenko, R., Singh, V., Specht, M., Stephan, C. C., von Storch, J.-S., Vogel, R., Wengel, C., Winkler, M., Ziemen, F., Marotzke, J., and Stevens, B.: ICON-Sapphire: simulating the components of the Earth system and their interactions at kilometer and subkilometer scales, Geosci. Model Dev., 16, 779–811, https://doi.org/10.5194/gmd-16-779-2023, 2023. a, b, c, d, e, f, g

Holloway, C. E., Woolnough, S. J., and Lister, G. M. S.: The Effects of Explicit versus Parameterized Convection on the MJO in a Large-Domain High-Resolution Tropical Case Study. Part I: Characterization of Large-Scale Organization and Propagation, Journal of the Atmospheric Sciences, 70, 1342–1369, https://doi.org/10.1175/JAS-D-12-0227.1, 2013. a

Huffman, G., Stocker, E., Bolvin, D., Nelkin, E., and Tan, J.: GPM IMERG Final Precipitation L3 Half Hourly 0.1 degree x 0.1 degree V06, GES DISC [data set], https://doi.org/10.5067/GPM/IMERG/3B-HH/06, 2019. a, b, c

ICON partnership (DWD, MPI-M, DKRZ, KIT, C2SM): ICON release 2024.01, WDC Climate [code], https://doi.org/10.35089/WDCC/IconRelease01, 2024. a

Ilyina, T., Six, K. D., Segschneider, J., Maier-Reimer, E., Li, H., and Núñez-Riboni, I.: Global ocean biogeochemistry model HAMOCC: Model architecture and performance as component of the MPI-Earth system model in different CMIP5 experimental realizations, Journal of Advances in Modeling Earth Systems, 5, 287–315, https://doi.org/10.1029/2012MS000178, 2013. a

Janssen, P.: The Interaction of Ocean Waves and Wind, Cambridge University Press, Cambridge, ISBN 9780521465403, https://doi.org/10.1017/CBO9780511525018, 2004. a

Jian, B., Li, J., Wang, G., Zhao, Y., Li, Y., Wang, J., Zhang, M., and Huang, J.: Evaluation of the CMIP6 marine subtropical stratocumulus cloud albedo and its controlling factors, Atmos. Chem. Phys., 21, 9809–9828, https://doi.org/10.5194/acp-21-9809-2021, 2021. a

Judt, F., Klocke, D., Rios-Berrios, R., Vanniere, B., Ziemen, F., Auger, L., Biercamp, J., Bretherton, C., Chen, X., Düben, P., Hohenegger, C., Khairoutdinov, M., Kodama, C., Kornblueh, L., Lin, S.-J., Nakano, M., Neumann, P., Putman, W., Röber, N., Roberts, M., Satoh, M., Shibuya, R., Stevens, B., Vidale, P. L., Wedi, N., and Zhou, L.: Tropical Cyclones in Global Storm-Resolving Models, Journal of the Meteorological Society of Japan. Ser. II, 99, 579–602, https://doi.org/10.2151/jmsj.2021-029, 2021. a

Klein, S. A. and Hartmann, D. L.: The seasonal cycle of low stratiform clouds, Journal of Climate, 6, 1587–1606, 1993. a

Koldunov, N., Kölling, T., Pedruzo-Bagazgoitia, X., Rackow, T., Redler, R., Sidorenko, D., Wieners, K.-H., and Ziemen, F. A.: nextGEMS: output of the model development cycle 3 simulations for ICON and IFS, World Data Center for Climate [data set] https://doi.org/10.26050/WDCC/nextGEMS_cyc3, 2023. a, b, c

Kollet, S. J. and Maxwell, R. M.: Integrated surface-groundwater flow modeling: A free-surface overland flow boundary condition in a parallel groundwater flow model, Advances in Water Resources, 29, 945–958, https://doi.org/10.1016/j.advwatres.2005.08.006, 2006. a

Korn, P., Brüggemann, N., Jungclaus, J. H., Lorenz, S. J., Gutjahr, O., Haak, H., Linardakis, L., Mehlmann, C., Mikolajewicz, U., Notz, D., Putrasahan, D. A., Singh, V., von Storch, J.-S., Zhu, X., and Marotzke, J.: ICON-O: The Ocean Component of the ICON Earth System Model – Global Simulation Characteristics and Local Telescoping Capability, Journal of Advances in Modeling Earth Systems, 14, e2021MS002952, https://doi.org/10.1029/2021MS002952, 2022. a, b

Koster, R. D., Dirmeyer, P. A., Guo, Z., Bonan, G., Chan, E., Cox, P., Gordon, C. T., Kanae, S., Kowalczyk, E., Lawrence, D., Liu, P., Lu, C.-H., Malyshev, S., McAvaney, B., Mitchell, K., Mocko, D., Oki, T., Oleson, K., Pitman, A., Sud, Y. C., Taylor, C. M., Verseghy, D., Vasic, R., Xue, Y., and Yamada, T.: Regions of Strong Coupling Between Soil Moisture and Precipitation, Science, 305, 1138–1140, https://doi.org/10.1126/science.1100217, 2004. a

Kuang, Z.: Linear response functions of a cumulus ensemble to temperature and moisture perturbations and implications for the dynamics of convectively coupled waves, Journal of the Atmospheric Sciences, 67, 941–962, https://doi.org/10.1175/2009JAS3260.1, 2010. a

Kölling, T., Kluft, L., and Rackow, T.: gribscan (v0.0.10), Zenodo [code], https://doi.org/10.5281/zenodo.10625189, 2024. a

Lee, J. and Hohenegger, C.: Weaker land–atmosphere coupling in global storm-resolving simulation, Proceedings of the National Academy of Sciences, 121, e2314265121, https://doi.org/10.1073/pnas.2314265121, 2024. a, b

Lee, J., Hohenegger, C., Chlond, A., and Schnur, R.: The Climatic Role of Interactive Leaf Phenology in the Vegetation- Atmosphere System of Radiative-Convective Equilibrium Storm-Resolving Simulations, Tellus B: Chemical and Physical Meteorology, https://doi.org/10.16993/tellusb.26, 2022. a

Leuenberger, D., Koller, M., Fuhrer, O., and Schär, C.: A Generalization of the SLEVE Vertical Coordinate, Monthly Weather Review, 138, 3683–3689, https://doi.org/10.1175/2010MWR3307.1, 2010. a

Lin, J. L.: The double-ITCZ problem in IPCC AR4 coupled GCMs: Ocean-atmosphere feedback analysis, Journal of Climate, 20, 4497–4525, https://doi.org/10.1175/JCLI4272.1, 2007. a, b

Loeb, N. G., Doelling, D. R., Wang, H., Su, W., Nguyen, C., Corbett, J. G., Liang, L., Mitrescu, C., Rose, F. G., and Kato, S.: Clouds and the Earth's Radiant Energy System (CERES) Energy Balanced and Filled (EBAF) Top-of-Atmosphere (TOA) Edition-4.0 Data Product, Journal of Climate, 31, 895–918, https://doi.org/10.1175/JCLI-D-17-0208.1, 2018. a, b, c, d, e

Louis, J.-F.: A parametric model of vertical eddy fluxes in the atmosphere, Boundary-Layer Meteorology, 17, 187–202, https://doi.org/10.1007/BF00117978, 1979. a

LUMI: Large Unified Modern Infrastructure, https://www.lumi-supercomputer.eu (last access: 2 October 2025), 2024. a

Maciel, P., Quintino, T., Modigliani, U., Dando, P., Raoult, B., Deconinck, W., Rathgeber, F., and Simarro, C.: The new ECMWF interpolation package MIR, ECMWF, https://doi.org/10.21957/h20rz8, 2017. a, b

Madec, G. and the NEMO System Team: NEMO Ocean Engine Reference Manual, Zenodo, https://doi.org/10.5281/zenodo.8167700, 2023. a

Malardel, S. and Wedi, N. P.: How does subgrid-scale parametrization influence nonlinear spectral energy fluxes in global NWP models?, Journal of Geophysical Research: Atmospheres, 121, 5395–5410, https://doi.org/10.1002/2015JD023970, 2016. a

Maltrud, M. E. and McClean, J. L.: An eddy resolving global 1/10° ocean simulation, Ocean Modelling, 8, 31–54, https://doi.org/10.1016/j.ocemod.2003.12.001, 2005. a

Mauritsen, T., Stevens, B., Roeckner, E., Crueger, T., Esch, M., Giorgetta, M., Haak, H., Jungclaus, J., Klocke, D., Matei, D., Mikolajewicz, U., Notz, D., Pincus, R., Schmidt, H., and Tomassini, L.: Tuning the climate of a global model, Journal of Advances in Modeling Earth Systems, 4, https://doi.org/10.1029/2012MS000154, 2012. a, b

Mauritsen, T., Redler, R., Esch, M., Stevens, B., Hohenegger, C., Klocke, D., Brokopf, R., Haak, H., Linardakis, L., Röber, N., and Schnur, R.: Early Development and Tuning of a Global Coupled Cloud Resolving Model, and its Fast Response to Increasing CO2, Tellus A: Dynamic Meteorology and Oceanography, https://doi.org/10.16993/tellusa.54, 2022. a, b, c

McNorton, J., Agustí-Panareda, A., Arduini, G., Balsamo, G., Bousserez, N., Boussetta, S., Chericoni, M., Choulga, M., Engelen, R., and Guevara, M.: An Urban Scheme for the ECMWF Integrated Forecasting System: Global Forecasts and Residential CO2 Emissions, Journal of Advances in Modeling Earth Systems, 15, e2022MS003286, https://doi.org/10.1029/2022MS003286, 2023. a

McNorton, J. R., Arduini, G., Bousserez, N., Agustí-Panareda, A., Balsamo, G., Boussetta, S., Choulga, M., Hadade, I., and Hogan, R. J.: An Urban Scheme for the ECMWF Integrated Forecasting System: Single-Column and Global Offline Application, Journal of Advances in Modeling Earth Systems, 13, e2020MS002375, https://doi.org/10.1029/2020MS002375, 2021. a

Mechoso, C. R., Robertson, A. W., Barth, N., Davey, M. K., Delecluse, P., Gent, P. R., Ineson, S., Kirtman, B., Latif, M., Treut, H. L., Nagai, T., Neelin, J. D., Philander, S. G. H., Polcher, J., Schopf, P. S., Stockdale, T., Suarez, M. J., Terray, L., Thual, O., and Tribbia, J. J.: The Seasonal Cycle over the Tropical Pacific in Coupled Ocean – Atmosphere General Circulation Models, Monthly Weather Review, 123, 2825–2838, https://doi.org/10.1175/1520-0493(1995)123<2825:TSCOTT>2.0.CO;2, 1995. a

Miura, H., Satoh, M., Nasuno, T., Noda, A. T., and Oouchi, K.: A Madden-Julian Oscillation Event Realistically Simulated by a Global Cloud-Resolving Model, Science, 318, 1763–1765, https://doi.org/10.1126/science.1148443, 2007. a

Morice, C. P., Kennedy, J. J., Rayner, N. A., Winn, J. P., Hogan, E., Killick, R. E., Dunn, R. J. H., Osborn, T. J., Jones, P. D., and Simpson, I. R.: An Updated Assessment of Near-Surface Temperature Change From 1850: The HadCRUT5 Data Set, Journal of Geophysical Research: Atmospheres, 126, e2019JD032361, https://doi.org/10.1029/2019JD032361, 2021. a, b, c

Nowak, J. L., Siebert, H., Szodry, K.-E., and Malinowski, S. P.: Coupled and decoupled stratocumulus-topped boundary layers: turbulence properties, Atmos. Chem. Phys., 21, 10965–10991, https://doi.org/10.5194/acp-21-10965-2021, 2021. a

Nowak, J. L., Dragaud, I. C., Lee, J., Dziekan, P., Mellado, J. P., and Stevens, B.: A first look at the global climatology of low-level clouds in storm resolving models, Journal of Advances in Modeling Earth Systems, 17, e2024MS004340, https://doi.org/10.1029/2024MS004340, 2025. a

O'Neill, B. C., Tebaldi, C., van Vuuren, D. P., Eyring, V., Friedlingstein, P., Hurtt, G., Knutti, R., Kriegler, E., Lamarque, J.-F., Lowe, J., Meehl, G. A., Moss, R., Riahi, K., and Sanderson, B. M.: The Scenario Model Intercomparison Project (ScenarioMIP) for CMIP6, Geosci. Model Dev., 9, 3461–3482, https://doi.org/10.5194/gmd-9-3461-2016, 2016. a, b

Penduff, T., Juza, M., Brodeau, L., Smith, G. C., Barnier, B., Molines, J.-M., Treguier, A.-M., and Madec, G.: Impact of global ocean model resolution on sea-level variability with emphasis on interannual time scales, Ocean Sci., 6, 269–284, https://doi.org/10.5194/os-6-269-2010, 2010. a

Peters, K., Hohenegger, C., and Klocke, D.: Different representation of mesoscale convective systems in convection-permitting and convection-parameterizing NWP models and its implications for large-scale forecast evolution, Atmosphere (Basel), 10, 1–18, https://doi.org/10.3390/atmos10090503, 2019. a

Pincus, R., Mlawer, E. J., and Delamere, J. S.: Balancing Accuracy, Efficiency, and Flexibility in Radiation Calculations for Dynamical Models, Journal of Advances in Modeling Earth Systems, 11, 3074–3089, https://doi.org/10.1029/2019MS001621, 2019. a, b, c

Prein, A. F., Langhans, W., Fosser, G., Ferrone, A., Ban, N., Goergen, K., Keller, M., Tölle, M., Gutjahr, O., Feser, F., Brisson, E., Kollet, S., Schmidli, J., van Lipzig, N. P. M., and Leung, R.: A review on regional convection-permitting climate modeling: Demonstrations, prospects, and challenges, Reviews of Geophysics, 53, 323–361, https://doi.org/10.1002/2014RG000475, 2015. a

Prein, A. F., Rasmussen, R., and Stephens, G.: Challenges and advances in convection-permitting climate modeling, Bulletin of the American Meteorological Society, 98, 1027–1030, https://doi.org/10.1175/BAMS-D-16-0263.1, 2017. a

Prein, A. F., Rasmussen, R., Castro, C. L., Dai, A., and Minder, J.: Special issue: Advances in convection-permitting climate modeling, Climate Dynamics, 55, 1–2, https://doi.org/10.1007/s00382-020-05240-3, 2020. a

Proske, U., Brüggemann, N., Gärtner, J. P., Gutjahr, O., Haak, H., Putrasahan, D., and Wieners, K.-H.: A case for open communication of bugs in climate models, made with ICON version 2024.01, EGUsphere [preprint], https://doi.org/10.5194/egusphere-2024-3493, 2024. a

Rackow, T., Danilov, S., Goessling, H. F., Hellmer, H. H., Sein, D. V., Semmler, T., Sidorenko, D., and Jung, T.: Delayed Antarctic sea-ice decline in high-resolution climate change simulations, Nature Communications, 13, 637, https://doi.org/10.1038/s41467-022-28259-y, 2022. a

Rackow, T., Becker, T., Forbes, R., and Fielding, M.: Source code changes to the Integrated Forecasting System (IFS) for nextGEMS simulations, Zenodo [code], https://doi.org/10.5281/zenodo.10223577, 2023a. a

Rackow, T., Hegewald, J., Koldunov, N. V., Mogensen, K., Scholz, P., Sidorenko, D., and Streffing, J.: FESOM2.5 source code used in nextGEMS Cycle 3 simulations with IFS-FESOM, Zenodo [code], https://doi.org/10.5281/zenodo.10225420, 2023b. a

Rackow, T., Becker, T., and Pedruzo-Bagazgoitia, X.: Namelist files and settings for 30-year km-scale nextGEMS Cycle 4 production simulations with IFS-FESOM, Zenodo [data set] https://doi.org/10.5281/zenodo.14725225, 2025a. a

Rackow, T., Pedruzo-Bagazgoitia, X., Becker, T., Milinski, S., Sandu, I., Aguridan, R., Bechtold, P., Beyer, S., Bidlot, J., Boussetta, S., Deconinck, W., Diamantakis, M., Dueben, P., Dutra, E., Forbes, R., Ghosh, R., Goessling, H. F., Hadade, I., Hegewald, J., Jung, T., Keeley, S., Kluft, L., Koldunov, N., Koldunov, A., Kölling, T., Kousal, J., Kühnlein, C., Maciel, P., Mogensen, K., Quintino, T., Polichtchouk, I., Reuter, B., Sármány, D., Scholz, P., Sidorenko, D., Streffing, J., Sützl, B., Takasuka, D., Tietsche, S., Valentini, M., Vannière, B., Wedi, N., Zampieri, L., and Ziemen, F.: Multi-year simulations at kilometre scale with the Integrated Forecasting System coupled to FESOM2.5 and NEMOv3.4, Geosci. Model Dev., 18, 33–69, https://doi.org/10.5194/gmd-18-33-2025, 2025b. a, b, c, d, e, f, g, h, i, j, k, l, m, n

Rayner, N. A., Parker, D. E., Horton, E. B., Folland, C. K., Alexander, L. V., Rowell, D. P., Kent, E. C., and Kaplan, A.: Global analyses of sea surface temperature, sea ice, and night marine air temperature since the late nineteenth century, Journal of Geophysical Research: Atmospheres, 108, https://doi.org/10.1029/2002JD002670, 2003. a

Reick, C. H., Gayler, V., Goll, D., Hagemann, S., Heidkamp, M., Nabel, J., Raddatz, T., Roeckner, E., Schnur, R., and Wilkenskjeld, S.: JSACH 3 – The land component of the MPI Earth System Model: Documentation of version 3.2, Tech. Rep. 240, Max-Planck-Institut für Meteorologie, https://doi.org/10.17617/2.3279802, 2021. a, b

Roussillon, J., Fablet, R., Gorgues, T., Drumetz, L., Littaye, J., and Martinez, E.: A Multi-Mode Convolutional Neural Network to reconstruct satellite-derived chlorophyll-a time series in the global ocean from physical drivers, Frontiers in Marine Science, 10, https://doi.org/10.3389/fmars.2023.1077623, 2023. a

Sanderson, B. M. and Rugenstein, M.: Potential for bias in effective climate sensitivity from state-dependent energetic imbalance, Earth Syst. Dynam., 13, 1715–1736, https://doi.org/10.5194/esd-13-1715-2022, 2022. a

Sandu, I.: Destination Earth’s digital twins and Digital Twin Engine – state of play, ECMWF, https://doi.org/10.21957/is1fc736jx, 2024. a

Sandu, I., van Niekerk, A., Shepherd, T. G., Vosper, S. B., Zadra, A., Bacmeister, J., Beljaars, A., Brown, A. R., Dörnbrack, A., McFarlane, N., Pithan, F., and Svensson, G.: Impacts of orography on large-scale atmospheric circulation, npj Climate and Atmospheric Science, 2, 10, https://doi.org/10.1038/s41612-019-0065-9, 2019. a

Sarmany, D., Valentini, M., Maciel, P., Geier, P., Smart, S., Aguridan, R., Hawkes, J., and Quintino, T.: MultIO: A Framework for Message-Driven Data Routing For Weather and Climate Simulations, in: Proceedings of the Platform for Advanced Scientific Computing Conference, PASC '24, Association for Computing Machinery, New York, NY, USA, ISBN 9798400706394, https://doi.org/10.1145/3659914.3659938, 2024. a, b, c

Sármány, D., Geier, P., Valentini, M., Quintino, T., Nobel, K., Smart, S., Reuter, S., Rathgeber, F., Maciel, P., Tuma, V., Cook, H., Iffrig-Petit, O., Aguridan, R., Raoult, B., Deconinck, W., Danovaro, E., Vuckovic, D., Recman, J., and Bento, M.: ecmwf/multio: 2.1.4 (2.1.4), Zenodo [code], https://doi.org/10.5281/zenodo.17302459, 2024.

Sarre, A., Demarcq, H., Keenlyside, N., Krakstad, J.-O., El Ayoubi, S., Jeyid, A. M., Faye, S., Mbaye, A., Sidibeh, M., and Brehmer, P.: Climate change impacts on small pelagic fish distribution in Northwest Africa: trends, shifts, and risk for food security, Scientific Reports, 14, 12684, https://doi.org/10.1038/s41598-024-61734-8, 2024. a

Satoh, M., Matsuno, T., Tomita, H., Miura, H., Nasuno, T., and Iga, S.: Nonhydrostatic icosahedral atmospheric model (NICAM) for global cloud resolving simulations, Journal of Computational Physics, 227, 3486–3514, https://doi.org/10.1016/j.jcp.2007.02.006, 2008. a

Schär, C., Fuhrer, O., Arteaga, A., Ban, N., Charpilloz, C., Girolamo, S. D., Hentgen, L., Hoefler, T., Lapillonne, X., Leutwyler, D., Osterried, K., Panosetti, D., Rüdisühli, S., Schlemmer, L., Schulthess, T. C., Sprenger, M., Ubbiali, S., and Wernli, H.: Kilometer-Scale Climate Models: Prospects and Challenges, Bulletin of the American Meteorological Society, 101, E567–E587, https://doi.org/10.1175/BAMS-D-18-0167.1, 2020. a

Schiemann, R., Demory, M.-E., Shaffrey, L. C., Strachan, J., Vidale, P. L., Mizielinski, M. S., Roberts, M. J., Matsueda, M., Wehner, M. F., and Jung, T.: The resolution sensitivity of Northern Hemisphere blocking in four 25-km atmospheric global circulation models, Journal of Climate, 30, 337–358, https://doi.org/10.1175/JCLI-D-16-0100.1, 2017. a

Schiemann, R., Athanasiadis, P., Barriopedro, D., Doblas-Reyes, F., Lohmann, K., Roberts, M. J., Sein, D. V., Roberts, C. D., Terray, L., and Vidale, P. L.: Northern Hemisphere blocking simulation in current climate models: evaluating progress from the Climate Model Intercomparison Project Phase 5 to 6 and sensitivity to resolution, Weather Clim. Dynam., 1, 277–292, https://doi.org/10.5194/wcd-1-277-2020, 2020. a

Scholz, P., Sidorenko, D., Gurses, O., Danilov, S., Koldunov, N., Wang, Q., Sein, D., Smolentseva, M., Rakowsky, N., and Jung, T.: Assessment of the Finite-volumE Sea ice-Ocean Model (FESOM2.0) – Part 1: Description of selected key model elements and comparison to its predecessor version, Geosci. Model Dev., 12, 4875–4899, https://doi.org/10.5194/gmd-12-4875-2019, 2019. a, b

Schwierz, C., Croci-Maspoli, M., and Davies, H. C.: Perspicacious indicators of atmospheric blocking, Geophysical Research Letters, 31, https://doi.org/10.1029/2003GL019341, 2004. a

Segura, H., Hohenegger, C., Wengel, C., and Stevens, B.: Learning by Doing: Seasonal and Diurnal Features of Tropical Precipitation in a Global-Coupled Storm-Resolving Model, Geophysical Research Letters, 49, 1–10, https://doi.org/10.1029/2022GL101796, 2022. a, b

Segura, H., Bayley, C., Fiévet, R., Glöckner, H., Günther, M., Kluft, L., Naumann, A. K., Ortega, S., Praturi, D. S., Rixen, M., Schmidt, H., Winkler, M., Hohenegger, C., and Stevens, B.: A Single Tropical Rainbelt in Global Storm-Resolving Models: The Role of Surface Heat Fluxes Over the Warm Pool, Journal of Advances in Modeling Earth Systems, 17, https://doi.org/10.1029/2024MS004897, 2025a. a, b, c

Segura, H., Pedruzo-Bagazgoitia, X., Weiss, P., Müller, S. K., Rackow, T., Lee, J., Dolores Tesillos, E., and Benedict, I.: nextGEMS overview paper: scripts, Edmond [code], https://doi.org/10.17617/3.QZHXMC, 2025b. a

Senf, F., Klocke, D., and Brueck, M.: Size-Resolved Evaluation of Simulated Deep Tropical Convection, Monthly Weather Review, 146, 2161–2182, https://doi.org/10.1175/MWR-D-17-0378.1, 2018. a

Simmons, A. J. and Strüfing, R.: Numerical forecasts of stratospheric warming events using a model with a hybrid vertical coordinate, Quarterly Journal of the Royal Meteorological Society, 109, 81–111, https://doi.org/10.1002/qj.49710945905, 1983. a

Slingo, J., Bates, P., Bauer, P., Belcher, S., Palmer, T., Stephens, G., Stevens, B., Stocker, T., and Teutsch, G.: Ambitious partnership needed for reliable climate prediction, Nature Climate Change, 12, 499–503, https://doi.org/10.1038/s41558-022-01384-8, 2022. a

Stenchikov, G. L., Kirchner, I., Robock, A., Graf, H.-F., Antuña, J. C., Grainger, R. G., Lambert, A., and Thomason, L.: Radiative forcing from the 1991 Mount Pinatubo volcanic eruption, Journal of Geophysical Research: Atmospheres, 103, 13837–13857, https://doi.org/10.1029/98JD00693, 1998. a, b

Stevens, B., Fiedler, S., Kinne, S., Peters, K., Rast, S., Müsse, J., Smith, S. J., and Mauritsen, T.: MACv2-SP: a parameterization of anthropogenic aerosol optical properties and an associated Twomey effect for use in CMIP6, Geosci. Model Dev., 10, 433–452, https://doi.org/10.5194/gmd-10-433-2017, 2017. a, b

Stevens, B., Adami, S., Ali, T., Anzt, H., Aslan, Z., Attinger, S., Bäck, J., Baehr, J., Bauer, P., Bernier, N., Bishop, B., Bockelmann, H., Bony, S., Brasseur, G., Bresch, D. N., Breyer, S., Brunet, G., Buttigieg, P. L., Cao, J., Castet, C., Cheng, Y., Dey Choudhury, A., Coen, D., Crewell, S., Dabholkar, A., Dai, Q., Doblas-Reyes, F., Durran, D., El Gaidi, A., Ewen, C., Exarchou, E., Eyring, V., Falkinhoff, F., Farrell, D., Forster, P. M., Frassoni, A., Frauen, C., Fuhrer, O., Gani, S., Gerber, E., Goldfarb, D., Grieger, J., Gruber, N., Hazeleger, W., Herken, R., Hewitt, C., Hoefler, T., Hsu, H.-H., Jacob, D., Jahn, A., Jakob, C., Jung, T., Kadow, C., Kang, I.-S., Kang, S., Kashinath, K., Kleinen-von Königslöw, K., Klocke, D., Kloenne, U., Klöwer, M., Kodama, C., Kollet, S., Kölling, T., Kontkanen, J., Kopp, S., Koran, M., Kulmala, M., Lappalainen, H., Latifi, F., Lawrence, B., Lee, J. Y., Lejeune, Q., Lessig, C., Li, C., Lippert, T., Luterbacher, J., Manninen, P., Marotzke, J., Matsuoka, S., Merchant, C., Messmer, P., Michel, G., Michielsen, K., Miyakawa, T., Müller, J., Munir, R., Narayanasetti, S., Ndiaye, O., Nobre, C., Oberg, A., Oki, R., Özkan-Haller, T., Palmer, T., Posey, S., Prein, A., Primus, O., Pritchard, M., Pullen, J., Putrasahan, D., Quaas, J., Raghavan, K., Ramaswamy, V., Rapp, M., Rauser, F., Reichstein, M., Revi, A., Saluja, S., Satoh, M., Schemann, V., Schemm, S., Schnadt Poberaj, C., Schulthess, T., Senior, C., Shukla, J., Singh, M., Slingo, J., Sobel, A., Solman, S., Spitzer, J., Stier, P., Stocker, T., Strock, S., Su, H., Taalas, P., Taylor, J., Tegtmeier, S., Teutsch, G., Tompkins, A., Ulbrich, U., Vidale, P.-L., Wu, C.-M., Xu, H., Zaki, N., Zanna, L., Zhou, T., and Ziemen, F.: Earth Virtualization Engines (EVE), Earth Syst. Sci. Data, 16, 2113–2122, https://doi.org/10.5194/essd-16-2113-2024, 2024. a

Stockdale, T., Senan, R., Hogan, R., Kipling, Z., and Flemming, J.: A new time-varying tropospheric aerosol climatology for the IFS, ECMWF, https://doi.org/10.21957/ba371f56ts, 2024. a

Takasuka, D., Kodama, C., Suematsu, T., Ohno, T., Yamada, Y., Seiki, T., Yashiro, H., Nakano, M., Miura, H., Noda, A. T., Nasuno, T., Miyakawa, T., and Masunaga, R.: How Can We Improve the Seamless Representation of Climatological Statistics and Weather Toward Reliable Global K-Scale Climate Simulations?, Journal of Advances in Modeling Earth Systems, 16, https://doi.org/10.1029/2023MS003701, 2024. a, b, c

Tian, Y. and Kuang, Z.: Why does deep convection have different sensitivities to temperature perturbations in the lower versus upper troposphere?, Journal of the Atmospheric Sciences, 76, 27–41, https://doi.org/10.1175/JAS-D-18-0023.1, 2019. a

van Westen, R. M. and Dijkstra, H. A.: Ocean eddies strongly affect global mean sea-level projections, Science Advances, 7, eabf1674, https://doi.org/10.1126/sciadv.abf1674, 2021. a

Wang, Q., Danilov, S., Sidorenko, D., Timmermann, R., Wekerle, C., Wang, X., Jung, T., and Schröter, J.: The Finite Element Sea Ice-Ocean Model (FESOM) v.1.4: formulation of an ocean general circulation model, Geosci. Model Dev., 7, 663–693, https://doi.org/10.5194/gmd-7-663-2014, 2014. a

Wazneh, H., Gachon, P., Laprise, R., de Vernal, A., and Tremblay, B.: Atmospheric blocking events in the North Atlantic: trends and links to climate anomalies and teleconnections, Climate Dynamics, 56, 2199–2221, https://doi.org/10.1007/s00382-020-05583-x, 2021. a

Weiss, P., Herbert, R., and Stier, P.: ICON-HAM-lite 1.0: simulating the Earth system with interactive aerosols at kilometer scales, Geosci. Model Dev., 18, 3877–3894, https://doi.org/10.5194/gmd-18-3877-2025, 2025. a, b

Wieners, K.-H., Ziemen, F. A., Koldunov, N., Pedruzo-Bagazgoitia, X., Rackow, T., Redler, R., Sidorenko, D., and Kölling, T.: nextGEMS: output of the model development cycle 2 simulations for ICON and IFS, World Data Center for Climate [data set], https://doi.org/10.26050/WDCC/nextGEMS_cyc2, 2023. a, b, c

Wieners, K.-H., Rackow, T., Aguridan, R., Becker, T., Beyer, S., Cheedela, S. K., Dreier, N.-A., Engels, J. F., Esch, M., Frauen, C., Klocke, D., Kölling, T., Pedruzo-Bagazgoitia, X., Putrasahan, D., Sidorenko, D., Schnur, R., Stevens, B., and Zimmermann, J.: nextGEMS: output of the production simulations for ICON and IFS, World Data Center for Climate [data set], https://doi.org/10.35095/WDCC/nextGEMS_prod_addinfov1, 2024. a, b, c

Wood, R. and Bretherton, C. S.: On the Relationship between Stratiform Low Cloud Cover and Lower-Tropospheric Stability, Journal of Climate, 19, 6425–6432, https://doi.org/10.1175/JCLI3988.1, 2006. a

Woollings, T., Barriopedro, D., Methven, J., Son, S.-W., Martius, O., Harvey, B., Sillmann, J., Lupo, A. R., and Seneviratne, S.: Blocking and its Response to Climate Change, Current Climate Change Reports, 4, 287–300, https://doi.org/10.1007/s40641-018-0108-z, 2018. a, b, c

Yamazaki, D., Kanae, S., Kim, H., and Oki, T.: A physically based description of floodplain inundation dynamics in a global river routing model, Water Resources Research, 47, https://doi.org/10.1029/2010WR009726, 2011. a

Yamazaki, D., Sato, T., Kanae, S., Hirabayashi, Y., and Bates, P. D.: Regional flood dynamics in a bifurcating mega delta simulated in a global river model, Geophysical Research Letters, 41, 3127–3135, https://doi.org/10.1002/2014GL059744, 2014. a

Zängl, G., Reinert, D., Rípodas, P., and Baldauf, M.: The ICON (ICOsahedral Non-hydrostatic) modelling framework of DWD and MPI-M: Description of the non-hydrostatic dynamical core, Quarterly Journal of the Royal Meteorological Society, 141, 563–579, https://doi.org/10.1002/qj.2378, 2015. a, b

Executive editor
Kilometre-scale global climate models are pivotal for delivering nuanced regional climate insights and informing climate action, though they face the formidable challenge of balancing computational demands with the precision required to simulate complex subgrid processes. This paper is a landmark in climate modelling, demonstrating the potential of kilometre-scale models to enhance regional climate understanding. It overcomes computational hurdles to achieve high-resolution simulations, crucial for capturing mesoscale phenomena. The authors' transparent exploration of challenges and successes makes this a vital read for climate scientists, offering insights into the future of climate modelling and its applications in climate action.
Short summary
The Next Generation of Earth Modeling Systems (nextGEMS) project developed two Earth system models that use horizontal grid spacings of 10 km and finer, improving the representation of local phenomena globally. In its fourth cycle, nextGEMS simulated the Earth system climate over the 2020–2049 period under the SSP3-7.0 scenario. Here, we provide an overview of nextGEMS, insights into the model development, and an assessment of the realism of multi-decadal, kilometer-scale simulations.