<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE article PUBLIC "-//NLM//DTD Journal Publishing with OASIS Tables v3.0 20080202//EN" "https://jats.nlm.nih.gov/nlm-dtd/publishing/3.0/journalpub-oasis3.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink" xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:oasis="http://docs.oasis-open.org/ns/oasis-exchange/table" xml:lang="en" dtd-version="3.0" article-type="research-article"><?xmltex \bartext{Development and technical paper}?>
  <front>
    <journal-meta><journal-id journal-id-type="publisher">GMD</journal-id><journal-title-group>
    <journal-title>Geoscientific Model Development</journal-title>
    <abbrev-journal-title abbrev-type="publisher">GMD</abbrev-journal-title><abbrev-journal-title abbrev-type="nlm-ta">Geosci. Model Dev.</abbrev-journal-title>
  </journal-title-group><issn pub-type="epub">1991-9603</issn><publisher>
    <publisher-name>Copernicus Publications</publisher-name>
    <publisher-loc>Göttingen, Germany</publisher-loc>
  </publisher></journal-meta>
    <article-meta>
      <article-id pub-id-type="doi">10.5194/gmd-17-4017-2024</article-id><title-group><article-title>Efficient and stable coupling of the SuperdropNet deep-learning-based cloud microphysics (v0.1.0) with<?xmltex \hack{\break}?> the ICON climate and weather model (v2.6.5)</article-title><alt-title>Coupling SuperdropNet with ICON</alt-title>
      </title-group><?xmltex \runningtitle{Coupling SuperdropNet with ICON}?><?xmltex \runningauthor{C. Arnold et al.}?>
      <contrib-group>
        <contrib contrib-type="author" corresp="yes" rid="aff1 aff2 aff3">
          <name><surname>Arnold</surname><given-names>Caroline</given-names></name>
          <email>arnold@dkrz.de</email>
        <ext-link>https://orcid.org/0000-0002-9458-1517</ext-link></contrib>
        <contrib contrib-type="author" corresp="yes" rid="aff2 aff3 aff4">
          <name><surname>Sharma</surname><given-names>Shivani</given-names></name>
          <email>shivani.sharma@hereon.de</email>
        <ext-link>https://orcid.org/0000-0001-6973-5660</ext-link></contrib>
        <contrib contrib-type="author" corresp="no" rid="aff1 aff2 aff3">
          <name><surname>Weigel</surname><given-names>Tobias</given-names></name>
          
        <ext-link>https://orcid.org/0000-0002-4040-0215</ext-link></contrib>
        <contrib contrib-type="author" corresp="no" rid="aff2 aff3">
          <name><surname>Greenberg</surname><given-names>David S.</given-names></name>
          
        </contrib>
        <aff id="aff1"><label>1</label><institution>German Climate Computing Center (DKRZ), Hamburg, Germany</institution>
        </aff>
        <aff id="aff2"><label>2</label><institution>Helmholtz-Zentrum Hereon, Geesthacht, Germany</institution>
        </aff>
        <aff id="aff3"><label>3</label><institution>Helmholtz AI, Munich, Germany</institution>
        </aff>
        <aff id="aff4"><label>4</label><institution>International Max Planck Research School on Earth System Modelling, Hamburg, Germany</institution>
        </aff>
      </contrib-group>
      <author-notes><corresp id="corr1">Caroline Arnold (arnold@dkrz.de) and Shivani Sharma (shivani.sharma@hereon.de)</corresp></author-notes><pub-date><day>16</day><month>May</month><year>2024</year></pub-date>
      
      <volume>17</volume>
      <issue>9</issue>
      <fpage>4017</fpage><lpage>4029</lpage>
      <history>
        <date date-type="received"><day>6</day><month>September</month><year>2023</year></date>
           <date date-type="rev-request"><day>15</day><month>November</month><year>2023</year></date>
           <date date-type="rev-recd"><day>1</day><month>March</month><year>2024</year></date>
           <date date-type="accepted"><day>17</day><month>March</month><year>2024</year></date>
      </history>
      <permissions>
        <copyright-statement>Copyright: © 2024 Caroline Arnold et al.</copyright-statement>
        <copyright-year>2024</copyright-year>
      <license license-type="open-access"><license-p>This work is licensed under the Creative Commons Attribution 4.0 International License. To view a copy of this licence, visit <ext-link ext-link-type="uri" xlink:href="https://creativecommons.org/licenses/by/4.0/">https://creativecommons.org/licenses/by/4.0/</ext-link></license-p></license></permissions><self-uri xlink:href="https://gmd.copernicus.org/articles/17/4017/2024/gmd-17-4017-2024.html">This article is available from https://gmd.copernicus.org/articles/17/4017/2024/gmd-17-4017-2024.html</self-uri><self-uri xlink:href="https://gmd.copernicus.org/articles/17/4017/2024/gmd-17-4017-2024.pdf">The full text article is available as a PDF file from https://gmd.copernicus.org/articles/17/4017/2024/gmd-17-4017-2024.pdf</self-uri>
      <abstract><title>Abstract</title>

      <p id="d1e133">Machine learning (ML) algorithms can be used in Earth system models (ESMs) to emulate sub-grid-scale processes. Due to the statistical nature of ML algorithms and the high complexity of ESMs, these hybrid ML ESMs require careful validation. Simulation stability needs to be monitored in fully coupled simulations, and the plausibility of results needs to be evaluated in suitable experiments.</p>

      <p id="d1e136">We present the coupling of SuperdropNet, a machine learning model for emulating warm-rain processes in cloud microphysics, with the ICON (Icosahedral Nonhydrostatic) model v2.6.5. SuperdropNet is trained on computationally expensive droplet-based simulations and can serve as an inexpensive proxy within weather prediction models. SuperdropNet emulates the collision–coalescence of rain and cloud droplets in a warm-rain scenario and replaces the collision–coalescence process in the two-moment cloud microphysics scheme.</p>

      <p id="d1e139">We address the technical challenge of integrating SuperdropNet, developed in Python and PyTorch, into ICON, written in Fortran, by implementing three different coupling strategies: embedded Python via the C foreign function interface (CFFI), pipes, and coupling of program components via Yet Another Coupler (YAC).</p>

      <p id="d1e142">We validate the emulator in the warm-bubble scenario and find that SuperdropNet runs stably within the experiment. By comparing experiment outcomes of the two-moment bulk scheme with SuperdropNet, we find that the results are physically consistent and discuss differences that are observed in several diagnostic variables.</p>

      <p id="d1e145">In addition, we provide a quantitative and qualitative computational benchmark for three different coupling strategies – embedded Python, the YAC coupler, and pipes – and find that embedded Python is a useful software tool for validating hybrid ML ESMs.</p>
  </abstract>
    </article-meta>
  </front>
<body>
      

<sec id="Ch1.S1" sec-type="intro">
  <label>1</label><title>Introduction</title>
      <?pagebreak page4018?><p id="d1e157">Machine learning (ML) is increasingly used in Earth system models (ESMs) to emulate sub-grid-scale processes that are typically parameterized or neglected due to their high computational cost <xref ref-type="bibr" rid="bib1.bibx19 bib1.bibx22 bib1.bibx29 bib1.bibx25" id="paren.1"/>. ML algorithms are statistical models trained on data. Neural networks are a widely used class of ML algorithms. They contain trainable parameters (i.e., the weights and biases) that are learned from data by minimizing a cost function. The trained algorithm can then be used for inference (i.e., application to unseen data of the same kind). When sub-grid-scale processes are replaced by ML algorithms, one aim can be to speed up the overall simulation by emulating the existing parameterization. This was first established using neural networks to emulate long-wave radiative transfer <xref ref-type="bibr" rid="bib1.bibx17 bib1.bibx30" id="paren.2"/>. Recent examples include the emulation of the gravity wave drag <xref ref-type="bibr" rid="bib1.bibx16" id="paren.3"/>, the cloud microphysics <xref ref-type="bibr" rid="bib1.bibx15" id="paren.4"/>, the ocean in a coupled climate model <xref ref-type="bibr" rid="bib1.bibx49" id="paren.5"/>, and the cloud radiative effects <xref ref-type="bibr" rid="bib1.bibx32" id="paren.6"/>.</p>
      <p id="d1e179">Other studies aim to improve the overall description of the Earth system by providing a better parameterization. ML algorithms can be trained on high-resolution ESM output or even on separately simulated processes to emulate resolved processes in a low-resolution simulation, for example, for gravity waves <xref ref-type="bibr" rid="bib1.bibx21" id="paren.7"/>, cloud cover parameterizations <xref ref-type="bibr" rid="bib1.bibx26" id="paren.8"/>, general parameterizations <xref ref-type="bibr" rid="bib1.bibx11" id="paren.9"/>, sub-grid-scale momentum transport <xref ref-type="bibr" rid="bib1.bibx50" id="paren.10"/>, effects of cloud-resolving simulations <xref ref-type="bibr" rid="bib1.bibx41" id="paren.11"/>, ozone distributions <xref ref-type="bibr" rid="bib1.bibx35" id="paren.12"/>, and radiative transfer <xref ref-type="bibr" rid="bib1.bibx6" id="paren.13"/>.</p>
      <p id="d1e204">Many parameterizations in ESMs, such as the convective parameterizations, can be removed at higher resolutions once the process is completely resolved. Others, however, must remain parameterized even in weather models at the 1 km scale. Cloud microphysical processes – the droplet interactions that lead to precipitation, which are lumped together under this name – fall into this category. Due to high particle counts even at small grid sizes and our incomplete understanding of processes that occur at a molecular level in clouds <xref ref-type="bibr" rid="bib1.bibx33" id="paren.14"/>, we cannot expect cloud microphysical parameterizations to become obsolete in the near future for high-resolution models.</p>
      <p id="d1e210">The parameterization of these processes suffers from a unique accuracy–speed trade-off. The most accurate droplet-based Lagrangian schemes, such as the super-droplet method <xref ref-type="bibr" rid="bib1.bibx48" id="paren.15"/>, are computationally expensive. The commonly used two-moment bulk schemes represent the complex particle size distributions by only their first two moments, i.e., the total droplet number concentration and the total water content of the hydrometeors. For modeling the droplet collisions in a warm-rain scenario, ICON uses the well-studied two-moment bulk scheme developed in <xref ref-type="bibr" rid="bib1.bibx45" id="text.16"/>. To bridge this gap and to make the use of more complex microphysical schemes feasible within operational models, a data-driven approach can be employed. Here, we present the integration of SuperdropNet <xref ref-type="bibr" rid="bib1.bibx47" id="paren.17"/>, an ML algorithm for emulating warm-rain processes in cloud microphysics, into ICON v2.6.5. SuperdropNet is trained on zero-dimensional box model super-droplet simulations from McSnow v1.1.0 <xref ref-type="bibr" rid="bib1.bibx9" id="paren.18"/>, a super-droplet-based cloud microphysics model, in a warm-rain scenario and replaces the warm-rain processes in the two-moment scheme available in ICON v2.6.5 <xref ref-type="bibr" rid="bib1.bibx44" id="paren.19"/>.</p>
      <p id="d1e229">Due to the statistical nature of ML algorithms and the complex nonlinear interactions in ESMs, hybrid systems of numerical ESMs and ML algorithms require careful validation and verification <xref ref-type="bibr" rid="bib1.bibx23 bib1.bibx12" id="paren.20"/>. Stand-alone ML algorithms are first trained on a dataset and then validated on a holdout test dataset that is not seen during training. This test set is within the distribution of the training data. When an ML algorithm is coupled with an ESM, it may encounter conditions outside of the range of the training data, and the required extrapolation could lead to instabilities <xref ref-type="bibr" rid="bib1.bibx51" id="paren.21"/>. Thus, the so-called <italic>offline</italic> performance of an ML algorithm is often not a good indicator of its <italic>online</italic> performance <xref ref-type="bibr" rid="bib1.bibx14 bib1.bibx40" id="paren.22"/>. Stability is a major concern when introducing ML emulators into ESMs. It can be improved by adapting the training procedure <xref ref-type="bibr" rid="bib1.bibx39 bib1.bibx13 bib1.bibx40 bib1.bibx11" id="paren.23"/> or by fulfilling physical constraints in the network architecture <xref ref-type="bibr" rid="bib1.bibx7 bib1.bibx51" id="paren.24"/>. Careful validation setups can help the scientific community to build trust in so-called black box ML algorithms <xref ref-type="bibr" rid="bib1.bibx31" id="paren.25"/>.</p>
      <p id="d1e257">To avoid devoting resources to the development of ML algorithms that fail in contact with reality, we encourage incorporating online testing at an early stage. ML algorithms are developed iteratively, and new versions should be tested quickly in their final place of application in the Earth system model.</p>
      <p id="d1e260">Popular software libraries for ML algorithm development, such as PyTorch <xref ref-type="bibr" rid="bib1.bibx38" id="paren.26"/>, Keras <xref ref-type="bibr" rid="bib1.bibx18" id="paren.27"/>, or TensorFlow <xref ref-type="bibr" rid="bib1.bibx1" id="paren.28"/>, are based on the Python language, whereas ICON is written in Fortran. Online testing requires either rewriting the ML emulator in Fortran or integrating the two programming languages with one another <xref ref-type="bibr" rid="bib1.bibx12" id="paren.29"/>. Since ML algorithm development is an iterative process, the former would require frequent rewrites of the ML algorithm. To save developer resources, we recommend coupling Python and Fortran, at least during the stage of algorithm development.</p>
      <p id="d1e275">In Sect. <xref ref-type="sec" rid="Ch1.S2.SS1"/>, we introduce the warm-bubble scenario, which serves as a test case for SuperdropNet. The ML algorithm itself is described in Sect. <xref ref-type="sec" rid="Ch1.S2.SS3"/>. Different strategies for integrating SuperdropNet into ICON are discussed in Sect. <xref ref-type="sec" rid="Ch1.S3"/>. The results and the impact of SuperdropNet on atmospheric processes and prognostic variables are presented in Sect. <xref ref-type="sec" rid="Ch1.S4.SS2"/>. A computational and qualitative benchmark of three different strategies is included in Sect. <xref ref-type="sec" rid="Ch1.S4.SS3"/>.</p>
</sec>
<sec id="Ch1.S2">
  <label>2</label><title>Methods</title>
<sec id="Ch1.S2.SS1">
  <label>2.1</label><title>Warm-bubble scenario</title>
      <?pagebreak page4019?><p id="d1e303">We validate SuperdropNet in the warm-bubble scenario, a test case for cloud microphysics available in ICON v2.6.5. It describes an atmospheric temperature profile with a warm air bubble at the bottom that rises vertically. The test case operates on a torus grid. The grid is a domain of <inline-formula><mml:math id="M1" display="inline"><mml:mrow><mml:mn mathvariant="normal">22</mml:mn><mml:mo>×</mml:mo><mml:mn mathvariant="normal">20</mml:mn></mml:mrow></mml:math></inline-formula> cells with periodic boundary conditions applied in the <inline-formula><mml:math id="M2" display="inline"><mml:mi>x</mml:mi></mml:math></inline-formula> and <inline-formula><mml:math id="M3" display="inline"><mml:mi>y</mml:mi></mml:math></inline-formula> directions. The horizontal resolution is 5 km, and there are 70 vertical levels in the <inline-formula><mml:math id="M4" display="inline"><mml:mi>z</mml:mi></mml:math></inline-formula> direction. The simulation time step is 20 s, with a total simulation time of 120 min. The experiment is computationally lightweight and runs on a single compute node. We test SuperdropNet in a warm atmosphere with no ice particle formation, as well as in a mixed-phase and a cold atmosphere that both allow for ice formation. All simulation parameters are summarized in Table <xref ref-type="table" rid="Ch1.T1"/>. We transport the tracers required for two-moment cloud microphysics (i.e., the first and second moment of the hydrometeor cloud water, cloud ice, rain, snow, graupel, and hail).</p>
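The scale of the experiment can be made concrete with a quick computation from the grid and time-step values stated above (the variable names below are ours, not ICON's):

```python
nx, ny, nlev = 22, 20, 70      # torus grid cells and vertical levels
dt_s = 20                      # simulation time step in seconds
t_total_s = 120 * 60           # total simulation time: 120 min

ncells = nx * ny               # horizontal grid cells on the torus
nboxes = ncells * nlev         # grid boxes the microphysics must process
nsteps = t_total_s // dt_s     # number of dynamical time steps

print(ncells, nboxes, nsteps)  # 440 30800 360
```

This modest problem size is what keeps the test case lightweight enough to run on a single compute node.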

<?xmltex \floatpos{t}?><table-wrap id="Ch1.T1" specific-use="star"><?xmltex \currentcnt{1}?><label>Table 1</label><caption><p id="d1e344">Experiment parameters for the warm-bubble, mixed-phase bubble, and cold-bubble test cases. Note that <inline-formula><mml:math id="M5" display="inline"><mml:mrow><mml:msub><mml:mi>t</mml:mi><mml:mi mathvariant="normal">dyn</mml:mi></mml:msub></mml:mrow></mml:math></inline-formula> and <inline-formula><mml:math id="M6" display="inline"><mml:mrow><mml:msub><mml:mi>t</mml:mi><mml:mrow><mml:mn mathvariant="normal">2</mml:mn><mml:mi mathvariant="normal">mom</mml:mi></mml:mrow></mml:msub></mml:mrow></mml:math></inline-formula> reflect the time step used for training SuperdropNet.</p></caption><oasis:table frame="topbot"><oasis:tgroup cols="5">
     <oasis:colspec colnum="1" colname="col1" align="left"/>
     <oasis:colspec colnum="2" colname="col2" align="left"/>
     <oasis:colspec colnum="3" colname="col3" align="left" colsep="1"/>
     <oasis:colspec colnum="4" colname="col4" align="center" colsep="1"/>
     <oasis:colspec colnum="5" colname="col5" align="left"/>
     <oasis:thead>
       <oasis:row rowsep="1">
         <oasis:entry colname="col1">Parameter</oasis:entry>
         <oasis:entry colname="col2">Description</oasis:entry>
         <oasis:entry colname="col3">Warm bubble</oasis:entry>
         <oasis:entry colname="col4">Mixed-phase bubble</oasis:entry>
         <oasis:entry colname="col5">Cold bubble</oasis:entry>
       </oasis:row>
     </oasis:thead>
     <oasis:tbody>
       <oasis:row>
         <oasis:entry colname="col1"><inline-formula><mml:math id="M7" display="inline"><mml:mrow><mml:msub><mml:mi>L</mml:mi><mml:mi mathvariant="normal">D</mml:mi></mml:msub></mml:mrow></mml:math></inline-formula></oasis:entry>
         <oasis:entry colname="col2">Torus domain length</oasis:entry>
         <oasis:entry namest="col3" nameend="col5" align="center" colsep="0">5000 m </oasis:entry>
       </oasis:row>
       <oasis:row>
         <oasis:entry colname="col1"><inline-formula><mml:math id="M8" display="inline"><mml:mrow><mml:msub><mml:mi>t</mml:mi><mml:mi mathvariant="normal">dyn</mml:mi></mml:msub></mml:mrow></mml:math></inline-formula></oasis:entry>
         <oasis:entry colname="col2">Dynamical time step</oasis:entry>
         <oasis:entry namest="col3" nameend="col5" align="center" colsep="0">20 s </oasis:entry>
       </oasis:row>
       <oasis:row>
         <oasis:entry colname="col1"><inline-formula><mml:math id="M9" display="inline"><mml:mrow><mml:msub><mml:mi>t</mml:mi><mml:mrow><mml:mn mathvariant="normal">2</mml:mn><mml:mi mathvariant="normal">mom</mml:mi></mml:mrow></mml:msub></mml:mrow></mml:math></inline-formula></oasis:entry>
         <oasis:entry colname="col2">Two-moment scheme time step</oasis:entry>
         <oasis:entry namest="col3" nameend="col5" align="center" colsep="0">20 s </oasis:entry>
       </oasis:row>
       <oasis:row>
         <oasis:entry colname="col1"><inline-formula><mml:math id="M10" display="inline"><mml:mrow><mml:msub><mml:mi>z</mml:mi><mml:mi mathvariant="normal">lev</mml:mi></mml:msub></mml:mrow></mml:math></inline-formula></oasis:entry>
         <oasis:entry colname="col2">Atmospheric levels</oasis:entry>
         <oasis:entry namest="col3" nameend="col5" align="center" colsep="0"><inline-formula><mml:math id="M11" display="inline"><mml:mn mathvariant="normal">70</mml:mn></mml:math></inline-formula></oasis:entry>
       </oasis:row>
       <oasis:row>
         <oasis:entry colname="col1"><inline-formula><mml:math id="M12" display="inline"><mml:mrow><mml:msub><mml:mi>p</mml:mi><mml:mi mathvariant="normal">srfc</mml:mi></mml:msub></mml:mrow></mml:math></inline-formula></oasis:entry>
         <oasis:entry colname="col2">Surface pressure</oasis:entry>
         <oasis:entry namest="col3" nameend="col5" align="center" colsep="0">1013.25 hPa </oasis:entry>
       </oasis:row>
       <oasis:row>
         <oasis:entry colname="col1"><inline-formula><mml:math id="M13" display="inline"><mml:mrow><mml:msub><mml:mi>T</mml:mi><mml:mn mathvariant="normal">0</mml:mn></mml:msub></mml:mrow></mml:math></inline-formula></oasis:entry>
         <oasis:entry colname="col2">Cold point of the atmosphere</oasis:entry>
         <oasis:entry colname="col3">303.15 K</oasis:entry>
         <oasis:entry colname="col4">273.15 K</oasis:entry>
         <oasis:entry colname="col5">268.15 K</oasis:entry>
       </oasis:row>
       <oasis:row>
         <oasis:entry colname="col1"><inline-formula><mml:math id="M14" display="inline"><mml:mrow><mml:msub><mml:mi mathvariant="italic">γ</mml:mi><mml:mn mathvariant="normal">0</mml:mn></mml:msub></mml:mrow></mml:math></inline-formula></oasis:entry>
         <oasis:entry colname="col2">Vertical temperature lapse rate</oasis:entry>
         <oasis:entry namest="col3" nameend="col4" align="center" colsep="1">0.006 K m<inline-formula><mml:math id="M15" display="inline"><mml:msup><mml:mi/><mml:mrow><mml:mo>-</mml:mo><mml:mn mathvariant="normal">1</mml:mn></mml:mrow></mml:msup></mml:math></inline-formula></oasis:entry>
         <oasis:entry colname="col5">0.009 K m<inline-formula><mml:math id="M16" display="inline"><mml:msup><mml:mi/><mml:mrow><mml:mo>-</mml:mo><mml:mn mathvariant="normal">1</mml:mn></mml:mrow></mml:msup></mml:math></inline-formula></oasis:entry>
       </oasis:row>
       <oasis:row>
         <oasis:entry colname="col1"><inline-formula><mml:math id="M17" display="inline"><mml:mrow><mml:msub><mml:mi>z</mml:mi><mml:mn mathvariant="normal">0</mml:mn></mml:msub></mml:mrow></mml:math></inline-formula></oasis:entry>
         <oasis:entry colname="col2">Altitude up to which <inline-formula><mml:math id="M18" display="inline"><mml:mrow><mml:msub><mml:mi mathvariant="italic">γ</mml:mi><mml:mn mathvariant="normal">0</mml:mn></mml:msub></mml:mrow></mml:math></inline-formula> applies</oasis:entry>
         <oasis:entry namest="col3" nameend="col4" align="center" colsep="1">3000 m </oasis:entry>
         <oasis:entry colname="col5">4000 m</oasis:entry>
       </oasis:row>
       <oasis:row>
         <oasis:entry colname="col1"><inline-formula><mml:math id="M19" display="inline"><mml:mrow><mml:msub><mml:mi mathvariant="italic">γ</mml:mi><mml:mn mathvariant="normal">1</mml:mn></mml:msub></mml:mrow></mml:math></inline-formula></oasis:entry>
         <oasis:entry colname="col2">Lapse rate above <inline-formula><mml:math id="M20" display="inline"><mml:mrow><mml:msub><mml:mi>z</mml:mi><mml:mn mathvariant="normal">0</mml:mn></mml:msub></mml:mrow></mml:math></inline-formula></oasis:entry>
         <oasis:entry namest="col3" nameend="col4" align="center" colsep="1">0.00001 K m<inline-formula><mml:math id="M21" display="inline"><mml:msup><mml:mi/><mml:mrow><mml:mo>-</mml:mo><mml:mn mathvariant="normal">1</mml:mn></mml:mrow></mml:msup></mml:math></inline-formula></oasis:entry>
         <oasis:entry colname="col5">0.0001 K m<inline-formula><mml:math id="M22" display="inline"><mml:msup><mml:mi/><mml:mrow><mml:mo>-</mml:mo><mml:mn mathvariant="normal">1</mml:mn></mml:mrow></mml:msup></mml:math></inline-formula></oasis:entry>
       </oasis:row>
       <oasis:row>
         <oasis:entry colname="col1"><inline-formula><mml:math id="M23" display="inline"><mml:mrow><mml:msub><mml:mi>T</mml:mi><mml:mi mathvariant="normal">perturb</mml:mi></mml:msub></mml:mrow></mml:math></inline-formula></oasis:entry>
         <oasis:entry colname="col2">Temperature perturbation</oasis:entry>
         <oasis:entry namest="col3" nameend="col4" align="center" colsep="1">10 K </oasis:entry>
         <oasis:entry colname="col5">5 K</oasis:entry>
       </oasis:row>
       <oasis:row>
         <oasis:entry colname="col1"><inline-formula><mml:math id="M24" display="inline"><mml:mrow><mml:msub><mml:mi mathvariant="italic">ϕ</mml:mi><mml:mi mathvariant="normal">bg</mml:mi></mml:msub></mml:mrow></mml:math></inline-formula></oasis:entry>
         <oasis:entry colname="col2">Background relative humidity</oasis:entry>
         <oasis:entry namest="col3" nameend="col5" align="center" colsep="0">0.7  </oasis:entry>
       </oasis:row>
       <oasis:row>
         <oasis:entry colname="col1"><inline-formula><mml:math id="M25" display="inline"><mml:mrow><mml:msub><mml:mi mathvariant="italic">ϕ</mml:mi><mml:mi mathvariant="normal">mx</mml:mi></mml:msub></mml:mrow></mml:math></inline-formula></oasis:entry>
         <oasis:entry colname="col2">Maximum relative humidity</oasis:entry>
         <oasis:entry namest="col3" nameend="col4" align="center" colsep="1">0.9  </oasis:entry>
         <oasis:entry colname="col5">0.95</oasis:entry>
       </oasis:row>
       <oasis:row>
         <oasis:entry colname="col1"><inline-formula><mml:math id="M26" display="inline"><mml:mi mathvariant="italic">ξ</mml:mi></mml:math></inline-formula></oasis:entry>
         <oasis:entry colname="col2">Half-width of temperature perturbation in <inline-formula><mml:math id="M27" display="inline"><mml:mi>x</mml:mi></mml:math></inline-formula></oasis:entry>
         <oasis:entry namest="col3" nameend="col5" align="center" colsep="0">12 500 m </oasis:entry>
       </oasis:row>
       <oasis:row>
         <oasis:entry colname="col1"><inline-formula><mml:math id="M28" display="inline"><mml:mi mathvariant="italic">ζ</mml:mi></mml:math></inline-formula></oasis:entry>
         <oasis:entry colname="col2">Half-width of temperature perturbation in <inline-formula><mml:math id="M29" display="inline"><mml:mi>z</mml:mi></mml:math></inline-formula></oasis:entry>
         <oasis:entry namest="col3" nameend="col4" align="center" colsep="1">200 m </oasis:entry>
         <oasis:entry colname="col5">250 m</oasis:entry>
       </oasis:row>
       <oasis:row>
         <oasis:entry colname="col1"><inline-formula><mml:math id="M30" display="inline"><mml:mrow><mml:msub><mml:mi>x</mml:mi><mml:mn mathvariant="normal">0</mml:mn></mml:msub></mml:mrow></mml:math></inline-formula></oasis:entry>
         <oasis:entry colname="col2">Center of temperature perturbation in <inline-formula><mml:math id="M31" display="inline"><mml:mi>x</mml:mi></mml:math></inline-formula></oasis:entry>
         <oasis:entry namest="col3" nameend="col5" align="center" colsep="0">0 m </oasis:entry>
       </oasis:row>
     </oasis:tbody>
   </oasis:tgroup></oasis:table><?xmltex \gdef\@currentlabel{1}?></table-wrap>

</sec>
<sec id="Ch1.S2.SS2">
  <label>2.2</label><title>Two-moment bulk scheme for cloud microphysics</title>
      <p id="d1e844">In our test case, a two-moment bulk scheme is employed to compute the number concentration and total mass for all hydrometeors involved. In ICON, the two-moment bulk scheme used for warm-rain cloud microphysics is based on <xref ref-type="bibr" rid="bib1.bibx45" id="text.30"/>. To account for collision–coalescence, the number concentration and total mass for both cloud and rain are determined by calculating the rates of collision–coalescence processes, including autoconversion, accretion, and self-collection. Here, autoconversion refers to the process by which cloud droplets coalesce to form rain droplets, while accretion accounts for collisions between rain and cloud droplets. Self-collection rates for cloud and rain droplets account for collisions that do not convert cloud droplets into rain. These process rates rely solely on the droplets themselves and are subsequently utilized to update the bulk moments for the following time step using a set of ordinary differential equations.</p>
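The structure of such an update can be sketched as an explicit time step in which process rates computed from the current moments are integrated forward. The rate expressions and coefficients below are schematic placeholders for illustration only, not the actual formulation of the two-moment scheme used in ICON, and number concentrations are omitted:

```python
def warm_rain_step(qc, qr, dt, k_au=1.0e-3, k_ac=2.0):
    """One explicit Euler step for cloud (qc) and rain (qr) water content.

    k_au and k_ac are illustrative rate coefficients; real bulk schemes use
    empirically derived process rates and also update number concentrations.
    """
    au = k_au * qc ** 2           # autoconversion: cloud droplets coalesce to rain
    ac = k_ac * qc * qr           # accretion: rain droplets collect cloud droplets
    dq = min((au + ac) * dt, qc)  # limiter keeps cloud water non-negative
    return qc - dq, qr + dq

qc, qr = 1.0e-3, 0.0              # kg kg^-1, illustrative initial condition
for _ in range(360):              # 360 x 20 s = 120 min, as in the test case
    qc, qr = warm_rain_step(qc, qr, dt=20.0)
```

The limiter mirrors the clipping that bulk schemes apply so that a conversion step never removes more cloud water than is available; total water `qc + qr` is conserved by construction.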
</sec>
<sec id="Ch1.S2.SS3">
  <label>2.3</label><title>SuperdropNet cloud microphysics model</title>
      <p id="d1e858">SuperdropNet is a machine learning emulator for super-droplet simulations in a warm-rain scenario. It is a neural network consisting of fully connected layers and is trained to predict updates of the bulk moments for cloud and rain over different droplet size distributions. SuperdropNet is detailed in <xref ref-type="bibr" rid="bib1.bibx47" id="text.31"/>; therefore, we will provide only a brief summary of the training procedure here.</p>
      <p id="d1e864">The super-droplet simulations used for training are generated with McSnow <xref ref-type="bibr" rid="bib1.bibx9" id="paren.32"/>. In <xref ref-type="bibr" rid="bib1.bibx9" id="text.33"/>, McSnow is used for simulating ice particles, while in <xref ref-type="bibr" rid="bib1.bibx46" id="text.34"/>, it simulates a warm-rain scenario. Similarly to <xref ref-type="bibr" rid="bib1.bibx46" id="text.35"/>, the training data for SuperdropNet are generated in a warm-rain scenario that describes only the conversion of cloud droplets into rain in a dimensionless control volume. As super-droplet simulations are stochastic in nature, we use multiple realizations of simulations to train SuperdropNet. Hence, given a set of initial conditions, SuperdropNet is fully deterministic, and the bulk moments it predicts are equivalent to averaged super-droplet simulations <xref ref-type="bibr" rid="bib1.bibx47" id="paren.36"/>. The microphysical processes accounted for are accretion, autoconversion, and self-collection of rain and cloud droplets. In ICON, the droplet collisions corresponding to warm-rain processes are treated in a separate module where the process rates for accretion, autoconversion, and self-collection of rain and cloud droplets are calculated. The parameterization scheme is localized; i.e., the process rates calculated for a grid cell depend only on the rain and cloud moments of that grid cell. Other microphysical processes and the vertical transport are accounted for in separate modules, which implies that the parameterization in ICON is structured in such a way that each individual grid point can be considered a zero-dimensional box. Thus, the parameterization setup for droplet collisions in ICON mimics the training data for SuperdropNet. This justifies the choice of using a test scenario in ICON for the online coupling and testing of SuperdropNet.</p>
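As a structural illustration of such an emulator, the sketch below maps four bulk moments to their updates with a small fully connected network. The layer sizes, weights, and input values are placeholders, not SuperdropNet's actual architecture or trained parameters:

```python
import numpy as np

rng = np.random.default_rng(0)

# Placeholder architecture: 4 bulk moments in, 4 moment updates out.
sizes = [4, 32, 32, 4]
weights = [rng.standard_normal((m, n)) * 0.1 for m, n in zip(sizes, sizes[1:])]
biases = [np.zeros(n) for n in sizes[1:]]

def forward(x):
    """Deterministic forward pass: the same input always yields the same output."""
    for w, b in zip(weights[:-1], biases[:-1]):
        x = np.maximum(x @ w + b, 0.0)       # ReLU hidden layers
    return x @ weights[-1] + biases[-1]      # linear output: moment updates

moments = np.array([1.0e-3, 1.0e8, 1.0e-4, 1.0e3])  # illustrative qc, nc, qr, nr
update = forward(moments)
```

Unlike a single stochastic super-droplet realization, the trained network's forward pass is deterministic, which is why its output can stand in for an average over realizations.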
      <p id="d1e882">Note that only the warm-rain processes are replaced with SuperdropNet. In a cold atmosphere, SuperdropNet can still be coupled with ICON, but since warm-rain processes are not relevant there, including SuperdropNet is not expected to change the experiment results.</p>
</sec>
<sec id="Ch1.S2.SS4">
  <label>2.4</label><title>ICON program flow</title>
      <p id="d1e893">To illustrate at which point of program execution ML ESM coupling becomes necessary, we show the flowchart for a single ICON time step in Fig. <xref ref-type="fig" rid="Ch1.F1"/>, focusing only on the steps relevant to our application. Starting from the general ICON time loop, where the full grid information is available, we enter the cloud microphysics parameterization. At this point, a given thread has access to one block of grid cells with block length <inline-formula><mml:math id="M32" display="inline"><mml:mi mathvariant="normal">nproma</mml:mi></mml:math></inline-formula>, and all threads work in parallel. The two-moment scheme has its own grid representation, called <italic>ik</italic> slices, where the block of grid cells is again divided by atmospheric levels. In our experiment, we simply replace the warm-rain processes with a call to SuperdropNet, which provides updated moments for cloud and rain droplets.</p>

      <?xmltex \floatpos{t}?><fig id="Ch1.F1"><?xmltex \currentcnt{1}?><?xmltex \def\figurename{Figure}?><label>Figure 1</label><caption><p id="d1e910">We replace the warm-rain processes (gray) with a call to SuperdropNet (orange). At this point, each thread has access to an <italic>ik</italic> slice, a specific representation in the cloud microphysics parameterization that corresponds to one atmospheric level for one block of grid cells.</p></caption>
          <?xmltex \igopts{width=236.157874pt}?><graphic xlink:href="https://gmd.copernicus.org/articles/17/4017/2024/gmd-17-4017-2024-f01.png"/>

        </fig>

      <?pagebreak page4020?><p id="d1e922">Since the call to the ML component is not at the grid level but operates on <italic>ik</italic> slices far down in the nested structure of the ICON program flow, we need to call SuperdropNet several times per time step – once for each block of grid cells and once for each atmospheric level. Note that saturation adjustments and evaporation are handled outside of the parts of the ICON code replaced by SuperdropNet.</p>
</sec>
</sec>
<sec id="Ch1.S3">
  <label>3</label><title>Integrating SuperdropNet in ICON</title>
      <p id="d1e937">There are several ways to integrate Python machine learning components into Fortran code <xref ref-type="bibr" rid="bib1.bibx37" id="paren.37"/>. Based on a pre-selection of suitable methods, we have implemented three strategies, so-called Fortran–Python bridges. For convenience, we add a name list to ICON that allows the selection of the coupling strategy. We perform the experiment with all three methods on the German Climate Computing Center (DKRZ) Levante system. Levante is a BullSequana XH2000 supercomputer with 3042 compute nodes using the third generation of AMD EPYC CPUs (Milan) with 128 cores per node, NVIDIA A100 GPUs, and a 130-petabyte DDN filesystem. The nodes are connected to a Mellanox InfiniBand HDR100 fabric.</p>
<sec id="Ch1.S3.SS1">
  <label>3.1</label><title>Embedding Python as a dynamic library</title>
      <p id="d1e950">Using the techniques in <xref ref-type="bibr" rid="bib1.bibx10" id="text.38"/>, we develop a dynamic library based on Python code. The library is generated using the C foreign function interface (CFFI) <xref ref-type="bibr" rid="bib1.bibx42" id="paren.39"/> and is linked to ICON at compile time. At runtime, Python code is executed from the library. Employment of CFFI results in Python and Fortran sharing their address space and hence passing memory pointers is sufficient for accessing the same data. Jobs are run in a homogeneous setting, with Python code executed on the same CPU compute node as ICON.</p>
</sec>
<sec id="Ch1.S3.SS2">
  <label>3.2</label><title>Using the coupling software YAC</title>
      <p id="d1e967">Yet Another Coupler (YAC) <xref ref-type="bibr" rid="bib1.bibx28 bib1.bibx27" id="paren.40"/> is commonly used to couple different ICON components, e.g., atmosphere, ocean, and I/O. YAC provides Python bindings so that external Python programs can be coupled with little effort with ICON.</p>
      <p id="d1e973">YAC requires a definition of fields that are to be exchanged and an exchange schedule that cannot be below the time step of ICON. For the warm-bubble scenario, we set the block length to the number of grid cells <inline-formula><mml:math id="M33" display="inline"><mml:mrow><mml:mo>(</mml:mo><mml:mn mathvariant="normal">880</mml:mn><mml:mo>)</mml:mo></mml:mrow></mml:math></inline-formula> and define two exchange fields per atmospheric level, one for the ICON-to-Python exchange and one for the reverse exchange. This yields a total of 140 fields that are exchanged at each time step. A smaller block length would require the developer to define more exchange fields such that bulk moments in each grid cell can be exchanged at every time step.</p>
      <p id="d1e988">Data transfer is building on Message Passing Interface (MPI) routines that are integrated in YAC. This offers the flexibility to use heterogeneous jobs – i.e., running ICON on CPU nodes and ML inference on GPU nodes. Due to the current limitations of the scheduling software employed in the DKRZ Levante system, it was not possible to schedule simulations that span the CPU and the GPU partition of the system. Thus, we were not able to test the performance in a heterogeneous setting. With ICON shifting to GPUs, we foresee that in the future homogeneous jobs will be run on GPU nodes.</p><?xmltex \hack{\newpage}?>
</sec>
<?pagebreak page4021?><sec id="Ch1.S3.SS3">
  <label>3.3</label><title>Pipes</title>
      <p id="d1e1000">We implemented a coupling between <inline-formula><mml:math id="M34" display="inline"><mml:mi>n</mml:mi></mml:math></inline-formula> ICON processes and 1 Python process running on the same node using FIFO (first-in–first-out) pipes. The first ICON MPI rank on the node will spawn a separate Python process that runs a worker script. Each rank also creates two pipes, one for each direction of communication (input and output to the Python worker). The worker iterates over all input pipes, performs the warm-rain calculation on data that are available, and writes results back to the corresponding ICON process via its output pipe.</p>
      <p id="d1e1010">While this solution does not incur the potential overhead of using MPI to communicate locally, it is not a full shared-memory solution exclusively relying on pointers. The corresponding extensions to ICON and the Python worker script are optimized to do as few memory copies as possible, though naturally some copying cannot be avoided when interacting with the pipes. As FIFO pipes only work on a local node, no cross-node setups are possible, such as running ICON and Python on different types of nodes (CPU, GPU). As the Python worker runs as a separate process on a dedicated core, the number of cores available to ICON is also marginally reduced by one.</p>
</sec>
<sec id="Ch1.S3.SS4">
  <label>3.4</label><title>Other methods</title>
      <p id="d1e1022">We note that the selection of methods in Sect. <xref ref-type="sec" rid="Ch1.S3.SS1"/>–<xref ref-type="sec" rid="Ch1.S3.SS3"/> by no means encompasses all the available tools, and here we summarize the alternatives to the best of our knowledge.</p>
      <p id="d1e1029">Four software libraries developed at the European Centre for Medium-Range Weather Forecasts (ECMWF; <xref ref-type="bibr" rid="bib1.bibx8" id="altparen.41"/>), the Cambridge Institute for Computing in Climate Science <xref ref-type="bibr" rid="bib1.bibx24" id="paren.42"/>, NVIDIA <xref ref-type="bibr" rid="bib1.bibx2" id="paren.43"/>, and Tongji University <xref ref-type="bibr" rid="bib1.bibx34" id="paren.44"/> address ML inference directly by exposing the TensorFlow and PyTorch APIs for Fortran, respectively. This adds the benefit of not requiring a Python runtime environment at the time of execution. Since we require flexibility to use Python code beyond ML inference and data exchange is done here via RAM comparable to the approach described in Sect. <xref ref-type="sec" rid="Ch1.S3.SS1"/>, we did not investigate these libraries further.</p>
      <p id="d1e1046">During development, we noted that integrating SmartSim <xref ref-type="bibr" rid="bib1.bibx37" id="paren.45"/> would require a rewrite of the ICON startup routine that is beyond the scope of this project. On a similar note, the coupling routines developed for the open-source Weather Research and Forecasting (WRF) model, WRF–ML, cannot easily be adjusted to work with ICON <xref ref-type="bibr" rid="bib1.bibx52" id="paren.46"/>.</p>
      <p id="d1e1055">The Fortran–Keras bridge <xref ref-type="bibr" rid="bib1.bibx36" id="paren.47"/> allows for ML inference in Fortran based on ML algorithms developed in the Keras framework. This limits flexibility, since only those network layers and functionalities supported by the library can be used. On a similar note, the implementation of the ML algorithm in neural-fortran <xref ref-type="bibr" rid="bib1.bibx20" id="paren.48"/> is contingent on the library, and the Fortran Inference-Engine <xref ref-type="bibr" rid="bib1.bibx43" id="paren.49"/> is restricted to feed-forward neural networks. We chose to forego these methods since we desire the flexibility to use any novel PyTorch development without depending on their integration into an external library.</p>
</sec>
</sec>
<sec id="Ch1.S4">
  <label>4</label><title>Results</title>
<sec id="Ch1.S4.SS1">
  <label>4.1</label><title>Experiment description</title>
      <p id="d1e1083">Using the three coupling techniques described in Sect. <xref ref-type="sec" rid="Ch1.S3.SS1"/>–<xref ref-type="sec" rid="Ch1.S3.SS3"/>, we integrate SuperdropNet in ICON. The experiment results are the same since the same network is called, but the impact on computational performance is different. We run the warm-bubble scenario and the cold-bubble scenario, both with a representation of warm-rain processes using SuperdropNet and the existing two-moment bulk scheme in the two-moment cloud microphysics module.</p>
      <p id="d1e1090">We compare the effect of replacing warm-rain processes with SuperdropNet with the experiment outcome in Sect. <xref ref-type="sec" rid="Ch1.S4.SS2"/>. In Sect. <xref ref-type="sec" rid="Ch1.S4.SS3"/>, we compare the impact on computational performance that is incurred by integrating SuperdropNet for all three coupling techniques.</p>
</sec>
<sec id="Ch1.S4.SS2">
  <label>4.2</label><title>Comparison of the two-moment bulk scheme and SuperdropNet</title>
<sec id="Ch1.S4.SS2.SSS1">
  <label>4.2.1</label><title>Rain rates</title>
      <p id="d1e1112">Figure <xref ref-type="fig" rid="Ch1.F2"/>a shows the grid-averaged rain rate in the warm-bubble scenario which is derived from warm-rain processes using ICON's two-moment bulk cloud microphysics, with a comparison to SuperdropNet microphysics. Since SuperdropNet was trained on particle-based simulations that avoid certain statistical approximations of two-moment bulk schemes, we do not expect the rain rates in both scenarios to match. Due to the experimental setup, it is not possible to identify with certainty which model produces the more accurate rain rates. We do note, however, that SuperdropNet yields physically plausible rain rates. The rain rate obtained using SuperdropNet evolves in a predictable way; that is, there is no rain at the beginning of the simulation, and then it eventually builds up to a peak and then slowly rescinds. At the end of the simulation, the rain rate is 0 for both simulations. No negative values are observed, and the coupling with SuperdropNet does not result in significant divergence from the simulation. This emphasizes that SuperdropNet is stable over longer simulation runs and overall behaves like a realistic ML-based emulator for droplet collisions. One of the key differences in the evolution of the rain rate with the two different parameterizations is that the onset of rain is slightly delayed in the case of SuperdropNet coupling, which indicates a slower conversion of cloud droplets to rain droplets.</p>

      <?xmltex \floatpos{t}?><fig id="Ch1.F2" specific-use="star"><?xmltex \currentcnt{2}?><?xmltex \def\figurename{Figure}?><label>Figure 2</label><caption><p id="d1e1119">Grid-averaged quantities for the two-moment bulk scheme and SuperdropNet for the <bold>(a)</bold> warm-bubble scenario, <bold>(b)</bold> cold-bubble scenario, and <bold>(c)</bold> mixed-phase scenario.</p></caption>
            <?xmltex \igopts{width=341.433071pt}?><graphic xlink:href="https://gmd.copernicus.org/articles/17/4017/2024/gmd-17-4017-2024-f02.png"/>

          </fig>

      <p id="d1e1137">As a sanity check, we perform the cold-bubble experiment using both the two-moment bulk scheme and SuperdropNet for the warm-rain processes. In this scenario,<?pagebreak page4022?> warm-rain processes are not relevant for the cloud microphysics, and we expect that including SuperdropNet does not affect processes with frozen particles. Figure <xref ref-type="fig" rid="Ch1.F2"/>b shows the grid-averaged snow rate.</p>
      <p id="d1e1143">Both schemes show identical snow rates, which confirms that there are no undesired side effects from coupling SuperdropNet when the conditions in the atmosphere do not allow for warm-rain processes.</p>
      <p id="d1e1146">We also perform a mixed-phase experiment with the same setup. In this scenario, both frozen and non-frozen particles occur in the atmosphere. Figure <xref ref-type="fig" rid="Ch1.F2"/>c shows the grid-averaged rain rate. The grid-averaged values for all hydrometeors are included in the Appendix. In this case, coupling to SuperdropNet significantly drops the total rain rate. Since the total water mass remains conserved in ICON, the suppression of rain formation leads to increased ice, cloud, and snow formation (Fig. <xref ref-type="fig" rid="App1.Ch1.S1.F7"/>). In ICON, the warm-rain processes are simulated before other processes such as ice nucleation, ice self-collection, and snow melting. Hence, SuperdropNet's effect on decreasing rain formation is subsequently reflected in the excess of other hydrometeors.</p>
</sec>
<sec id="Ch1.S4.SS2.SSS2">
  <label>4.2.2</label><title>Heat transport fluxes</title>
      <p id="d1e1161">Figure <xref ref-type="fig" rid="Ch1.F3"/> shows the grid-averaged evaporative fluxes as it evolves with time during the coupled warm-bubble simulation in ICON. While in the beginning both the two-moment bulk scheme and SuperdropNet produce similar fluxes, the values diverge approximately after 30 min, which corresponds to the onset of rain. This difference between the magnitude of fluxes is also reflected in the evolution of winds during the simulation. Winds are the primary source of energy transport, and Fig. <xref ref-type="fig" rid="Ch1.F4"/> shows the evolution of meridional winds in the simulation. After approximately 40 min, which roughly corresponds to the end of the first rainfall with both parameterizations, the wind patterns are markedly different for the two-moment bulk and the SuperdropNet parameterizations. The winds appear much stronger in case of the two-moment bulk parameterization across the vertical column. The reduced magnitude of winds in SuperdropNet coupling corresponds to reduced heat fluxes in Fig. <xref ref-type="fig" rid="Ch1.F3"/>.</p>
      <p id="d1e1170">Figure <xref ref-type="fig" rid="Ch1.F5"/> shows the vertical profile of specific humidity at different time steps during the simulation. For the first 40 min of the experiment, both parameterization schemes produce similar specific humidity profiles, but this changes during the later part of the simulation. Close to the surface, it can be observed that the two-moment bulk parameterization produces a stronger humidity gradient in comparison to SuperdropNet. This difference in the specific humidity gradient possibly results in a higher evaporative flux for the two-moment bulk coupling than the SuperdropNet coupled simulation.</p>

      <?xmltex \floatpos{t}?><fig id="Ch1.F3"><?xmltex \currentcnt{3}?><?xmltex \def\figurename{Figure}?><label>Figure 3</label><caption><p id="d1e1177">Grid-averaged evaporative heat fluxes for the two-moment bulk scheme used in ICON two-moment cloud microphysics and for SuperdropNet. The gray area shows the grid-averaged rain obtained using the two-moment bulk scheme (see Fig. <xref ref-type="fig" rid="Ch1.F2"/>a). High negative values indicate a larger amount of heat transfer.</p></caption>
            <?xmltex \igopts{width=236.157874pt}?><graphic xlink:href="https://gmd.copernicus.org/articles/17/4017/2024/gmd-17-4017-2024-f03.png"/>

          </fig>

      <?xmltex \floatpos{t}?><fig id="Ch1.F4" specific-use="star"><?xmltex \currentcnt{4}?><?xmltex \def\figurename{Figure}?><label>Figure 4</label><caption><p id="d1e1191">Averaged meridional winds for the two-moment bulk scheme used in ICON two-moment cloud microphysics <bold>(a)</bold> and for SuperdropNet <bold>(b)</bold>.</p></caption>
            <?xmltex \igopts{width=341.433071pt}?><graphic xlink:href="https://gmd.copernicus.org/articles/17/4017/2024/gmd-17-4017-2024-f04.png"/>

          </fig>

      <?xmltex \floatpos{t}?><fig id="Ch1.F5" specific-use="star"><?xmltex \currentcnt{5}?><?xmltex \def\figurename{Figure}?><label>Figure 5</label><caption><p id="d1e1208">Vertical profile of the specific humidity at different times for the two-moment bulk scheme and for SuperdropNet.</p></caption>
            <?xmltex \igopts{width=341.433071pt}?><graphic xlink:href="https://gmd.copernicus.org/articles/17/4017/2024/gmd-17-4017-2024-f05.png"/>

          </fig>

      <?xmltex \floatpos{t}?><fig id="Ch1.F6" specific-use="star"><?xmltex \currentcnt{6}?><?xmltex \def\figurename{Figure}?><label>Figure 6</label><caption><p id="d1e1219">Vertical profile of the rain droplet mass, calculated as the ratio of the specific rain content and the number concentration of rain droplets at different times for the two-moment bulk scheme and for SuperdropNet.</p></caption>
            <?xmltex \igopts{width=341.433071pt}?><graphic xlink:href="https://gmd.copernicus.org/articles/17/4017/2024/gmd-17-4017-2024-f06.png"/>

          </fig>

      <?pagebreak page4023?><p id="d1e1228">Similarly, in Fig. <xref ref-type="fig" rid="Ch1.F6"/>, the evolution of mean rain droplet mass (<inline-formula><mml:math id="M35" display="inline"><mml:mrow><mml:msub><mml:mover accent="true"><mml:mi>X</mml:mi><mml:mo mathvariant="normal">‾</mml:mo></mml:mover><mml:mi mathvariant="normal">r</mml:mi></mml:msub></mml:mrow></mml:math></inline-formula>) is shown. The differences in <inline-formula><mml:math id="M36" display="inline"><mml:mrow><mml:msub><mml:mover accent="true"><mml:mi>X</mml:mi><mml:mo mathvariant="normal">‾</mml:mo></mml:mover><mml:mi mathvariant="normal">r</mml:mi></mml:msub></mml:mrow></mml:math></inline-formula> close to the surface, as calculated using the two-moment bulk scheme compared to SuperdropNet, become more visible after 40 min. In general, with the two-moment bulk parameterization <inline-formula><mml:math id="M37" display="inline"><mml:mrow><mml:msub><mml:mover accent="true"><mml:mi>X</mml:mi><mml:mo mathvariant="normal">‾</mml:mo></mml:mover><mml:mi mathvariant="normal">r</mml:mi></mml:msub></mml:mrow></mml:math></inline-formula>, values are higher than those with the super-droplet parameterization close to the surface. Since the evaporative flux is proportional to the mean rain mass, higher <inline-formula><mml:math id="M38" display="inline"><mml:mrow><mml:msub><mml:mover accent="true"><mml:mi>X</mml:mi><mml:mo mathvariant="normal">‾</mml:mo></mml:mover><mml:mi mathvariant="normal">r</mml:mi></mml:msub></mml:mrow></mml:math></inline-formula> in two-moment bulk coupling results in higher heat fluxes. 
Throughout the vertical column, the SuperdropNet parameterization usually corresponds to lower <inline-formula><mml:math id="M39" display="inline"><mml:mrow><mml:msub><mml:mover accent="true"><mml:mi>X</mml:mi><mml:mo mathvariant="normal">‾</mml:mo></mml:mover><mml:mi mathvariant="normal">r</mml:mi></mml:msub></mml:mrow></mml:math></inline-formula>, except at the 40 min time step, where the high <inline-formula><mml:math id="M40" display="inline"><mml:mrow><mml:msub><mml:mover accent="true"><mml:mi>X</mml:mi><mml:mo mathvariant="normal">‾</mml:mo></mml:mover><mml:mi mathvariant="normal">r</mml:mi></mml:msub></mml:mrow></mml:math></inline-formula> value near the 3000 m height also corresponds to a higher amount of the vertically integrated rain rate, as seen in Fig. <xref ref-type="fig" rid="Ch1.F2"/>a.</p>
      <p id="d1e1320">Note that the warm-bubble scenario in ICON is highly sensitive to the tiniest fluctuations in the assumptions made for cloud microphysics parameterization. Since many other complex phenomena are simplified and the focus is only on the formation and dissipation of a single cloud, small deviations in the approximation of the cloud and rain moments lead to changes in other diagnostic variables that can accumulate over time.</p>
</sec>
</sec>
<sec id="Ch1.S4.SS3">
  <label>4.3</label><title>Computational performance upon including SuperdropNet</title>
<sec id="Ch1.S4.SS3.SSS1">
  <label>4.3.1</label><title>Benchmark</title>
      <p id="d1e1339">We ran the experiments on the Levante computing system at the German Climate Computing Center on compute nodes equipped with two AMD 7763 CPUs with a total of 128 cores and 256 GB of main memory. The nodes are connected with a Mellanox InfiniBand HDR100 fabric.</p>

<?xmltex \floatpos{t}?><table-wrap id="Ch1.T2"><?xmltex \currentcnt{2}?><label>Table 2</label><caption><p id="d1e1345">Time spent in the two-moment scheme in the ICON warm-bubble scenario, using the two-moment bulk scheme (Fortran), and SuperdropNet (PyTorch) coupled with ICON. Note that by coupling SuperdropNet with ICON, we introduce a scheme that would be computationally intractable for cloud microphysics in standard numerical simulations. A direct comparison of runtimes is therefore not possible.</p></caption><oasis:table frame="topbot"><?xmltex \begin{scaleboxenv}{.90}[.90]?><oasis:tgroup cols="4">
     <oasis:colspec colnum="1" colname="col1" align="left"/>
     <oasis:colspec colnum="2" colname="col2" align="left"/>
     <oasis:colspec colnum="3" colname="col3" align="right"/>
     <oasis:colspec colnum="4" colname="col4" align="right"/>
     <oasis:thead>
       <oasis:row rowsep="1">
         <oasis:entry colname="col1">Experiment</oasis:entry>
         <oasis:entry colname="col2"/>
         <oasis:entry colname="col3"><inline-formula><mml:math id="M41" display="inline"><mml:mrow><mml:msub><mml:mi>t</mml:mi><mml:mrow><mml:mn mathvariant="normal">2</mml:mn><mml:mi mathvariant="normal">mom</mml:mi></mml:mrow></mml:msub></mml:mrow></mml:math></inline-formula> (s)</oasis:entry>
         <oasis:entry colname="col4">Nodes</oasis:entry>
       </oasis:row>
     </oasis:thead>
     <oasis:tbody>
       <oasis:row rowsep="1">
         <oasis:entry colname="col1">Two-moment bulk scheme (Fortran)</oasis:entry>
         <oasis:entry colname="col2"/>
         <oasis:entry colname="col3">1.25</oasis:entry>
         <oasis:entry colname="col4">1</oasis:entry>
       </oasis:row>
       <oasis:row>
         <oasis:entry colname="col1"/>
         <oasis:entry colname="col2">CFFI</oasis:entry>
         <oasis:entry colname="col3">24.1</oasis:entry>
         <oasis:entry colname="col4">1</oasis:entry>
       </oasis:row>
       <oasis:row>
         <oasis:entry colname="col1">SuperdropNet (PyTorch)</oasis:entry>
         <oasis:entry colname="col2">Pipes</oasis:entry>
         <oasis:entry colname="col3">62.6</oasis:entry>
         <oasis:entry colname="col4">1</oasis:entry>
       </oasis:row>
       <oasis:row>
         <oasis:entry colname="col1"/>
         <oasis:entry colname="col2">YAC</oasis:entry>
         <oasis:entry colname="col3">49.5</oasis:entry>
         <oasis:entry colname="col4">2</oasis:entry>
       </oasis:row>
     </oasis:tbody>
   </oasis:tgroup><?xmltex \end{scaleboxenv}?></oasis:table><?xmltex \gdef\@currentlabel{2}?></table-wrap>

      <p id="d1e1452">SuperdropNet provides a significant speedup by emulating processes that would otherwise be computationally infeasible to include in ICON, but when adding a Python component to the existing highly optimized Fortran code, we expect an impact on computational performance. Table <xref ref-type="table" rid="Ch1.T2"/> summarizes the total time spent in the calculation of the two-moment scheme in the ICON warm-bubble scenario, using the two-moment bulk scheme and SuperdropNet coupled with ICON using three different coupling strategies. The fastest time-to-solution is provided by including SuperdropNet via embedded Python – i.e., the C foreign function interface (CFFI) (Sect. <xref ref-type="sec" rid="Ch1.S3.SS1"/>). Coupling SuperdropNet via YAC (Sect. <xref ref-type="sec" rid="Ch1.S3.SS2"/>) increases the relative runtime by a factor of 2 compared to embedded Python. Note that when coupling with YAC, the ICON and the Python main program run on two different computational nodes, which doubles the amount of computational resources required for the experiment. In the current configuration, YAC can only be used when the block length is equal to the grid size, which limits us to small experiments like the bubble scenarios. Coupling SuperdropNet and ICON using pipes is almost 3 times slower than embedded Python. On a qualitative note, implementing the coupling via pipes requires changes to core components of ICON beyond the cloud microphysics parameterization and may be an additional challenge for ML developers.</p>
      <p id="d1e1462">We note that coupling a super-droplet model directly with our test case in ICON is extremely challenging. ICON represents the warm-rain processes as bulk moments, while McSnow represents them as droplet distributions. For an ideal benchmark simulation, we would need to completely overhaul the current representation of cloud microphysics processes in ICON and represent them as super-droplets for a two-way coupling. At the time of conducting this research, ICON did not allow for the representation of cloud microphysical processes as super-droplets, mainly because doing so would be computationally expensive. This is an active area of research, but as of now, it remains a work in progress, which makes SuperdropNet a cheaper data-driven alternative to the super-droplet simulations.</p>
</sec>
<sec id="Ch1.S4.SS3.SSS2">
  <label>4.3.2</label><title>Detailed evaluation of coupling with embedded Python</title>
      <?pagebreak page4024?><p id="d1e1473">We now turn to the fastest coupling scheme, embedded Python, and investigate the contribution of the individual steps to the total runtime. By including SuperdropNet, we incur the computational cost of data exchange and of machine learning inference. Table <xref ref-type="table" rid="Ch1.T3"/> summarizes the contribution of the individual parts, measured with a block length of <inline-formula><mml:math id="M42" display="inline"><mml:mrow><mml:mi mathvariant="normal">nproma</mml:mi><mml:mo>=</mml:mo><mml:mn mathvariant="normal">44</mml:mn></mml:mrow></mml:math></inline-formula> grid cells using the ICON timer module. ICON averages the execution time across a total of <inline-formula><mml:math id="M43" display="inline"><mml:mrow><mml:mn mathvariant="normal">496</mml:mn><mml:mspace linebreak="nobreak" width="0.125em"/><mml:mn mathvariant="normal">800</mml:mn></mml:mrow></mml:math></inline-formula> calls to SuperdropNet. Most of those times can be attributed to model inference, while the actual data transfer is less significant. This could be attributed to the fact that ML inference has to be done on a CPU. On a node equipped with an NVIDIA A100 GPU, we measure an inference time of 267 <inline-formula><mml:math id="M44" display="inline"><mml:mrow class="unit"><mml:mi mathvariant="normal">µ</mml:mi></mml:mrow></mml:math></inline-formula>s. This corresponds to 33 % of the inference time reported on a CPU (see Table <xref ref-type="table" rid="Ch1.T3"/>).</p>

<?xmltex \floatpos{t}?><table-wrap id="Ch1.T3" specific-use="star"><?xmltex \currentcnt{3}?><label>Table 3</label><caption><p id="d1e1514">Processes when coupling SuperdropNet to ICON via embedded Python and their associated duration. Machine learning inference is executed on a CPU node of the Levante computing system at the German Climate Computing Center (DKRZ).</p></caption><oasis:table frame="topbot"><oasis:tgroup cols="3">
     <oasis:colspec colnum="1" colname="col1" align="left"/>
     <oasis:colspec colnum="2" colname="col2" align="right"/>
     <oasis:colspec colnum="3" colname="col3" align="right"/>
     <oasis:thead>
       <oasis:row rowsep="1">
         <oasis:entry colname="col1">Process</oasis:entry>
         <oasis:entry colname="col2">Time (<inline-formula><mml:math id="M45" display="inline"><mml:mrow class="unit"><mml:mi mathvariant="normal">µ</mml:mi></mml:mrow></mml:math></inline-formula>s)</oasis:entry>
         <oasis:entry colname="col3">Fraction</oasis:entry>
       </oasis:row>
     </oasis:thead>
     <oasis:tbody>
       <oasis:row>
         <oasis:entry colname="col1">Time reported by ICON</oasis:entry>
         <oasis:entry colname="col2"><inline-formula><mml:math id="M46" display="inline"><mml:mrow><mml:mn mathvariant="normal">5.0</mml:mn><mml:mo>×</mml:mo><mml:msup><mml:mn mathvariant="normal">10</mml:mn><mml:mn mathvariant="normal">2</mml:mn></mml:msup></mml:mrow></mml:math></inline-formula></oasis:entry>
         <oasis:entry colname="col3">100 %</oasis:entry>
       </oasis:row>
       <oasis:row>
         <oasis:entry colname="col1">Time reported by Python</oasis:entry>
         <oasis:entry colname="col2"><inline-formula><mml:math id="M47" display="inline"><mml:mrow><mml:mn mathvariant="normal">4.8</mml:mn><mml:mo>×</mml:mo><mml:msup><mml:mn mathvariant="normal">10</mml:mn><mml:mn mathvariant="normal">2</mml:mn></mml:msup></mml:mrow></mml:math></inline-formula></oasis:entry>
         <oasis:entry colname="col3">96 %</oasis:entry>
       </oasis:row>
       <oasis:row>
         <oasis:entry colname="col1"><inline-formula><mml:math id="M48" display="inline"><mml:mo>↪</mml:mo></mml:math></inline-formula> of which time reported for inference</oasis:entry>
         <oasis:entry colname="col2"><inline-formula><mml:math id="M49" display="inline"><mml:mrow><mml:mn mathvariant="normal">4.4</mml:mn><mml:mo>×</mml:mo><mml:msup><mml:mn mathvariant="normal">10</mml:mn><mml:mn mathvariant="normal">2</mml:mn></mml:msup></mml:mrow></mml:math></inline-formula></oasis:entry>
         <oasis:entry colname="col3">87 %</oasis:entry>
       </oasis:row>
       <oasis:row>
         <oasis:entry colname="col1"><inline-formula><mml:math id="M50" display="inline"><mml:mo>↪</mml:mo></mml:math></inline-formula> of which time reported for data transfer</oasis:entry>
         <oasis:entry colname="col2"><inline-formula><mml:math id="M51" display="inline"><mml:mrow><mml:mn mathvariant="normal">4.2</mml:mn><mml:mo>×</mml:mo><mml:msup><mml:mn mathvariant="normal">10</mml:mn><mml:mn mathvariant="normal">1</mml:mn></mml:msup></mml:mrow></mml:math></inline-formula></oasis:entry>
         <oasis:entry colname="col3">8.5 %</oasis:entry>
       </oasis:row>
     </oasis:tbody>
   </oasis:tgroup></oasis:table><?xmltex \gdef\@currentlabel{3}?></table-wrap>

      <p id="d1e1666">Note, however, that a heterogeneous setup, where moments are transferred to and from the GPU nodes via the Mellanox InfiniBand network, would likely lead to a larger overall wall time. Given the successful efforts to port ICON to GPU, a future experiment could be run exclusively on GPUs. By only applying SuperdropNet when at least one input moment is non-zero, we are already reducing the number of calls to the ML inference to improve performance.</p>
</sec>
</sec>
</sec>
<sec id="Ch1.S5" sec-type="conclusions">
  <label>5</label><title>Conclusions</title>
      <p id="d1e1680">We have coupled SuperdropNet, a machine learning algorithm emulating warm-rain processes in a two-moment cloud microphysics scheme, with ICON. In the warm-bubble experiment, the ML emulator is stable, and the results are physically sound.</p>
      <p id="d1e1683">The strategies to connect ICON and Python provide flexibility to the development of the ML component and account for the fact that ML development is done iteratively. Both embedded Python and YAC can be integrated, with little programming overhead, into ICON. For a later ML emulator, which replaces a full parameterization at the grid level, YAC can be used regardless of the block length. Coupling via pipes is comparatively slow and does not scale well. Since it requires an extensive rewrite of the core components of ICON, we do not recommend it for implementation. Out of the three coupling strategies we tested, embedded Python provided<?pagebreak page4025?> the fastest performance. It can be used independently of the ICON grid to execute any Python code at any level of the ICON time loop.</p>
      <p id="d1e1686">We note that by coupling SuperdropNet to ICON, we introduce a scheme that would otherwise be computationally intractable for cloud microphysics in standard numerical simulations. A direct comparison of runtimes is therefore not possible. Note, however, that integrating a Python component will slow down the overall time to solution due to the incurred cost in network inference and data transfer. For applications that are more demanding than our warm-bubble scenario test case and if the ML component is thoroughly tested, a reimplementation in Fortran would likely increase performance at the expense of losing the flexibility in development.</p>
      <p id="d1e1689">A natural extension of this work is to more complex modeling scenarios. This would involve training machine-learning-based emulators for other cloud microphysical processes and/or introducing hydrometeors other than clouds and rain. Apart from droplet collisions, processes such as droplet sedimentation and deep convection can be challenging to represent with two-moment bulk parameterization schemes. Hence, in the future, we want to explore ML-based proxies for these processes, while continuing to use hybrid ML ESMs for continuous online testing.</p><?xmltex \hack{\clearpage}?>
</sec>

      
      </body>
    <back><app-group>

<?pagebreak page4026?><app id="App1.Ch1.S1">
  <?xmltex \currentcnt{A}?><label>Appendix A</label><title>Evaluation of SuperdropNet</title>
<sec id="App1.Ch1.S1.SS1">
  <label>A1</label><title>Mixed-phase bubble</title>
      <p id="d1e1711">We include the grid-averaged cloud ice, cloud water, graupel, snow, and rain for the mixed-phase experiment described in Sect. <xref ref-type="sec" rid="Ch1.S4.SS2.SSS1"/>. The results are shown in Fig. <xref ref-type="fig" rid="App1.Ch1.S1.F7"/>.</p>

      <?xmltex \floatpos{h!}?><fig id="App1.Ch1.S1.F7"><?xmltex \currentcnt{A1}?><?xmltex \def\figurename{Figure}?><label>Figure A1</label><caption><p id="d1e1720">Grid-averaged quantities for the two-moment bulk scheme and SuperdropNet under <bold>(a)</bold> warm-bubble scenario, <bold>(b)</bold> cold-bubble scenario, and <bold>(c)</bold> mixed-phase scenario.</p></caption>
          <?xmltex \hack{\hsize\textwidth}?>
          <?xmltex \igopts{width=341.433071pt}?><graphic xlink:href="https://gmd.copernicus.org/articles/17/4017/2024/gmd-17-4017-2024-f07.png"/>

        </fig>

</sec>
</app>
  </app-group><notes notes-type="codedataavailability"><title>Code and data availability</title>

      <p id="d1e1745">The SuperdropNet (version 0.1.0) inference code, trained model weights, modules describing the coupling between SuperdropNet inference and generic Fortran code, analysis scripts, and Jupyter notebooks, as well as the experiment description files are available under the MIT license at <uri>https://doi.org/10.5281/zenodo.10069121</uri> <xref ref-type="bibr" rid="bib1.bibx3" id="paren.50"/>. The license file is included in the repository.</p>

      <p id="d1e1754">The ICON model code used for the simulations in this paper is available at <uri>https://doi.org/10.5281/zenodo.8348256</uri> <xref ref-type="bibr" rid="bib1.bibx4" id="paren.51"/>. It is based on the ICON release 2.6.5 and includes additional code for coupling SuperdropNet. ICON is now publicly available under the BSD-3-Clause license at <uri>https://www.icon-model.org</uri> (last access: 6 September 2023).</p>

      <p id="d1e1766">The experiment results obtained with SuperdropNet (version 0.1.0) coupled with ICON (version 2.6.5) are available at <uri>https://doi.org/10.5281/zenodo.8348266</uri> <xref ref-type="bibr" rid="bib1.bibx5" id="paren.52"/>. We used McSnow (version 1.1.0) to generate the training data in a warm-rain scenario. McSnow is not publicly available; access can be granted by its developers upon agreement to the ICON licensing terms <xref ref-type="bibr" rid="bib1.bibx9" id="paren.53"/>.</p>
  </notes><?xmltex \hack{\newpage}?><?xmltex \hack{\vspace*{12.9cm}}?><notes notes-type="authorcontribution"><title>Author contributions</title>

      <p id="d1e1783">CA developed the embedded Python and YAC coupling software, performed the experiments, provided the visualizations, and led the writing of the paper as a whole. SS developed SuperdropNet. CA and SS defined and evaluated the experiments, curated the software and data, and wrote the original draft. TW developed the pipe coupling software and contributed to the original draft. DG helped conceive the project and define the coupling task and contributed to the original draft. CA, SS, TW, and DG reviewed and edited the final paper.</p>
  </notes><notes notes-type="competinginterests"><title>Competing interests</title>

      <p id="d1e1789">The contact author has declared that none of the authors has any competing interests.</p>
  </notes><notes notes-type="disclaimer"><title>Disclaimer</title>

      <p id="d1e1795">Publisher’s note: Copernicus Publications remains neutral with regard to jurisdictional claims made in the text, published maps, institutional affiliations, or any other geographical representation in this paper. While Copernicus Publications makes every effort to include appropriate place names, the final responsibility lies with the authors.</p>
  </notes><ack><title>Acknowledgements</title><p id="d1e1801">We thank Ann Kristin Naumann and Sebastian Rast for support with the ICON warm-bubble scenario and helpful discussions, and we thank Nils-Arne Dreier and Moritz Hanke for support with integrating YAC. This work was supported by the Helmholtz Association's Initiative and Networking Fund through Helmholtz AI (grant no. ZT-I-PF-5-01). This work used resources of the Deutsches Klimarechenzentrum (DKRZ) granted by its Scientific Steering Committee (WLA) under project ID AIM.</p></ack><notes notes-type="financialsupport"><title>Financial support</title>

      <p id="d1e1806">The article processing charges for this open-access publication were covered by the Helmholtz-Zentrum Hereon.</p>
  </notes><notes notes-type="reviewstatement"><title>Review statement</title>

      <p id="d1e1813">This paper was edited by Sylwester Arabas and reviewed by Paul Bowen and one anonymous referee.</p>
  </notes><ref-list>
    <title>References</title>

      <ref id="bib1.bibx1"><?xmltex \def\ref@label{{Abadi et~al.(2016)Abadi, Barham, Chen, Chen, Davis, Dean, Devin,
Ghemawat, Irving, Isard et~al.}}?><label>Abadi et al.(2016)Abadi, Barham, Chen, Chen, Davis, Dean, Devin, Ghemawat, Irving, Isard et al.</label><?label abadi16?><mixed-citation>Abadi, M., Barham, P., Chen, J., Chen, Z., Davis, A., Dean, J., Devin, M., Ghemawat, S., Irving, G., Isard, M., Jia, Y., Jozefowicz, R., Kaiser, L., Kudlur, M., Levenberg, J., Mane, D., Monga, R., Moore, S., Murray, D., Olah, C., Schuster, M., Shlens, J., Steiner, B., Sutskever, I., Talwar, K., Tucker, P., Vanhoucke, V., Vasudevan, V., Viegas, F., Vinyals, O., Warden, P., Wattenberg, M., Wicke, M., Yu, Y., and Zheng, X.: Tensorflow: A system for large-scale machine learning, in: 12th USENIX Symposium on Operating Systems Design and Implementation (OSDI 16), Savannah, GA, US, 2–4 November 2016, 265–283, <uri>https://arxiv.org/abs/1603.04467</uri> (last access: 6 September 2023), 2016.</mixed-citation></ref>
      <ref id="bib1.bibx2"><?xmltex \def\ref@label{{{Alexeev, D.}(2023)}}?><label>Alexeev, D.(2023)</label><?label alexeev_git?><mixed-citation>Alexeev, D.: PyTorch bindings for Fortran (v0.4), GitHub [code], <uri>https://github.com/alexeedm/pytorch-fortran</uri> (last access: 6 September 2023), 2023.</mixed-citation></ref>
      <ref id="bib1.bibx3"><?xmltex \def\ref@label{{Arnold et~al.(2023a)}}?><label>Arnold et al.(2023a)</label><?label ArnoldCode2023?><mixed-citation>Arnold, C., Sharma, S., and Weigel, T.:  DKRZ-AIM/dkrz-hereon-icon-superdropnet: Integrating SuperdropNet (v0.1.0), Zenodo [code], <ext-link xlink:href="https://doi.org/10.5281/zenodo.10069121" ext-link-type="DOI">10.5281/zenodo.10069121</ext-link>, 2023a.</mixed-citation></ref>
      <ref id="bib1.bibx4"><?xmltex \def\ref@label{{Arnold et~al.(2023b)}}?><label>Arnold et al.(2023b)</label><?label ArnoldCode2023b?><mixed-citation>Arnold, C., Sharma, S., and Weigel, T.: ICON Code v 2.6.5 including coupling schemes for integrating SuperdropNet, Zenodo [code], <ext-link xlink:href="https://doi.org/10.5281/zenodo.8348256" ext-link-type="DOI">10.5281/zenodo.8348256</ext-link>, 2023b.</mixed-citation></ref>
      <ref id="bib1.bibx5"><?xmltex \def\ref@label{{Arnold et~al.(2023c)}}?><label>Arnold et al.(2023c)</label><?label ArnoldCode2023c?><mixed-citation>Arnold, C., Sharma, S., and Weigel, T.: Data set for: Efficient and Stable Coupling of the SuperdropNet Deep Learning-based Cloud Microphysics (v0.1.0) to the ICON Climate and Weather Model (v2.6.5), Zenodo [data set], <ext-link xlink:href="https://doi.org/10.5281/zenodo.8348266" ext-link-type="DOI">10.5281/zenodo.8348266</ext-link>, 2023c.</mixed-citation></ref>
      <ref id="bib1.bibx6"><?xmltex \def\ref@label{{Belochitski and Krasnopolsky(2021)}}?><label>Belochitski and Krasnopolsky(2021)</label><?label belochitski21?><mixed-citation>Belochitski, A. and Krasnopolsky, V.: Robustness of neural network emulations of radiative transfer parameterizations in a state-of-the-art general circulation model, Geosci. Model Dev., 14, 7425–7437, <ext-link xlink:href="https://doi.org/10.5194/gmd-14-7425-2021" ext-link-type="DOI">10.5194/gmd-14-7425-2021</ext-link>, 2021.</mixed-citation></ref>
      <ref id="bib1.bibx7"><?xmltex \def\ref@label{{Beucler et~al.(2021)Beucler, Pritchard, Rasp, Ott, Baldi, and
Gentine}}?><label>Beucler et al.(2021)Beucler, Pritchard, Rasp, Ott, Baldi, and Gentine</label><?label beucler21?><mixed-citation>Beucler, T., Pritchard, M., Rasp, S., Ott, J., Baldi, P., and Gentine, P.: Enforcing Analytic Constraints in Neural Networks Emulating Physical Systems, Phys. Rev. Lett., 126, 098302, <ext-link xlink:href="https://doi.org/10.1103/PhysRevLett.126.098302" ext-link-type="DOI">10.1103/PhysRevLett.126.098302</ext-link>, 2021.</mixed-citation></ref>
      <ref id="bib1.bibx8"><?xmltex \def\ref@label{{Bonanni et~al.(2022)Bonanni, Hawkes, and Quintino}}?><label>Bonanni et al.(2022)Bonanni, Hawkes, and Quintino</label><?label ecmwf_infero_git?><mixed-citation>Bonanni, A., Hawkes, J., and Quintino, T.: infero: A lower-level API for Machine Learning inference in operations (version 0.2.0), GitHub [code], <uri>https://github.com/ecmwf/infero</uri> (last access: 6 September 2023), 2022.</mixed-citation></ref>
      <ref id="bib1.bibx9"><?xmltex \def\ref@label{{Brdar and Seifert(2018)}}?><label>Brdar and Seifert(2018)</label><?label brdar_mcsnow_2018?><mixed-citation>Brdar, S. and Seifert, A.: McSnow: A Monte-Carlo Particle Model for Riming and Aggregation of Ice Particles in a Multidimensional Microphysical Phase Space, J. Adv. Model. Earth Sy., 10, 187–206, <ext-link xlink:href="https://doi.org/10.1002/2017MS001167" ext-link-type="DOI">10.1002/2017MS001167</ext-link>, 2018.</mixed-citation></ref>
      <ref id="bib1.bibx10"><?xmltex \def\ref@label{{{Brenowitz}(2023)}}?><label>Brenowitz(2023)</label><?label brenowitz_git?><mixed-citation>Brenowitz, N.: Call Python from Fortran (v0.2.1), Zenodo [code], <ext-link xlink:href="https://doi.org/10.5281/zenodo.7779572" ext-link-type="DOI">10.5281/zenodo.7779572</ext-link>, 2023.</mixed-citation></ref>
      <ref id="bib1.bibx11"><?xmltex \def\ref@label{{Brenowitz and Bretherton(2018)}}?><label>Brenowitz and Bretherton(2018)</label><?label brenowitz18?><mixed-citation>Brenowitz, N. D. and Bretherton, C. S.: Prognostic Validation of a Neural Network Unified Physics Parameterization, Geophys. Res. Lett., 45, 6289–6298, <ext-link xlink:href="https://doi.org/10.1029/2018GL078510" ext-link-type="DOI">10.1029/2018GL078510</ext-link>, 2018.</mixed-citation></ref>
      <ref id="bib1.bibx12"><?xmltex \def\ref@label{{Brenowitz and Bretherton(2019)}}?><label>Brenowitz and Bretherton(2019)</label><?label brenowitz2019?><mixed-citation>Brenowitz, N. D. and Bretherton, C. S.: Spatially Extended Tests of a Neural Network Parametrization Trained by Coarse-Graining, J. Adv. Model. Earth Sy., 11, 2728–2744, <ext-link xlink:href="https://doi.org/10.1029/2019MS001711" ext-link-type="DOI">10.1029/2019MS001711</ext-link>, 2019.</mixed-citation></ref>
      <ref id="bib1.bibx13"><?xmltex \def\ref@label{{Brenowitz et~al.(2020{\natexlab{a}})Brenowitz, Beucler, Pritchard,
and Bretherton}}?><label>Brenowitz et al.(2020a)Brenowitz, Beucler, Pritchard, and Bretherton</label><?label brenowitz20?><mixed-citation>Brenowitz, N. D., Beucler, T., Pritchard, M., and Bretherton, C. S.: Interpreting and Stabilizing Machine-Learning Parametrizations of Convection, J. Atmos. Sci., 77, 4357–4375, <ext-link xlink:href="https://doi.org/10.1175/JAS-D-20-0082.1" ext-link-type="DOI">10.1175/JAS-D-20-0082.1</ext-link>, 2020a.</mixed-citation></ref>
      <ref id="bib1.bibx14"><?xmltex \def\ref@label{{Brenowitz et~al.(2020{\natexlab{b}})Brenowitz, Henn, McGibbon, Clark,
Kwa, Perkins, Watt-Meyer, and Bretherton}}?><label>Brenowitz et al.(2020b)Brenowitz, Henn, McGibbon, Clark, Kwa, Perkins, Watt-Meyer, and Bretherton</label><?label brenowitz2020arxiv?><mixed-citation>Brenowitz, N. D., Henn, B., McGibbon, J., Clark, S. K., Kwa, A., Perkins, W. A., Watt-Meyer, O., and Bretherton, C. S.: Machine Learning Climate Model Dynamics: Offline versus Online Performance, arXiv [preprint], <ext-link xlink:href="https://doi.org/10.48550/arXiv.2011.03081" ext-link-type="DOI">10.48550/arXiv.2011.03081</ext-link>, 2020b.</mixed-citation></ref>
      <ref id="bib1.bibx15"><?xmltex \def\ref@label{{Brenowitz et~al.(2022)Brenowitz, Perkins, Nugent, Watt-Meyer, Clark,
Kwa, Henn, McGibbon, and Bretherton}}?><label>Brenowitz et al.(2022)Brenowitz, Perkins, Nugent, Watt-Meyer, Clark, Kwa, Henn, McGibbon, and Bretherton</label><?label brenowitz22?><mixed-citation>Brenowitz, N. D., Perkins, W. A., Nugent, J. M., Watt-Meyer, O., Clark, S. K., Kwa, A., Henn, B., McGibbon, J., and Bretherton, C. S.: Emulating Fast Processes in Climate Models, in: Machine Learning for the Physical Sciences, NEURIPS Workshop, arXiv [preprint],  <ext-link xlink:href="https://doi.org/10.48550/arXiv.2211.10774" ext-link-type="DOI">10.48550/arXiv.2211.10774</ext-link>, 2022.</mixed-citation></ref>
      <ref id="bib1.bibx16"><?xmltex \def\ref@label{{Chantry et~al.(2021)Chantry, Hatfield, Dueben, Polichtchouk, and
Palmer}}?><label>Chantry et al.(2021)Chantry, Hatfield, Dueben, Polichtchouk, and Palmer</label><?label chantry21?><mixed-citation>Chantry, M., Hatfield, S., Dueben, P., Polichtchouk, I., and Palmer, T.: Machine learning emulation of gravity wave drag in numerical weather forecasting, J. Adv. Model. Earth Sy., 13, e2021MS002477, <ext-link xlink:href="https://doi.org/10.1029/2021MS002477" ext-link-type="DOI">10.1029/2021MS002477</ext-link>, 2021.</mixed-citation></ref>
      <ref id="bib1.bibx17"><?xmltex \def\ref@label{{Chevallier et~al.(2000)Chevallier, Morcrette, Ch{\'{e}}ruy, and
Scott}}?><label>Chevallier et al.(2000)Chevallier, Morcrette, Chéruy, and Scott</label><?label chevallier2000use?><mixed-citation>Chevallier, F., Morcrette, J.-J., Chéruy, F., and Scott, N.: Use of a neural-network-based long-wave radiative-transfer scheme in the ECMWF atmospheric model, Q. J. Roy. Meteor. Soc., 126, 761–776, <ext-link xlink:href="https://doi.org/10.1002/qj.49712656318" ext-link-type="DOI">10.1002/qj.49712656318</ext-link>, 2000.</mixed-citation></ref>
      <ref id="bib1.bibx18"><?xmltex \def\ref@label{{Chollet et~al.(2023)}}?><label>Chollet et al.(2023)</label><?label keras_github?><mixed-citation>Chollet, F. et al.: Keras (v2.14.0), GitHub [code], <uri>https://github.com/fchollet/keras</uri>, last access: 6 September 2023.</mixed-citation></ref>
      <ref id="bib1.bibx19"><?xmltex \def\ref@label{{Christensen and Zanna(2022)}}?><label>Christensen and Zanna(2022)</label><?label christensen22?><mixed-citation>Christensen, H. and Zanna, L.: Parametrization in Weather and Climate Models, in: Oxford Research Encyclopedia of Climate Science, Oxford University Press, ISBN 978-0-19-022862-0, <ext-link xlink:href="https://doi.org/10.1093/acrefore/9780190228620.013.826" ext-link-type="DOI">10.1093/acrefore/9780190228620.013.826</ext-link>, 2022.</mixed-citation></ref>
      <ref id="bib1.bibx20"><?xmltex \def\ref@label{{Curcic(2019)}}?><label>Curcic(2019)</label><?label curcic2019?><mixed-citation>Curcic, M.: A parallel Fortran framework for neural networks and deep learning, arXiv [preprint], <ext-link xlink:href="https://doi.org/10.48550/arXiv.1902.06714" ext-link-type="DOI">10.48550/arXiv.1902.06714</ext-link>, 2019.</mixed-citation></ref>
      <ref id="bib1.bibx21"><?xmltex \def\ref@label{{Dong et~al.(2023)Dong, Fritts, Liu, Lund, Liu, and Snively}}?><label>Dong et al.(2023)Dong, Fritts, Liu, Lund, Liu, and Snively</label><?label dong23?><mixed-citation>Dong, W., Fritts, D. C., Liu, A. Z., Lund, T. S., Liu, H.-L., and Snively, J.: Accelerating Atmospheric Gravity Wave Simulations Using Machine Learning: Kelvin-Helmholtz Instability and Mountain Wave Sources Driving Gravity Wave Breaking and Secondary Gravity Wave Generation, Geophys. Res. Lett., 50, e2023GL104668, <ext-link xlink:href="https://doi.org/10.1029/2023GL104668" ext-link-type="DOI">10.1029/2023GL104668</ext-link>, 2023.</mixed-citation></ref>
      <ref id="bib1.bibx22"><?xmltex \def\ref@label{{Dueben et~al.(2021)Dueben, Modigliani, Geer, Siemen, Pappenberger,
Bauer, Brown, Palkovic, Raoult, Wedi, and Baousis}}?><label>Dueben et al.(2021)Dueben, Modigliani, Geer, Siemen, Pappenberger, Bauer, Brown, Palkovic, Raoult, Wedi, and Baousis</label><?label dueben21?><mixed-citation>Dueben, P., Modigliani, U., Geer, A., Siemen, S., Pappenberger, F., Bauer, P., Brown, A., Palkovic, M., Raoult, B., Wedi, N., and Baousis, V.: Machine learning at ECMWF: A roadmap for the next 10 years, ECMWF Technical Memoranda, ECMWF, <ext-link xlink:href="https://doi.org/10.21957/ge7ckgm" ext-link-type="DOI">10.21957/ge7ckgm</ext-link>, 2021.</mixed-citation></ref>
      <ref id="bib1.bibx23"><?xmltex \def\ref@label{{Dueben et~al.(2022)Dueben, Schultz, Chantry, Gagne, Hall, and
McGovern}}?><label>Dueben et al.(2022)Dueben, Schultz, Chantry, Gagne, Hall, and McGovern</label><?label dueben22?><mixed-citation>Dueben, P. D., Schultz, M.<?pagebreak page4028?> G., Chantry, M., Gagne, D. J., Hall, D. M., and McGovern, A.: Challenges and Benchmark Datasets for Machine Learning in the Atmospheric Sciences: Definition, Status, and Outlook, Artificial Intelligence for the Earth Systems, 1, e210002, <ext-link xlink:href="https://doi.org/10.1175/AIES-D-21-0002.1" ext-link-type="DOI">10.1175/AIES-D-21-0002.1</ext-link>, 2022.</mixed-citation></ref>
      <ref id="bib1.bibx24"><?xmltex \def\ref@label{{Elafrou et~al.(2023)Elafrou, Orchard, and
Cliffard}}?><label>Elafrou et al.(2023)Elafrou, Orchard, and Cliffard</label><?label cambridge_icss_git?><mixed-citation>Elafrou, A., Orchard, D., and Cliffard, S.: fortran-pytorch-lib (commit: ffe833b66a6e1ce1c6cf023708d1f351a3a11f8b), GitHub [code], <uri>https://github.com/Cambridge-ICCS/fortran-pytorch-lib</uri>, last access: 6 September 2023.</mixed-citation></ref>
      <ref id="bib1.bibx25"><?xmltex \def\ref@label{{Gentine et~al.(2018)Gentine, Pritchard, Rasp, Reinaudi, and
Yacalis}}?><label>Gentine et al.(2018)Gentine, Pritchard, Rasp, Reinaudi, and Yacalis</label><?label gentine18?><mixed-citation>Gentine, P., Pritchard, M., Rasp, S., Reinaudi, G., and Yacalis, G.: Could Machine Learning Break the Convection Parameterization Deadlock?, Geophys. Res. Lett., 45, 5742–5751, <ext-link xlink:href="https://doi.org/10.1029/2018GL078202" ext-link-type="DOI">10.1029/2018GL078202</ext-link>, 2018.</mixed-citation></ref>
      <ref id="bib1.bibx26"><?xmltex \def\ref@label{{Grundner et~al.(2022)Grundner, Beucler, Gentine, Iglesias-Suarez,
Giorgetta, and Eyring}}?><label>Grundner et al.(2022)Grundner, Beucler, Gentine, Iglesias-Suarez, Giorgetta, and Eyring</label><?label grundner22?><mixed-citation>Grundner, A., Beucler, T., Gentine, P., Iglesias-Suarez, F., Giorgetta, M. A., and Eyring, V.: Deep Learning Based Cloud Cover Parameterization for ICON, J. Adv. Model. Earth Sy., 14, e2021MS002959, <ext-link xlink:href="https://doi.org/10.1029/2021MS002959" ext-link-type="DOI">10.1029/2021MS002959</ext-link>, 2022.</mixed-citation></ref>
      <ref id="bib1.bibx27"><?xmltex \def\ref@label{{Hanke et~al.(2016)Hanke, Redler, Holfeld, and Yastremsky}}?><label>Hanke et al.(2016)Hanke, Redler, Holfeld, and Yastremsky</label><?label hanke16?><mixed-citation>Hanke, M., Redler, R., Holfeld, T., and Yastremsky, M.: YAC 1.2.0: new aspects for coupling software in Earth system modelling, Geosci. Model Dev., 9, 2755–2769, <ext-link xlink:href="https://doi.org/10.5194/gmd-9-2755-2016" ext-link-type="DOI">10.5194/gmd-9-2755-2016</ext-link>, 2016.</mixed-citation></ref>
      <ref id="bib1.bibx28"><?xmltex \def\ref@label{{Hanke et~al.(2023)Hanke, Dreier, and Redler}}?><label>Hanke et al.(2023)Hanke, Dreier, and Redler</label><?label yac_software?><mixed-citation>Hanke, M., Dreier, N.-A., and Redler, R.: YetAnotherCoupler (YAC) (version 2.6.1) [code], <uri>https://dkrz-sw.gitlab-pages.dkrz.de/yac/</uri>, last access: 6 September 2023.</mixed-citation></ref>
      <ref id="bib1.bibx29"><?xmltex \def\ref@label{{Irrgang et~al.(2021)Irrgang, Boers, Sonnewald, Barnes, Kadow,
Staneva, and Saynisch-Wagner}}?><label>Irrgang et al.(2021)Irrgang, Boers, Sonnewald, Barnes, Kadow, Staneva, and Saynisch-Wagner</label><?label irrgang21?><mixed-citation>Irrgang, C., Boers, N., Sonnewald, M., Barnes, E. A., Kadow, C., Staneva, J., and Saynisch-Wagner, J.: Towards neural Earth system modelling by integrating artificial intelligence in Earth system science, Nature Machine Intelligence, 3, 667–674, <ext-link xlink:href="https://doi.org/10.1038/s42256-021-00374-3" ext-link-type="DOI">10.1038/s42256-021-00374-3</ext-link>, 2021.</mixed-citation></ref>
      <ref id="bib1.bibx30"><?xmltex \def\ref@label{{Krasnopolsky et~al.(2005)Krasnopolsky, Fox-Rabinovitz, and
Chalikov}}?><label>Krasnopolsky et al.(2005)Krasnopolsky, Fox-Rabinovitz, and Chalikov</label><?label krasnopolsky2005new?><mixed-citation>Krasnopolsky, V. M., Fox-Rabinovitz, M. S., and Chalikov, D. V.: New approach to calculation of atmospheric model physics: Accurate and fast neural network emulation of longwave radiation in a climate model, Mon. Weather Rev., 133, 1370–1383, <ext-link xlink:href="https://doi.org/10.1175/MWR2923.1" ext-link-type="DOI">10.1175/MWR2923.1</ext-link>, 2005.</mixed-citation></ref>
      <ref id="bib1.bibx31"><?xmltex \def\ref@label{{McGovern et~al.(2019)McGovern, Lagerquist, Gagne, Jergensen, Elmore,
Homeyer, and Smith}}?><label>McGovern et al.(2019)McGovern, Lagerquist, Gagne, Jergensen, Elmore, Homeyer, and Smith</label><?label mcgovern19?><mixed-citation>McGovern, A., Lagerquist, R., Gagne, D. J., Jergensen, G. E., Elmore, K. L., Homeyer, C. R., and Smith, T.: Making the Black Box More Transparent: Understanding the Physical Implications of Machine Learning, B. Am. Meteorol. Soc., 100, 2175–2199, <ext-link xlink:href="https://doi.org/10.1175/BAMS-D-18-0195.1" ext-link-type="DOI">10.1175/BAMS-D-18-0195.1</ext-link>, 2019.</mixed-citation></ref>
      <ref id="bib1.bibx32"><?xmltex \def\ref@label{{Meyer et~al.(2022)Meyer, Hogan, Dueben, and Mason}}?><label>Meyer et al.(2022)Meyer, Hogan, Dueben, and Mason</label><?label meyer22?><mixed-citation>Meyer, D., Hogan, R. J., Dueben, P. D., and Mason, S. L.: Machine Learning Emulation of 3D Cloud Radiative Effects, J. Adv. Model. Earth Sy., 14, e2021MS002550, <ext-link xlink:href="https://doi.org/10.1029/2021MS002550" ext-link-type="DOI">10.1029/2021MS002550</ext-link>, 2022.</mixed-citation></ref>
      <ref id="bib1.bibx33"><?xmltex \def\ref@label{{Morrison et~al.(2020)Morrison, Lier‐Walqui, Fridlind, Grabowski,
Harrington, Hoose, Korolev, Kumjian, Milbrandt, Pawlowska, Posselt, Prat,
Reimel, Shima, Diedenhoven, and Xue}}?><label>Morrison et al.(2020)Morrison, Lier‐Walqui, Fridlind, Grabowski, Harrington, Hoose, Korolev, Kumjian, Milbrandt, Pawlowska, Posselt, Prat, Reimel, Shima, Diedenhoven, and Xue</label><?label morrison_confronting_2020?><mixed-citation>Morrison, H., van Lier‐Walqui, M., Fridlind, A. M., Grabowski, W. W., Harrington, J. Y., Hoose, C., Korolev, A., Kumjian, M. R., Milbrandt, J. A., Pawlowska, H., Posselt, D. J., Prat, O. P., Reimel, K. J., Shima, S.-I., van Diedenhoven, B., and Xue, L.: Confronting the Challenge of Modeling Cloud and Precipitation Microphysics, J. Adv. Model. Earth Sy., 12, e2019MS001689, <ext-link xlink:href="https://doi.org/10.1029/2019MS001689" ext-link-type="DOI">10.1029/2019MS001689</ext-link>, 2020.</mixed-citation></ref>
      <ref id="bib1.bibx34"><?xmltex \def\ref@label{{Mu et~al.(2023)Mu, Chen, Yuan, and Qin}}?><label>Mu et al.(2023)Mu, Chen, Yuan, and Qin</label><?label fta23?><mixed-citation>Mu, B., Chen, L., Yuan, S., and Qin, B.: A radiative transfer deep learning model coupled into WRF with a generic fortran torch adaptor, Frontiers in Earth Science, 11, <ext-link xlink:href="https://doi.org/10.3389/feart.2023.1149566" ext-link-type="DOI">10.3389/feart.2023.1149566</ext-link>, 2023.</mixed-citation></ref>
      <ref id="bib1.bibx35"><?xmltex \def\ref@label{{Nowack et~al.(2018)Nowack, Braesicke, Haigh, Abraham, Pyle, and
Voulgarakis}}?><label>Nowack et al.(2018)Nowack, Braesicke, Haigh, Abraham, Pyle, and Voulgarakis</label><?label nowack18?><mixed-citation>Nowack, P., Braesicke, P., Haigh, J., Abraham, N. L., Pyle, J., and Voulgarakis, A.: Using machine learning to build temperature-based ozone parameterizations for climate sensitivity simulations, Environ. Res. Lett., 13, 104016, <ext-link xlink:href="https://doi.org/10.1088/1748-9326/aae2be" ext-link-type="DOI">10.1088/1748-9326/aae2be</ext-link>, 2018.</mixed-citation></ref>
      <ref id="bib1.bibx36"><?xmltex \def\ref@label{{Ott et~al.(2020)Ott, Pritchard, Best, Linstead, Curcic, and
Baldi}}?><label>Ott et al.(2020)Ott, Pritchard, Best, Linstead, Curcic, and Baldi</label><?label ott2020?><mixed-citation>Ott, J., Pritchard, M., Best, N., Linstead, E., Curcic, M., and Baldi, P.: A Fortran-Keras Deep Learning Bridge for Scientific Computing, Scientific Programming, 2020, 8888811, <ext-link xlink:href="https://doi.org/10.1155/2020/8888811" ext-link-type="DOI">10.1155/2020/8888811</ext-link>, 2020.</mixed-citation></ref>
      <ref id="bib1.bibx37"><?xmltex \def\ref@label{{Partee et~al.(2022)Partee, Ellis, Rigazzi, Shao, Bachman, Marques,
and Robbins}}?><label>Partee et al.(2022)Partee, Ellis, Rigazzi, Shao, Bachman, Marques, and Robbins</label><?label partee22?><mixed-citation>Partee, S., Ellis, M., Rigazzi, A., Shao, A. E., Bachman, S., Marques, G., and Robbins, B.: Using Machine Learning at scale in numerical simulations with SmartSim: An application to ocean climate modeling, J. Comput. Sci.-Neth., 62, 101707, <ext-link xlink:href="https://doi.org/10.1016/j.jocs.2022.101707" ext-link-type="DOI">10.1016/j.jocs.2022.101707</ext-link>, 2022.</mixed-citation></ref>
      <ref id="bib1.bibx38"><?xmltex \def\ref@label{{Paszke et~al.(2019)Paszke, Gross, Massa, Lerer, Bradbury, Chanan,
Killeen, Lin, Gimelshein, Antiga, Desmaison, Kopf, Yang, DeVito, Raison,
Tejani, Chilamkurthy, Steiner, Fang, Bai, and Chintala}}?><label>Paszke et al.(2019)Paszke, Gross, Massa, Lerer, Bradbury, Chanan, Killeen, Lin, Gimelshein, Antiga, Desmaison, Kopf, Yang, DeVito, Raison, Tejani, Chilamkurthy, Steiner, Fang, Bai, and Chintala</label><?label pytorch_neurips19?><mixed-citation>Paszke, A., Gross, S., Massa, F., Lerer, A., Bradbury, J., Chanan, G., Killeen, T., Lin, Z., Gimelshein, N., Antiga, L., Desmaison, A., Kopf, A., Yang, E., DeVito, Z., Raison, M., Tejani, A., Chilamkurthy, S., Steiner, B., Fang, L., Bai, J., and Chintala, S.: PyTorch: An Imperative Style, High-Performance Deep Learning Library, in: Advances in Neural Information Processing Systems 32, Vancouver, Canada, 8–14 December 2019, Curran Associates, Inc., 8024–8035, <ext-link xlink:href="http://papers.neurips.cc/paper/9015-pytorch-an-imperative-style-high-performance-deep-learning-library.pdf">http://papers.neurips.cc/paper/9015-pytorch-an-imperative-style-high-performance-deep-learning-library.pdf</ext-link> (last access: 6 September 2023), 2019.</mixed-citation></ref>
      <ref id="bib1.bibx39"><?xmltex \def\ref@label{{Qu and Shi(2023)}}?><label>Qu and Shi(2023)</label><?label qu23?><mixed-citation>Qu, Y. and Shi, X.: Can a Machine Learning–Enabled Numerical Model Help Extend Effective Forecast Range through Consistently Trained Subgrid-Scale Models?, Artif. Intell. Earth Syst., 2, e220050, <ext-link xlink:href="https://doi.org/10.1175/AIES-D-22-0050.1" ext-link-type="DOI">10.1175/AIES-D-22-0050.1</ext-link>, 2023.</mixed-citation></ref>
      <ref id="bib1.bibx40"><?xmltex \def\ref@label{{Rasp(2020)}}?><label>Rasp(2020)</label><?label rasp20?><mixed-citation>Rasp, S.: Coupled online learning as a way to tackle instabilities and biases in neural network parameterizations: general algorithms and Lorenz 96 case study (v1.0), Geosci. Model Dev., 13, 2185–2196, <ext-link xlink:href="https://doi.org/10.5194/gmd-13-2185-2020" ext-link-type="DOI">10.5194/gmd-13-2185-2020</ext-link>, 2020.</mixed-citation></ref>
      <ref id="bib1.bibx41"><?xmltex \def\ref@label{{Rasp et~al.(2018)Rasp, Pritchard, and Gentine}}?><label>Rasp et al.(2018)Rasp, Pritchard, and Gentine</label><?label rasp18?><mixed-citation>Rasp, S., Pritchard, M. S., and Gentine, P.: Deep learning to represent subgrid processes in climate models, P. Natl. Acad. Sci. USA, 115, 9684–9689, <ext-link xlink:href="https://doi.org/10.1073/pnas.1810286115" ext-link-type="DOI">10.1073/pnas.1810286115</ext-link>, 2018.</mixed-citation></ref>
      <ref id="bib1.bibx42"><?xmltex \def\ref@label{{Rigo and Fijalkowski(2018)}}?><label>Rigo and Fijalkowski(2018)</label><?label cffi_software?><mixed-citation>Rigo, A. and Fijalkowski, M.: C Foreign Function Interface for Python, CFFI [code], <uri>https://cffi.readthedocs.io/en/release-1.14/</uri> (last access: 6 September 2023), 2018.</mixed-citation></ref>
      <ref id="bib1.bibx43"><?xmltex \def\ref@label{{Berkeley Lab(2023)}}?><label>Berkeley Lab(2023)</label><?label inferenceengine_github?><mixed-citation>Berkeley Lab: Inference Engine (v0.10.0), GitHub [code], <uri>https://github.com/BerkeleyLab/inference-engine/</uri>, 2023.</mixed-citation></ref>
      <ref id="bib1.bibx44"><?xmltex \def\ref@label{{Seifert and Beheng(2006)}}?><label>Seifert and Beheng(2006)</label><?label seifert2006?><mixed-citation>Seifert, A. and Beheng, K.: A two-moment cloud microphysics parameterization for mixed-phase clouds. Part 1: Model description, Meteorol. Atmos. Phys., 92, 45–66, <ext-link xlink:href="https://doi.org/10.1007/s00703-005-0112-4" ext-link-type="DOI">10.1007/s00703-005-0112-4</ext-link>, 2006.</mixed-citation></ref>
      <ref id="bib1.bibx45"><?xmltex \def\ref@label{{Seifert and Beheng(2001)}}?><label>Seifert and Beheng(2001)</label><?label Seifert_Beheng_2001?><mixed-citation>Seifert, A. and Beheng, K. D.: A double-moment parameterization for simulating autoconversion, accretion and selfcollection, Atmos. Res., 59–60, 265–281, <ext-link xlink:href="https://doi.org/10.1016/s0169-8095(01)00126-0" ext-link-type="DOI">10.1016/s0169-8095(01)00126-0</ext-link>, 2001.</mixed-citation></ref>
      <ref id="bib1.bibx46"><?xmltex \def\ref@label{{Seifert and Rasp(2020)}}?><label>Seifert and Rasp(2020)</label><?label seifert_potential_2020?><mixed-citation>Seifert, A. and Rasp, S.: Potential and Limitations of Machine Learning for Modeling Warm-Rain Cloud Microphysical Processes, J. Adv. Model. Earth Sy., 12, e2020MS002301, <ext-link xlink:href="https://doi.org/10.1029/2020MS002301" ext-link-type="DOI">10.1029/2020MS002301</ext-link>, 2020.</mixed-citation></ref>
      <ref id="bib1.bibx47"><?xmltex \def\ref@label{{Sharma and Greenberg(2024)}}?><label>Sharma and Greenberg(2024)</label><?label sharma_superdropnet_2024?><mixed-citation>Sharma, S. and Greenberg, D.: SuperdropNet: a Stable and Accurate Machine Learning Proxy for Droplet-based Cloud Microphysics, arXiv [preprint], <ext-link xlink:href="https://doi.org/10.48550/arXiv.2402.18354" ext-link-type="DOI">10.48550/arXiv.2402.18354</ext-link>, 2024.</mixed-citation></ref>
      <ref id="bib1.bibx48"><?xmltex \def\ref@label{{Shima et~al.(2009)Shima, Kusano, Kawano, Sugiyama, and
Kawahara}}?><label>Shima et al.(2009)Shima, Kusano, Kawano, Sugiyama, and Kawahara</label><?label shima_super-droplet_2009?><mixed-citation>Shima, S., Kusano, K., Kawano, A., Sugiyama, T., and Kawahara, S.: The super-droplet method for the numerical simulation of clouds and precipitation: a particle-based and probabilistic microphysics model coupled with a non-hydrostatic model, Q. J. Roy. Meteor. Soc., 135, 1307–1320, <ext-link xlink:href="https://doi.org/10.1002/qj.441" ext-link-type="DOI">10.1002/qj.441</ext-link>, 2009.</mixed-citation></ref>
      <?pagebreak page4029?><ref id="bib1.bibx49"><?xmltex \def\ref@label{{Sonnewald et~al.(2021)Sonnewald, Lguensat, Jones, Dueben, Brajard,
and Balaji}}?><label>Sonnewald et al.(2021)Sonnewald, Lguensat, Jones, Dueben, Brajard, and Balaji</label><?label sonnewald21?><mixed-citation>Sonnewald, M., Lguensat, R., Jones, D. C., Dueben, P. D., Brajard, J., and Balaji, V.: Bridging observations, theory and numerical simulation of the ocean using machine learning, Environ. Res. Lett., 16, 073008, <ext-link xlink:href="https://doi.org/10.1088/1748-9326/ac0eb0" ext-link-type="DOI">10.1088/1748-9326/ac0eb0</ext-link>, 2021.</mixed-citation></ref>
      <ref id="bib1.bibx50"><?xmltex \def\ref@label{{Yuval and O’Gorman(2023)}}?><label>Yuval and O’Gorman(2023)</label><?label yuval23?><mixed-citation>Yuval, J. and O’Gorman, P. A.: Neural-Network Parameterization of Subgrid Momentum Transport in the Atmosphere, J. Adv. Model. Earth Sy., 15, e2023MS003606, <ext-link xlink:href="https://doi.org/10.1029/2023MS003606" ext-link-type="DOI">10.1029/2023MS003606</ext-link>, 2023. </mixed-citation></ref><?xmltex \hack{\newpage}?>
      <ref id="bib1.bibx51"><?xmltex \def\ref@label{{Yuval et~al.(2021)Yuval, O'Gorman, and Hill}}?><label>Yuval et al.(2021)Yuval, O'Gorman, and Hill</label><?label yuval21?><mixed-citation>Yuval, J., O'Gorman, P. A., and Hill, C. N.: Use of Neural Networks for Stable, Accurate and Physically Consistent Parameterization of Subgrid Atmospheric Processes With Good Performance at Reduced Precision, Geophys. Res. Lett., 48, e2020GL091363, <ext-link xlink:href="https://doi.org/10.1029/2020GL091363" ext-link-type="DOI">10.1029/2020GL091363</ext-link>, 2021.</mixed-citation></ref>
      <ref id="bib1.bibx52"><?xmltex \def\ref@label{{Zhong et~al.(2023)Zhong, Ma, Yao, Xu, Wu, and Wang}}?><label>Zhong et al.(2023)Zhong, Ma, Yao, Xu, Wu, and Wang</label><?label zhong23?><mixed-citation>Zhong, X., Ma, Z., Yao, Y., Xu, L., Wu, Y., and Wang, Z.: WRF–ML v1.0: a bridge between WRF v4.3 and machine learning parameterizations and its application to atmospheric radiative transfer, Geosci. Model Dev., 16, 199–209, <ext-link xlink:href="https://doi.org/10.5194/gmd-16-199-2023" ext-link-type="DOI">10.5194/gmd-16-199-2023</ext-link>, 2023.</mixed-citation></ref>

  </ref-list></back>
</article>
