Submitted as: model experiment description paper | 31 May 2021

Review status: this preprint is currently under review for the journal GMD.

Robustness of Neural Network Emulations of Radiative Transfer Parameterizations in a State-of-the-Art General Circulation Model

Alexei Belochitski1,2 and Vladimir Krasnopolsky2
  • 1IMSG, Rockville, MD 20852, USA
  • 2NOAA/NWS/NCEP/EMC, College Park, MD 20740, USA

Abstract. The ability of machine-learning (ML) based model components to generalize to previously unseen inputs, and the resulting stability of models that use these components, has been receiving a lot of attention recently, especially when it comes to ML-based parameterizations. At the same time, ML-based emulators of existing parameterizations can be stable, accurate, and fast when used in the model they were specifically designed for. In this work we show that shallow-neural-network-based emulators of radiative transfer parameterizations, developed almost a decade ago for a state-of-the-art GCM, are robust with respect to substantial structural and parametric changes in the host model: when used in two seven-month-long experiments with the new model, they not only remain stable but also generate realistic output. Aspects of neural network architecture and training set design potentially contributing to the stability of ML-based model components are discussed.
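As an illustrative sketch only, the shallow architecture referred to in the abstract reduces to a single nonlinear hidden layer followed by a linear output layer. All dimensions, weights, and the input below are hypothetical placeholders, not the configuration of the emulators described in the paper:

```python
import numpy as np

def shallow_nn_emulator(x, W1, b1, W2, b2):
    """Forward pass of a single-hidden-layer (shallow) neural network:
    a tanh hidden layer followed by a linear output layer — the generic
    architecture used for fast emulation of a physical parameterization."""
    h = np.tanh(x @ W1 + b1)   # hidden-layer activations
    return h @ W2 + b2         # linear output (e.g., a heating-rate profile)

# Illustrative sizes and random weights only (hypothetical, untrained):
rng = np.random.default_rng(0)
n_in, n_hidden, n_out = 10, 50, 5
W1 = rng.standard_normal((n_in, n_hidden)) * 0.1
b1 = np.zeros(n_hidden)
W2 = rng.standard_normal((n_hidden, n_out)) * 0.1
b2 = np.zeros(n_out)

x = rng.standard_normal(n_in)            # one hypothetical input profile
y = shallow_nn_emulator(x, W1, b1, W2, b2)
print(y.shape)  # (5,)
```

In practice such an emulator is trained on input/output pairs generated by the original parameterization; the sketch above shows only the inference-time structure.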

Status: final response (author comments only)

Comment types: AC – author | RC – referee | CC – community | EC – editor | CEC – chief editor
  • RC1: 'Comment on gmd-2021-114', Edoardo Bucchignani, 23 Jun 2021
  • RC2: 'Comment on gmd-2021-114', Anonymous Referee #2, 28 Jun 2021
  • AC1: 'Author's Comments - Reply to reviewers', Alexei Belochitski, 03 Sep 2021


Total article views: 377 (including HTML, PDF, and XML)
  • HTML: 267
  • PDF: 101
  • XML: 9
  • BibTeX: 0
  • EndNote: 1
Cumulative views and downloads (calculated since 31 May 2021)

Viewed (geographical distribution)

Total article views: 285 (including HTML, PDF, and XML), of which 285 with geography defined and 0 with unknown origin.
Latest update: 18 Sep 2021
Short summary
There is a lot of interest in using machine learning (ML) techniques to improve environmental models by replacing physically based model components with ML-derived ones. The latter ordinarily demonstrate excellent results when tested in a stand-alone setting but can break their host model, either outright when coupled to it or eventually when the model changes. We built an ML component that not only does not destabilize its host model but is also robust with respect to substantial changes in it.