https://doi.org/10.5194/gmd-2021-114

Submitted as: model experiment description paper | 31 May 2021

Review status: this preprint is currently under review for the journal GMD.

Robustness of Neural Network Emulations of Radiative Transfer Parameterizations in a State-of-the-Art General Circulation Model

Alexei Belochitski1,2 and Vladimir Krasnopolsky2
  • 1IMSG, Rockville, MD 20852, USA
  • 2NOAA/NWS/NCEP/EMC, College Park, MD 20740, USA

Abstract. The ability of machine-learning (ML) based model components to generalize to previously unseen inputs, and the resulting stability of the models that use these components, has received considerable attention recently, especially in the context of ML-based parameterizations. At the same time, ML-based emulators of existing parameterizations can be stable, accurate, and fast when used in the model they were specifically designed for. In this work we show that shallow-neural-network-based emulators of radiative transfer parameterizations, developed almost a decade ago for a state-of-the-art general circulation model (GCM), are robust with respect to substantial structural and parametric changes in the host model: when used in two seven-month-long experiments with the new model, they not only remain stable but also generate realistic output. Aspects of neural network architecture and training set design that potentially contribute to the stability of ML-based model components are discussed.
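For readers unfamiliar with the kind of emulator discussed here, the sketch below illustrates the general form of a shallow-network emulator: a single hidden layer with tanh activations and a linear output layer, mapping a vector of atmospheric-column inputs to a vector of radiative outputs. All layer sizes, weights, and function names are illustrative assumptions for this sketch, not the configuration used in the paper.

```python
import numpy as np

def emulate_radiation(x, W1, b1, W2, b2):
    """Shallow-NN emulator sketch: map normalized column inputs (e.g.,
    temperature, humidity, cloud profiles) to outputs (e.g., a heating-rate
    profile and fluxes) with one tanh hidden layer and a linear output."""
    h = np.tanh(W1 @ x + b1)   # single hidden layer
    return W2 @ h + b2         # linear output layer

# Example with made-up dimensions: 600 inputs, 100 hidden neurons, 70 outputs.
rng = np.random.default_rng(0)
n_in, n_hid, n_out = 600, 100, 70
W1 = rng.normal(scale=0.01, size=(n_hid, n_in))
b1 = np.zeros(n_hid)
W2 = rng.normal(scale=0.01, size=(n_out, n_hid))
b2 = np.zeros(n_out)

x = rng.normal(size=n_in)                      # one normalized column
y = emulate_radiation(x, W1, b1, W2, b2)
print(y.shape)                                 # (70,)
```

In practice the weights would be obtained by training on input/output pairs generated by the original radiative transfer parameterization; the random weights above merely make the sketch runnable.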

Status: open (until 26 Jul 2021)
Short summary
There is a lot of interest in using machine learning (ML) techniques to improve environmental models by replacing physically based model components with ML-derived ones. The latter ordinarily demonstrate excellent results when tested in a stand-alone setting, but they can break their host model, either outright when coupled to it or eventually when the model changes. We built an ML component that not only does not destabilize its host model but is also robust with respect to substantial changes in it.