https://doi.org/10.5194/gmd-2022-264
Submitted as: development and technical paper | 19 Dec 2022
Status: this preprint is currently under review for the journal GMD.

A learning-based method for efficient large-scale sensitivity analysis and tuning of single column atmosphere model (SCAM)

Jiaxu Guo1,6, Yidan Xu5,7, Haohuan Fu2,6, Wei Xue3,6, Lanning Wang4,6, Lin Gan3,6, Xianwei Wu1,6, Liang Hu1, Gaochao Xu1, and Xilong Che1
  • 1College of Computer Science and Technology, Jilin University, Changchun, China
  • 2Department of Earth System Science, Ministry of Education Key Laboratory for Earth System Modeling, Tsinghua University, Beijing, China
  • 3Department of Computer Science and Technology, Tsinghua University, Beijing, China
  • 4College of Global Change and Earth System Science, Beijing Normal University, Beijing, China
  • 5China Reinsurance (Group) Corporation, Beijing, China
  • 6National Supercomputing Center in Wuxi, Wuxi, China
  • 7School of Environment and Nature Resources, Renmin University of China, Beijing, China

Abstract. The Single Column Atmospheric Model (SCAM) is an essential tool for analyzing and improving the physics schemes of CAM. Although it already reduces the computational cost substantially compared with a complete CAM, the exponentially growing parameter space makes a combined analysis or tuning of multiple parameters difficult. In this paper, we propose a hybrid framework that combines parallel execution with a learning-based surrogate model to support large-scale sensitivity analysis (SA) and tuning of combinations of multiple parameters. We start with a workflow (with modifications to the original SCAM) that supports the execution and assembly of a large number of sampling, sensitivity analysis, and tuning tasks. By reusing the 3,840 instances generated by varying 11 parameters, we train a neural network (NN) based surrogate model that achieves both accuracy and efficiency (with the computational cost reduced by several orders of magnitude). The improved balance between cost and accuracy enables us to integrate NN-based grid search into traditional optimization methods to achieve better optimization results with fewer compute cycles. Using this hybrid framework, we explore the joint sensitivity of three-parameter combinations across multiple cases, identify the most sensitive three-parameter combination out of the eleven parameters, and perform a tuning process that reduces the precipitation error by 5 % to 15 % in different cases.
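The sketch below illustrates the kind of surrogate-plus-grid-search step described in the abstract. It is not the authors' implementation: the file names (scam_param_samples.npy, scam_run_errors.npy), the MLP architecture, the error metric, and the choice of the first three columns as the tuned triple are placeholder assumptions; only the overall pattern (train an NN on the 3,840 perturbed-parameter runs, then search a dense parameter grid on the cheap surrogate) follows the abstract.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import MinMaxScaler

# Hypothetical inputs: X has shape (3840, 11), one row per SCAM run and one
# column per perturbed parameter; y holds the skill metric for each run
# (e.g. a precipitation error against observations).
X = np.load("scam_param_samples.npy")   # placeholder file names
y = np.load("scam_run_errors.npy")

# Train the NN surrogate on the existing ensemble; evaluating it later costs
# microseconds per point instead of a full SCAM integration.
scaler = MinMaxScaler().fit(X)
surrogate = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000,
                         random_state=0).fit(scaler.transform(X), y)

# Grid search over one three-parameter subset (here simply the first three
# columns), holding the remaining eight parameters at their sample mean.
grid_axes = [np.linspace(lo, hi, 50)
             for lo, hi in zip(X.min(axis=0)[:3], X.max(axis=0)[:3])]
grid = np.array(np.meshgrid(*grid_axes)).reshape(3, -1).T   # 125,000 points
candidates = np.tile(X.mean(axis=0), (grid.shape[0], 1))
candidates[:, :3] = grid
predicted_error = surrogate.predict(scaler.transform(candidates))
best = candidates[np.argmin(predicted_error)]
print("surrogate-suggested values for the tuned triple:", best[:3])
```

In a hybrid setup such as the one described in the abstract, a triple suggested by the surrogate would then be verified, and if necessary refined, with a small number of actual SCAM runs.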


Discussion status: open for interactive comments (until 13 Feb 2023)



Viewed

Total article views: 310 (including HTML, PDF, and XML), cumulative and calculated since 19 Dec 2022:
  • HTML: 268
  • PDF: 35
  • XML: 7
  • BibTeX: 3
  • EndNote: 3

Viewed (geographical distribution)

Total article views: 295 (including HTML, PDF, and XML), thereof 295 with geography defined and 0 with unknown origin.
Latest update: 28 Jan 2023
Short summary
To further improve the efficiency of experiments using SCAM, we train a neural-network-based surrogate model to support large-scale sensitivity analysis and tuning of combinations of multiple parameters. Using this hybrid method, we explore the joint sensitivity of multi-parameter combinations in typical cases, identify the most sensitive three-parameter combination out of the eleven parameters, and perform a tuning process that reduces the precipitation error in these cases.
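Continuing the placeholder sketch above (it reuses X, scaler, and surrogate), one plausible way to screen all 165 three-parameter combinations on the surrogate is a variance-based (Sobol) analysis per subset. The SALib calls, the placeholder parameter names, and the "sum of total effects" score are illustrative assumptions, not necessarily the sensitivity metric used in the paper.

```python
from itertools import combinations

import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

# `X`, `scaler`, and `surrogate` come from the previous sketch; the parameter
# names below are placeholders, not the actual CAM parameter names.
param_names = [f"p{i}" for i in range(11)]
bounds = [[lo, hi] for lo, hi in zip(X.min(axis=0), X.max(axis=0))]

def joint_sensitivity(triple):
    """Sum of total Sobol effects for a three-parameter subset, with the
    remaining parameters held at their sample mean."""
    problem = {"num_vars": 3,
               "names": [param_names[i] for i in triple],
               "bounds": [bounds[i] for i in triple]}
    sample = saltelli.sample(problem, 1024)          # 8,192 surrogate calls
    full = np.tile(X.mean(axis=0), (sample.shape[0], 1))
    full[:, list(triple)] = sample
    error = surrogate.predict(scaler.transform(full))
    return sobol.analyze(problem, error)["ST"].sum()

scores = {t: joint_sensitivity(t) for t in combinations(range(11), 3)}
print("most sensitive three-parameter combination:",
      max(scores, key=scores.get))
```

Because every evaluation runs on the surrogate rather than on SCAM itself, the full screen over all combinations takes minutes on a workstation, which is what makes the large-scale joint sensitivity analysis described in the summary feasible.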