https://doi.org/10.5194/gmd-2022-264
Submitted as: development and technical paper | 19 Dec 2022
Status: this preprint is currently under review for the journal GMD.

A learning-based method for efficient large-scale sensitivity analysis and tuning of single column atmosphere model (SCAM)

Jiaxu Guo, Yidan Xu, Haohuan Fu, Wei Xue, Lanning Wang, Lin Gan, Xianwei Wu, Liang Hu, Gaochao Xu, and Xilong Che

Abstract. The Single Column Atmospheric Model (SCAM) is an essential tool for analyzing and improving the physics schemes of the Community Atmosphere Model (CAM). Although it greatly reduces the computational cost compared with the complete CAM, the exponentially growing parameter space still makes a combined analysis or tuning of multiple parameters difficult. In this paper, we propose a hybrid framework that combines parallel execution with a learning-based surrogate model to support large-scale sensitivity analysis (SA) and tuning of combinations of multiple parameters. We start with a workflow (with modifications to the original SCAM) that supports the execution and assembly of a large number of sampling, sensitivity analysis, and tuning tasks. By reusing 3,840 model instances spanning variations of 11 parameters, we train a neural network (NN) based surrogate model that achieves both accuracy and efficiency, reducing the computational cost by several orders of magnitude. The improved balance between cost and accuracy enables us to integrate NN-based grid search into traditional optimization methods, achieving better optimization results with fewer compute cycles. Using this hybrid framework, we explore the joint sensitivity of multi-parameter combinations across multiple cases using sets of three parameters, identify the most sensitive three-parameter combination out of the eleven, and perform a tuning process that reduces the precipitation error by 5 % to 15 % in different cases.
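The surrogate-plus-grid-search idea described in the abstract can be sketched as follows. This is a minimal illustrative example, not the authors' actual SCAM setup: the toy error function, parameter ranges, sample count, and network size are all assumptions standing in for the expensive SCAM runs and the real 11-parameter space.

```python
import numpy as np

rng = np.random.default_rng(0)

def scam_error(p):
    # Hypothetical stand-in for the expensive SCAM precipitation error;
    # a simple quadratic bowl with its optimum at p = (0.3, 0.3, 0.3).
    return np.sum((p - 0.3) ** 2, axis=-1)

# 1. Sample the (here 3-dimensional, unit-cube) parameter space and run
#    the "expensive" model once per sample to build a training set.
X = rng.random((512, 3))
y = scam_error(X)

# 2. Train a tiny one-hidden-layer NN surrogate with full-batch
#    gradient descent on the mean-squared error.
W1 = rng.normal(0.0, 0.5, (3, 32)); b1 = np.zeros(32)
W2 = rng.normal(0.0, 0.5, (32, 1)); b2 = np.zeros(1)
lr = 0.05
for _ in range(2000):
    h = np.tanh(X @ W1 + b1)            # hidden activations
    pred = (h @ W2 + b2).ravel()        # surrogate prediction
    g = 2.0 * (pred - y) / len(y)       # d(MSE)/d(pred)
    gW2 = h.T @ g[:, None]; gb2 = g.sum(keepdims=True)
    gh = (g[:, None] @ W2.T) * (1.0 - h ** 2)
    gW1 = X.T @ gh; gb1 = gh.sum(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

# 3. Grid-search the cheap surrogate instead of the model itself:
#    evaluating 21**3 candidate points costs one matrix multiply.
axes = [np.linspace(0.0, 1.0, 21)] * 3
grid = np.stack(np.meshgrid(*axes), axis=-1).reshape(-1, 3)
surrogate_pred = (np.tanh(grid @ W1 + b1) @ W2 + b2).ravel()
best = grid[np.argmin(surrogate_pred)]
print("best candidate:", best, "true error:", scam_error(best))
```

The candidate returned by the surrogate search could then be verified (or refined) with a small number of genuine SCAM runs, which is the cost/accuracy trade-off the hybrid framework exploits.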

Jiaxu Guo et al.

Status: final response (author comments only)

Comment types: AC – author | RC – referee | CC – community | EC – editor | CEC – chief editor
  • RC1: 'Comment on gmd-2022-264', Anonymous Referee #1, 14 Jan 2023
    • AC1: 'Reply on RC1', Jiaxu Guo, 18 Apr 2023
  • RC2: 'Comment on gmd-2022-264', Anonymous Referee #2, 17 Mar 2023
    • AC2: 'Reply on RC2', Jiaxu Guo, 18 Apr 2023
  • AC3: 'Final author comment on gmd-2022-264', Jiaxu Guo, 18 Apr 2023


Viewed

Total article views: 677 (including HTML, PDF, and XML)
HTML  PDF  XML  Total  BibTeX  EndNote
559   101  17   677    5       4
Views and downloads (calculated since 19 Dec 2022)

Viewed (geographical distribution)

Total article views: 648 (including HTML, PDF, and XML) Thereof 648 with geography defined and 0 with unknown origin.
Latest update: 24 Sep 2023
Short summary
To further improve the efficiency of experiments using SCAM, we train a neural network based surrogate model to support large-scale sensitivity analysis and tuning of combinations of multiple parameters. Using a hybrid method, we explore the joint sensitivity of multi-parameter combinations on typical cases, identify the most sensitive three-parameter combination out of eleven, and perform a tuning process that reduces the precipitation error in these cases.