Preprints
https://doi.org/10.5194/gmd-2022-264
Submitted as: development and technical paper | 19 Dec 2022
Status: a revised version of this preprint was accepted for the journal GMD and is expected to appear here in due course.

A learning-based method for efficient large-scale sensitivity analysis and tuning of single column atmosphere model (SCAM)

Jiaxu Guo, Yidan Xu, Haohuan Fu, Wei Xue, Lanning Wang, Lin Gan, Xianwei Wu, Liang Hu, Gaochao Xu, and Xilong Che

Abstract. The Single Column Atmosphere Model (SCAM) is an essential tool for analyzing and improving the physics schemes of the Community Atmosphere Model (CAM). Although it already greatly reduces the computational cost relative to a complete CAM run, the exponentially growing parameter space makes a combined analysis or tuning of multiple parameters difficult. In this paper, we propose a hybrid framework that combines parallel execution with a learning-based surrogate model to support large-scale sensitivity analysis (SA) and tuning of combinations of multiple parameters. We start with a workflow (with modifications to the original SCAM) that supports the execution and assembly of a large number of sampling, sensitivity analysis, and tuning tasks. By reusing 3,840 instances that vary 11 parameters, we train a neural network (NN) based surrogate model that achieves both accuracy and efficiency (reducing the computational cost by several orders of magnitude). The improved balance between cost and accuracy enables us to integrate NN-based grid search into traditional optimization methods and achieve better optimization results with fewer compute cycles. Using this hybrid framework, we explore the joint sensitivity of three-parameter combinations across multiple cases, identify the most sensitive three-parameter combination out of the eleven parameters, and perform a tuning process that reduces the precipitation error by 5 % to 15 % in different cases.
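The hybrid workflow described in the abstract can be pictured, in much simplified form, as three steps: (1) sample the parameter space and run SCAM on each sample, (2) train a neural-network surrogate on the resulting (parameters, error) pairs, and (3) grid-search the cheap surrogate to propose candidate parameter settings. The sketch below is illustrative only: the SCAM runs are replaced by a synthetic placeholder function, and the parameter names, ranges, network size, and grid resolution are assumptions made for illustration, not the settings used in the paper.

    import numpy as np
    from itertools import product
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)

    # Hypothetical parameter names and ranges (stand-ins, not the paper's 11 parameters).
    param_bounds = {
        "zmconv_tau":    (1800.0, 14400.0),
        "zmconv_c0":     (0.001, 0.01),
        "cldfrc_rhminl": (0.80, 0.99),
    }
    names = list(param_bounds)
    lo = np.array([param_bounds[n][0] for n in names])
    hi = np.array([param_bounds[n][1] for n in names])

    def run_scam_placeholder(x):
        # Placeholder for a real SCAM run: returns a scalar precipitation error
        # for one parameter vector (here a synthetic quadratic bowl plus noise).
        z = (x - lo) / (hi - lo)
        return float(np.sum((z - 0.4) ** 2) + 0.01 * rng.standard_normal())

    # Step 1: sample the parameter space and evaluate the (placeholder) model.
    X = rng.uniform(lo, hi, size=(512, len(names)))
    y = np.array([run_scam_placeholder(x) for x in X])

    # Step 2: train a surrogate that maps normalized parameters to the error metric.
    Xn = (X - lo) / (hi - lo)
    surrogate = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=5000,
                             random_state=0).fit(Xn, y)

    # Step 3: grid-search the cheap surrogate instead of the expensive model.
    grid_1d = np.linspace(0.0, 1.0, 25)
    candidates = np.array(list(product(grid_1d, repeat=len(names))))
    pred = surrogate.predict(candidates)
    best = lo + candidates[np.argmin(pred)] * (hi - lo)
    print("best candidate:", dict(zip(names, np.round(best, 4))))
    print("predicted error:", float(pred.min()))

In practice, candidates proposed on the surrogate would still need to be verified with real SCAM runs; that trade-off between cheap surrogate evaluations and expensive verification runs is the cost-accuracy balance the abstract refers to.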

Status: closed

Comment types: AC – author | RC – referee | CC – community | EC – editor | CEC – chief editor
  • RC1: 'Comment on gmd-2022-264', Anonymous Referee #1, 14 Jan 2023
    • AC1: 'Reply on RC1', Jiaxu Guo, 18 Apr 2023
  • RC2: 'Comment on gmd-2022-264', Anonymous Referee #2, 17 Mar 2023
    • AC2: 'Reply on RC2', Jiaxu Guo, 18 Apr 2023
  • AC3: 'Final author comment on gmd-2022-264', Jiaxu Guo, 18 Apr 2023

Viewed

Total article views: 844 (including HTML, PDF, and XML)
  • HTML: 658
  • PDF: 157
  • XML: 29
  • Total: 844
  • BibTeX: 16
  • EndNote: 18
Cumulative views and downloads (calculated since 19 Dec 2022)

Viewed (geographical distribution)

Total article views: 810 (including HTML, PDF, and XML); of these, 810 have a defined geographical origin and 0 are of unknown origin.
Latest update: 03 Apr 2024
Short summary
To further improve the efficiency of experiments with SCAM, we train a neural-network-based surrogate model that supports large-scale sensitivity analysis and tuning of combinations of multiple parameters. Using this hybrid method, we explore the joint sensitivity of multi-parameter combinations in typical cases, identify the most sensitive three-parameter combination out of eleven parameters, and perform a tuning process that reduces the precipitation error in these cases.