Preprints
https://doi.org/10.5194/gmd-2021-405
Submitted as: model description paper | 22 Dec 2021
Status: a revised version of this preprint was accepted for the journal GMD.

Rad-cGAN v1.0: Radar-based precipitation nowcasting model with conditional Generative Adversarial Networks for multiple domains

Suyeon Choi and Yeonjoo Kim
  • Department of Civil and Environmental Engineering, Yonsei University, Seoul 03722, Korea

Abstract. Numerical weather prediction models and probabilistic extrapolation methods using radar images have been widely used for precipitation nowcasting. Recently, machine-learning-based precipitation nowcasting models have also been actively developed for relatively short-term precipitation prediction. This study aimed to develop a radar-based precipitation nowcasting model using an advanced machine learning technique, the conditional generative adversarial network (cGAN), which performs well in image generation tasks. The cGAN-based precipitation nowcasting model developed in this study, named Rad-cGAN, was trained on radar reflectivity maps of the Soyang-gang Dam region in South Korea with a spatial domain of 128 × 128 km, a spatial resolution of 1 km, and a temporal resolution of 10 min. The model performance was evaluated against previously developed machine-learning-based precipitation nowcasting models, namely convolutional long short-term memory (ConvLSTM) and U-Net (RainNet), as well as the baseline Eulerian persistence model. We demonstrated that Rad-cGAN outperformed the other models not only at the chosen site but also across the entire domain of the Soyang-gang Dam region. Additionally, the proposed model maintained good performance for lead times up to 80 min based on the critical success index at an intensity threshold of 0.1 mm h−1, whereas RainNet and ConvLSTM achieved lead times of only 70 and 40 min, respectively. We also demonstrated the successful implementation of transfer learning to efficiently train the model with data from other dam regions in South Korea, namely the Andong and Chungju Dam regions, using a model pre-trained on the Soyang-gang Dam region. This study confirms that Rad-cGAN can be successfully applied to precipitation nowcasting with longer lead times and that, with the transfer learning approach, it performs well in regions other than the one on which it was originally trained.
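The critical success index (CSI) used above to compare lead times can be computed per pixel over a forecast field. The following is a minimal sketch of that metric; the function name, array layout, and example values are ours for illustration, not taken from the paper's code:

```python
import numpy as np

def csi(pred, obs, threshold=0.1):
    """Critical success index of a rain field at a rain-rate threshold (mm h^-1).

    A pixel is a "hit" if both forecast and observation exceed the threshold,
    a "miss" if only the observation does, and a "false alarm" if only the
    forecast does. CSI = hits / (hits + misses + false alarms).
    """
    p = pred >= threshold
    o = obs >= threshold
    hits = np.logical_and(p, o).sum()
    misses = np.logical_and(~p, o).sum()
    false_alarms = np.logical_and(p, ~o).sum()
    denom = hits + misses + false_alarms
    # Undefined when neither field exceeds the threshold anywhere.
    return hits / denom if denom > 0 else np.nan

# Toy 2x2 example: one hit, one miss, one false alarm -> CSI = 1/3.
pred = np.array([[0.5, 0.0], [0.2, 0.0]])
obs = np.array([[0.5, 0.3], [0.0, 0.0]])
print(csi(pred, obs))
```

A lead-time curve like the one reported in the abstract would then be obtained by evaluating this score separately for each forecast horizon (10 min, 20 min, …) and finding the longest lead time at which it stays above a chosen skill level.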

Status: final response (author comments only)

Comment types: AC – author | RC – referee | CC – community | EC – editor | CEC – chief editor
  • RC1: 'Comment on gmd-2021-405', Anonymous Referee #1, 18 Jan 2022
  • RC2: 'Comment on gmd-2021-405', Anonymous Referee #2, 24 Jan 2022
  • RC3: 'Comment on gmd-2021-405', Anonymous Referee #3, 26 Jan 2022
  • AC1: 'Responses to RC1, RC2 and RC3', Yeonjoo Kim, 13 Apr 2022

Viewed

Total article views: 927 (including HTML, PDF, and XML)
  • HTML: 693
  • PDF: 206
  • XML: 28
  • Total: 927
  • BibTeX: 10
  • EndNote: 7
Views and downloads (calculated since 22 Dec 2021)

Viewed (geographical distribution)

Total article views: 857 (including HTML, PDF, and XML) Thereof 857 with geography defined and 0 with unknown origin.
Latest update: 27 Jun 2022
Short summary
This study aimed to develop a radar-based precipitation nowcasting model using an advanced machine learning technique, the conditional generative adversarial network. The precipitation nowcasting model developed in this study was trained on radar reflectivity maps of the Soyang-gang Dam region in South Korea. We showed that the model can be successfully applied to precipitation nowcasting with longer lead times and that, with a transfer learning approach, it performs well in other regions.