https://doi.org/10.5194/gmd-2022-245
Submitted as: development and technical paper | 03 Jan 2023
Status: this preprint is currently under review for the journal GMD.

ClinoformNet-1.0: stratigraphic forward modeling and deep learning for seismic clinoform delineation

Hui Gao1, Xinming Wu1, Jinyu Zhang2, Xiaoming Sun1, and Zhengfa Bi1
  • 1School of Earth and Space Sciences, University of Science and Technology of China, Hefei 230026, China
  • 2Bureau of Economic Geology, Jackson School of Geosciences, The University of Texas at Austin, Austin, Texas 78758, USA

Abstract. Deep learning has been widely used for many kinds of data mining tasks, but much less for seismic stratigraphic interpretation, mainly due to the lack of labeled training datasets. We present a workflow to automatically generate numerous synthetic training datasets and take seismic clinoform delineation as an example to demonstrate the effectiveness of training with such synthetic data. In this workflow, we first perform stochastic stratigraphic forward modeling to generate numerous stratigraphic models of clinoform layers and corresponding porosity properties by randomly but plausibly choosing initial topographies, sea-level curves, and thermal subsidence curves. We then convert the simulated stratigraphic models into impedance models by using the velocity-porosity relationship. We further simulate synthetic seismic data by convolving reflectivity models (converted from the impedance models) with Ricker wavelets of various peak frequencies and adding real noise extracted from field seismic data. In this way, we automatically generate a total of 3000 diverse synthetic seismic datasets and the corresponding stratigraphic labels, such as relative geologic time models and clinoform facies, which are all made publicly available. We use these synthetic datasets to train a modified encoder-decoder deep neural network for clinoform delineation in seismic data. Within the network, we apply a preconditioning process of structure-oriented smoothing to the feature maps of the decoder layers, which helps avoid holes or outliers in the final clinoform delineation. Multiple 2D and 3D synthetic and field examples demonstrate that the network, trained with only synthetic datasets, delineates clinoforms in seismic data with high accuracy and efficiency. Our workflow can be easily extended to other seismic stratigraphic interpretation tasks such as sequence boundary identification, synchronous horizon extraction, and shoreline trajectory identification.
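For readers who want a concrete picture of the geophysical forward modeling step summarized above, the sketch below shows one way it could be implemented in Python/NumPy. It is not the authors' code: Wyllie's time-average equation stands in for the velocity-porosity relationship (whose exact form is not given in the abstract), the matrix/fluid constants, peak frequency, and signal-to-noise ratio are placeholder values, and band-limited Gaussian noise substitutes for the real noise extracted from field data.

import numpy as np

def porosity_to_impedance(phi, v_matrix=5500.0, v_fluid=1500.0,
                          rho_matrix=2.65, rho_fluid=1.0):
    """Acoustic impedance from porosity.

    Wyllie's time-average equation is used only as a stand-in for the
    velocity-porosity relationship; the matrix/fluid constants are placeholders.
    """
    phi = np.asarray(phi, dtype=float)
    velocity = 1.0 / (phi / v_fluid + (1.0 - phi) / v_matrix)  # m/s
    density = phi * rho_fluid + (1.0 - phi) * rho_matrix       # g/cm^3
    return velocity * density

def ricker(peak_freq, dt, duration=0.128):
    """Zero-phase Ricker wavelet with the given peak frequency (Hz)."""
    t = np.arange(-duration / 2.0, duration / 2.0, dt)
    a = (np.pi * peak_freq * t) ** 2
    return (1.0 - 2.0 * a) * np.exp(-a)

def impedance_to_reflectivity(impedance):
    """Normal-incidence reflection coefficients; time/depth along axis 0."""
    z = np.asarray(impedance, dtype=float)
    r = (z[1:] - z[:-1]) / (z[1:] + z[:-1])
    return np.pad(r, ((0, 1),) + ((0, 0),) * (z.ndim - 1))

def synthesize_seismic(porosity, peak_freq=30.0, dt=0.002,
                       noise=None, snr=10.0, rng=None):
    """Reflectivity convolved with a Ricker wavelet, plus additive noise.

    noise may be a patch of real noise with the same shape as the model;
    if None, band-limited Gaussian noise is used as a stand-in.
    """
    rng = np.random.default_rng() if rng is None else rng
    refl = impedance_to_reflectivity(porosity_to_impedance(porosity))
    wavelet = ricker(peak_freq, dt)

    def convolve(section):
        # convolve every trace (axis 0) with the wavelet
        return np.apply_along_axis(
            lambda trace: np.convolve(trace, wavelet, mode="same"), 0, section)

    seismic = convolve(refl)
    if noise is None:
        noise = convolve(rng.standard_normal(seismic.shape))
    # scale the noise so that std(signal) / std(noise) equals the requested snr
    noise = noise * np.std(seismic) / (np.std(noise) * snr)
    return seismic + noise

Calling synthesize_seismic on each simulated porosity model, drawing peak_freq from a range such as 25-45 Hz (an illustrative range, not the paper's), would mimic the randomization over wavelet peak frequencies described in the abstract.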

Discussion status: open (until 28 Feb 2023)

Comment types: AC – author | RC – referee | CC – community | EC – editor | CEC – chief editor
  • RC1: 'Comment on gmd-2022-245', Xuesong Ding, 17 Jan 2023
  • RC2: 'Comment on gmd-2022-245', Mark Jessell, 23 Jan 2023

Viewed

Total article views: 265 (including HTML, PDF, and XML)
  • HTML: 196
  • PDF: 60
  • XML: 9
  • Total: 265
  • BibTeX: 2
  • EndNote: 3
Views and downloads calculated cumulatively since 03 Jan 2023.

Viewed (geographical distribution)

Total article views: 256 (including HTML, PDF, and XML), of which 256 have a defined geographic origin and 0 are of unknown origin.
Latest update: 28 Jan 2023
Short summary
We propose a workflow to automatically generate numerous synthetic seismic data and corresponding stratigraphic labels (e.g., clinoform facies, relative geologic time, and synchronous horizons) by geological and geophysical forward modeling. Trained with only synthetic datasets, our network accurately and efficiently predicts clinoform facies in multiple 2D and 3D field seismic data. Such a workflow can be easily extended to other geological and geophysical scenarios.
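The abstract also mentions a preconditioning step that applies structure-oriented smoothing to decoder feature maps to suppress holes and outliers. The paper's implementation is not reproduced here; the following is a minimal structure-tensor sketch of the general idea for a single 2D map, with illustrative function names and parameters.

import numpy as np
from scipy.ndimage import gaussian_filter, map_coordinates

def structure_oriented_smooth(feature_map, sigma_grad=1.0, sigma_tensor=4.0, half_len=4):
    """Average a 2D map along its locally dominant orientation.

    A minimal sketch: estimate the local structure direction from a smoothed
    gradient structure tensor, then average samples taken along that direction.
    """
    img = np.asarray(feature_map, dtype=float)
    # gradients (axis 0 = rows, axis 1 = columns)
    gr = gaussian_filter(img, sigma_grad, order=(1, 0))
    gc = gaussian_filter(img, sigma_grad, order=(0, 1))
    # smoothed structure-tensor components
    jrr = gaussian_filter(gr * gr, sigma_tensor)
    jrc = gaussian_filter(gr * gc, sigma_tensor)
    jcc = gaussian_filter(gc * gc, sigma_tensor)
    # angle of the dominant eigenvector (the across-structure, gradient direction),
    # measured from the row axis toward the column axis
    theta = 0.5 * np.arctan2(2.0 * jrc, jrr - jcc)
    along = theta + 0.5 * np.pi  # along-structure direction
    rows, cols = np.indices(img.shape, dtype=float)
    acc = np.zeros_like(img)
    for k in range(-half_len, half_len + 1):
        # sample the map k pixels away along the local structure direction
        r = rows + k * np.cos(along)
        c = cols + k * np.sin(along)
        acc += map_coordinates(img, [r, c], order=1, mode="nearest")
    return acc / (2 * half_len + 1)

Applied channel-wise to a decoder feature map (or to a facies probability map), such smoothing averages values along locally dipping structures rather than across them, which is the behavior the abstract describes for avoiding holes or outliers in the clinoform delineation.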