Leveraging domain adaptation for efficient seismic denoising

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

The selection of training data for deep learning procedures dictates both a neural network's performance and its applicability to further datasets. In seismic applications in particular, this selection is non-trivial: the common approaches of manually labelling field data or generating synthetic data both exhibit severe limitations. The former cannot outperform the conventional approaches required to generate the labels, and the latter cannot properly represent future application data. Domain adaptation, through transformations of the input features and labels, offers the potential to leverage the benefits of both approaches while reducing their drawbacks. In this work we illustrate how vital information from field data can be incorporated into a training procedure on synthetic data, with the trained network successfully applied to the field data afterwards, despite large differences between the training and inference (i.e., synthetic and field) datasets. Furthermore, we illustrate how an inverse correlation procedure can be incorporated into the training procedure in an attempt to maintain the original wavefield properties.
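The abstract does not specify the exact transformations used, but one common way to incorporate field-data information into synthetic training, consistent with the idea described above, is to extract a noise estimate from field records and superimpose it on clean synthetic patches, so the network trains on field-like inputs with exact synthetic labels. The sketch below illustrates this general idea with NumPy only; the patch construction, the running-mean signal proxy, and all sizes are illustrative assumptions, not the authors' method.

```python
import numpy as np

rng = np.random.default_rng(0)


def make_synthetic_patch(n=64):
    """Toy 'clean' synthetic patch: a few linear seismic events (assumed setup)."""
    patch = np.zeros((n, n))
    for slope, t0 in [(0.5, 5.0), (-0.3, 40.0), (0.2, 20.0)]:
        for x in range(n):
            t = int(t0 + slope * x)
            if 0 <= t < n:
                patch[t, x] = 1.0
    return patch


def extract_field_noise(field_patch, k=5):
    """Crude noise estimate: field data minus a running-mean signal proxy.

    A real workflow would use a more careful signal/noise separation; this
    moving average along the time axis is only a stand-in.
    """
    kernel = np.ones(k) / k
    smooth = np.apply_along_axis(
        lambda trace: np.convolve(trace, kernel, mode="same"), 0, field_patch
    )
    return field_patch - smooth


# Hypothetical field patch (stands in for a real recorded gather).
field = make_synthetic_patch() + rng.normal(0.0, 0.2, (64, 64))

# Domain adaptation of the inputs: corrupt clean synthetics with
# field-derived noise, while keeping the clean synthetics as labels.
clean = make_synthetic_patch()
noise = extract_field_noise(field)
train_input = clean + noise   # network input: synthetic signal + field noise
train_label = clean           # network target: exact, noise-free synthetic
```

Training pairs built this way expose the network to the field noise characteristics during training, which is one route to the cross-domain generalisation the abstract claims.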
Original language: English (US)
Title of host publication: Energy in Data Conference, Austin, Texas, 20–23 February 2022
Publisher: Energy in Data
DOIs
State: Published - May 3 2022

Bibliographical note

KAUST Repository Item: Exported on 2022-05-10
Acknowledgements: The authors thank Prof. M. Ravasi, Dr O. Ovcharenko and Dr H. Wang for insightful discussions, as well as the wider KAUST Seismic Wave Analysis Group. For computer time, this research used the resources of the Supercomputing Laboratory at King Abdullah University of Science & Technology (KAUST) in Thuwal, Saudi Arabia.
