Style transfer for generation of realistically textured subsurface models

Oleg Ovcharenko, Vladimir Kazei, Daniel Peter, Tariq Ali Alkhalifah

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


Abstract

Training datasets consisting of numerous pairs of subsurface models and target variables are essential for building machine learning solutions for geophysical applications. We apply an iterative style transfer approach from image processing to produce realistically textured subsurface models based on synthetic prior models. The key idea of style transfer is that content and texture representations within a convolutional neural network are, to some extent, separable. Thus, a style from one image can be transferred to match the content from another image. We demonstrate examples where realistic random models are stylized to mimic texture patterns from Marmousi II and a section from the BP 2004 benchmark velocity models.
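The separability of content and texture mentioned above rests on the Gram-matrix style representation of Gatys et al. (2015): texture is captured by channel-wise correlations of CNN feature maps, independently of spatial layout, while content is captured by the feature maps themselves. A minimal NumPy sketch of that loss construction (function names, feature-map shapes, and the loss weights `alpha`/`beta` are illustrative assumptions, not the paper's implementation):

```python
import numpy as np

def gram_matrix(features):
    """Gram matrix of a CNN feature map of shape (channels, height, width).

    Channel-wise correlations encode texture ('style') while discarding
    spatial arrangement, which is what makes style transferable."""
    c, h, w = features.shape
    f = features.reshape(c, h * w)
    return f @ f.T / (c * h * w)

def style_content_loss(gen, content, style, alpha=1.0, beta=1e3):
    """Weighted sum of content loss (direct feature mismatch) and style
    loss (Gram-matrix mismatch), minimized iteratively over the generated
    image in a Gatys-style scheme. Here `gen`, `content`, and `style`
    stand in for feature maps extracted from a pretrained CNN."""
    content_loss = np.mean((gen - content) ** 2)
    style_loss = np.mean((gram_matrix(gen) - gram_matrix(style)) ** 2)
    return alpha * content_loss + beta * style_loss
```

Because the Gram matrix sums over spatial positions, shuffling pixels of a feature map leaves the style term unchanged — the property that lets texture from one velocity model (e.g. Marmousi II) be imposed on the structure of another.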
Original language: English (US)
Title of host publication: SEG Technical Program Expanded Abstracts 2019
Publisher: Society of Exploration Geophysicists
Pages: 2393-2397
Number of pages: 5
DOIs
State: Published - Aug 10 2019

Bibliographical note

KAUST Repository Item: Exported on 2021-02-25
Acknowledgements: We thank Kevin Zakka for his implementation of the Gatys et al. (2015) algorithm (https://github.com/kevinzakka/style-transfer). The research reported in this publication was supported by funding from King Abdullah University of Science and Technology (KAUST), Thuwal, 23955-6900, Saudi Arabia.
