Deep-Learning Inversion to Efficiently Handle Big-Data Assimilation
Application to Seismic History Matching
Abstract
Seismic history matching can play a key role in geological characterization and uncertainty quantification. However, challenges related to intensive computational demands and complexity restrict its application in many practical cases. This paper presents a conceptual deep-learning-based framework, fully deployed in the popular PyTorch framework, to accelerate seismic history matching. We introduce a surrogate model based on a deep convolutional neural network with a stack of dense blocks, specifically a conditional deep convolutional autoencoder-decoder architecture (cDCAE). The surrogate model allows us to deploy data assimilation algorithms entirely within PyTorch and hence to make full use of efficient computing units, in particular GPUs, for matrix-matrix and matrix-vector multiplications. The built-in automatic differentiation (AD) provided by PyTorch also makes it possible to evaluate gradient information efficiently and in parallel. Furthermore, the framework benefits from the deep-learning practice of using stochastic gradient (SG) optimizers, e.g., Adam, instead of the full-gradient optimizers, e.g., quasi-Newton, that are most common in conventional big-data assimilation. The proposed framework is tested on a benchmark 3D model in the context of petroleum engineering. The surrogate model is shown to accurately predict the quantities of interest, e.g., dynamic saturation maps, for new geological realizations. Assessments demonstrating high surrogate-model accuracy are presented for an ensemble of test models. The robustness and dramatic speedup provided by the surrogate model suggest that it can help facilitate the application of large-scale seismic history matching.
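To make the ingredients named in the abstract concrete, the sketch below shows a minimal conditional convolutional encoder-decoder surrogate in PyTorch, trained with the Adam stochastic-gradient optimizer and using PyTorch's automatic differentiation for the gradients. This is an illustrative toy, not the authors' cDCAE: the network depth, channel counts, tensor shapes, and the synthetic data are placeholder assumptions chosen only to make the example self-contained and runnable.

```python
# Illustrative sketch only: a small conditional convolutional encoder-decoder
# surrogate in PyTorch. All sizes and data are hypothetical stand-ins.
import torch
import torch.nn as nn

class CondConvAutoencoder(nn.Module):
    """Maps a geological realization (e.g., a property map) plus a
    conditioning channel to a predicted quantity of interest such as a
    saturation map."""
    def __init__(self, in_channels=2, out_channels=1, width=16):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(in_channels, width, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(width, 2 * width, 3, stride=2, padding=1), nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(2 * width, width, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(width, out_channels, 4, stride=2, padding=1),
        )

    def forward(self, x, cond):
        # Concatenate the conditioning variable as an extra input channel.
        return self.decoder(self.encoder(torch.cat([x, cond], dim=1)))

# Use a GPU when available, as the abstract emphasizes.
device = "cuda" if torch.cuda.is_available() else "cpu"
model = CondConvAutoencoder().to(device)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# Synthetic stand-in data: 8 realizations of 64x64 property maps with one
# conditioning channel, and matching 64x64 target saturation maps.
x = torch.randn(8, 1, 64, 64, device=device)
cond = torch.randn(8, 1, 64, 64, device=device)
target = torch.rand(8, 1, 64, 64, device=device)

for step in range(5):
    optimizer.zero_grad()
    pred = model(x, cond)
    loss = loss_fn(pred, target)
    loss.backward()   # gradients via PyTorch automatic differentiation (AD)
    optimizer.step()  # stochastic-gradient (Adam) update
```

Because both the surrogate and the assimilation-side computations live in PyTorch tensors, the same `backward()` machinery can supply gradients of a data-mismatch objective with respect to model parameters, which is the property the abstract exploits to replace full-gradient (quasi-Newton) schemes with SG optimizers.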