A Deep-Learning-Based Geological Parameterization Method
for History Matching
Yimin Liu, Wenyue Sun, Louis J. Durlofsky
SESAAI Meeting
March 30, 2018
Outline
q Background
q Deep-learning-based neural style transfer
q Convolutional neural network PCA (CNN-PCA) for geological parameterization
q Parameterization results
q History matching results
q Conclusions and future work
History Matching
History Matching Problem
$\underset{\mathbf{m}}{\text{argmin}}\ \frac{1}{2}\left(\mathbf{d}(\mathbf{m})-\mathbf{d}_{obs}\right)^{T}\mathbf{C}_{D}^{-1}\left(\mathbf{d}(\mathbf{m})-\mathbf{d}_{obs}\right) + \frac{1}{2}\left(\mathbf{m}-\bar{\mathbf{m}}\right)^{T}\mathbf{C}_{M}^{-1}\left(\mathbf{m}-\bar{\mathbf{m}}\right)$
q Decision variable: $\mathbf{m}$ (model parameters)
q Challenges:
§ $\mathbf{m}$ can be high dimensional
§ $\mathbf{m}$ should preserve geology
q Solution: map $\mathbf{m}$ onto a lower-dimensional space
Log permeability
Reparameterization for History Matching
q Map $\mathbf{m}$ to a new variable $\boldsymbol{\xi}$: $\mathbf{m} \rightarrow \hat{\mathbf{m}} = f(\boldsymbol{\xi})$
q Favorable properties:
§ $\dim(\boldsymbol{\xi}) \ll \dim(\mathbf{m})$
§ $\boldsymbol{\xi}$ is uncorrelated
§ $\hat{\mathbf{m}}$ preserves geological realism

$\underset{\boldsymbol{\xi}}{\text{argmin}}\ \frac{1}{2}\left(\mathbf{d}(\boldsymbol{\xi})-\mathbf{d}_{obs}\right)^{T}\mathbf{C}_{D}^{-1}\left(\mathbf{d}(\boldsymbol{\xi})-\mathbf{d}_{obs}\right) + \frac{1}{2}\left(\boldsymbol{\xi}-\bar{\boldsymbol{\xi}}\right)^{T}\mathbf{C}_{\xi}^{-1}\left(\boldsymbol{\xi}-\bar{\boldsymbol{\xi}}\right)$

q Optimization variable: $\boldsymbol{\xi}$
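The reduced-space objective above can be sketched as a plain function of $\boldsymbol{\xi}$. This is a minimal illustration, not the authors' implementation: `forward_model`, the dimensions, and the covariance matrices are stand-ins (a real application would call a reservoir simulator).

```python
import numpy as np

def objective(xi, forward_model, d_obs, C_D_inv, C_xi_inv, xi_prior):
    """Reduced-space history-matching objective:
    1/2 (d(xi)-d_obs)^T C_D^{-1} (d(xi)-d_obs)
    + 1/2 (xi-xi_prior)^T C_xi^{-1} (xi-xi_prior)."""
    r = forward_model(xi) - d_obs
    data_term = 0.5 * r @ C_D_inv @ r
    prior_term = 0.5 * (xi - xi_prior) @ C_xi_inv @ (xi - xi_prior)
    return data_term + prior_term

# Toy check with a linear "simulator" d(xi) = A xi (purely illustrative)
A = np.array([[1.0, 0.0], [0.0, 2.0]])
fm = lambda xi: A @ xi
val = objective(np.zeros(2), fm, np.array([1.0, 0.0]),
                np.eye(2), np.eye(2), np.zeros(2))
```

With these inputs only the data term contributes, giving an objective value of 0.5.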
Principal Component Analysis (PCA)
q PCA: Oliver (1996), Sarma et al. (2006)
§ Generate $N_r$ realizations using a geostatistical algorithm
§ Perform SVD and reduce dimension
§ Generate new realizations:

$\mathbf{Y} = \frac{1}{\sqrt{N_r-1}}\left[\mathbf{m}_1-\bar{\mathbf{m}},\ \mathbf{m}_2-\bar{\mathbf{m}},\ \ldots,\ \mathbf{m}_{N_r}-\bar{\mathbf{m}}\right]$

$\mathbf{Y} = \mathbf{U}\boldsymbol{\Sigma}\mathbf{V}^{T} \approx \mathbf{U}_l\boldsymbol{\Sigma}_l\mathbf{V}_l^{T}$

$\mathbf{m}_{pca} = \mathbf{U}_l\boldsymbol{\Sigma}_l\boldsymbol{\xi}_l + \bar{\mathbf{m}}, \quad \boldsymbol{\xi}_l \sim N(\mathbf{0}, \mathbf{I}), \quad l \ll N_c \ (N_c: \text{number of grid blocks})$
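The three PCA steps above can be sketched in a few lines. This is a minimal sketch: random Gaussian matrices stand in for the SGeMS realizations, and $N_r = 100$ here is illustrative (the deck later uses 1000).

```python
import numpy as np

rng = np.random.default_rng(0)
N_c, N_r, l = 3600, 100, 70          # N_c and l from the slides; N_r illustrative
M = rng.standard_normal((N_c, N_r))  # stand-in for flattened SGeMS realizations

# Centered, scaled data matrix Y and its SVD
m_bar = M.mean(axis=1, keepdims=True)
Y = (M - m_bar) / np.sqrt(N_r - 1)
U, S, Vt = np.linalg.svd(Y, full_matrices=False)

# New realization: m_pca = U_l S_l xi + m_bar, with xi ~ N(0, I_l)
xi = rng.standard_normal(l)
m_pca = U[:, :l] @ (S[:l] * xi) + m_bar[:, 0]
```

Each new draw of `xi` gives a new model realization in the reduced $l$-dimensional space.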
PCA Representation for New Realizations

q Works well when $\mathbf{m}$ follows a Gaussian distribution

(Figure: Gaussian and non-Gaussian cases; SGeMS reference models, $N_c = 3600$, vs. PCA models, $l = 70$)
Optimization-based PCA (O-PCA)

q Optimization-based PCA: Vo and Durlofsky (2014, 2015)
§ Formulate PCA as an optimization problem with regularization
§ Objective: minimize difference to $\mathbf{m}_{pca}$ and to the original histogram
§ Essentially post-process $\mathbf{m}_{pca}$ with a point-wise mapping

$\mathbf{m}_{opca} = \underset{\mathbf{x}}{\text{argmin}}\ \left\|\mathbf{m}_{pca}(\boldsymbol{\xi}_l) - \mathbf{x}\right\|_2^2 + \gamma\,\mathbf{x}^{T}(\mathbf{1}-\mathbf{x}), \quad x_i \in [0, 1]$

(Figure: O-PCA models, $l = 70$; SGeMS models, $N_c = 3600$)
(figures from Vo and Durlofsky, 2014)
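Because the regularizer is separable, the O-PCA mapping acts cell by cell. The sketch below solves the per-cell quadratic in closed form; the branch structure (convex for $\gamma < 1$, bound solutions otherwise) is my reading of the formulation above, not code from the paper.

```python
import numpy as np

def opca_map(m_pca, gamma):
    """Point-wise O-PCA mapping for a binary system: per cell, minimize
    (x - a)^2 + gamma * x * (1 - x) subject to x in [0, 1]."""
    if gamma < 1.0:
        # Convex case: interior minimizer of the quadratic, then clip to bounds
        x = (2.0 * m_pca - gamma) / (2.0 * (1.0 - gamma))
        return np.clip(x, 0.0, 1.0)
    # Concave case: the minimum sits on a bound; pick the closer of {0, 1}
    return (m_pca >= 0.5).astype(float)

a = np.array([-0.2, 0.3, 0.5, 0.9, 1.4])
hard = opca_map(a, gamma=1.0)   # hard 0/1 facies assignment
soft = opca_map(a, gamma=0.5)   # values remain in [0, 1]
```

Larger $\gamma$ pushes cells toward the bounds, which is how the mapping restores a binary histogram.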
Limitations of O-PCA

q Underlying PCA honors only two-point correlations
q O-PCA point-wise mapping honors single-point statistics
q Difficult to preserve multiple-point statistics

(Figure: SGeMS random realization 1; O-PCA random realizations 1 and 2)
Unconditional Realizations
Neural Style Transfer
q Recent work in deep learning and computer vision
q Gatys et al. (2016), Johnson et al. (2016)
q Enables transfer of a photo into an artistic style

(Figure: content image + style image → output image)
(images from Johnson: github.com/jcjohnson/fast-neural-style)
Neural Style Transfer
q Recent work in deep learning and computer vision
q Gatys et al. (2016), Johnson et al. (2016)
q Enables transfer of a photo into an artistic style

Content image → PCA model
Style image → training image
Output image → post-processed model
Neural Style Transfer Algorithm
q O: output image, C: content image, S: style image
q $L_{content}(O, C)$: difference between output and content image
q $L_{style}(O, S)$: difference between output and style image
q Output image preserves:
§ Content (objects) in the content image
§ Style (color, texture) in the style image
§ Objects and texture are characterized by multipoint statistics

$O = \underset{O}{\text{argmin}}\ \{L_{content}(O, C) + \gamma L_{style}(O, S)\}$
(content loss + style loss)
Analogy to O-PCA

q Neural style transfer algorithm:

$O = \underset{O}{\text{argmin}}\ L_{content}(O, C) + \gamma L_{style}(O, S)$ (content loss, style loss)

q O-PCA:

$\mathbf{m}_{opca} = \underset{\mathbf{x}}{\text{argmin}}\ \left\|\mathbf{m}_{pca} - \mathbf{x}\right\|_2^2 + \gamma\,\mathbf{x}^{T}(\mathbf{1}-\mathbf{x})$ (content loss, style loss)

q O-PCA losses are based on point-wise differences
q Neural style transfer is based on high-level features extracted from a deep convolutional neural network (CNN)
Artificial Neural Network

q Nonlinear function $y = f(x)$ with multiple layers of neurons
q Linear function: $\mathbf{W}^{l}\mathbf{h}^{l-1} + \mathbf{b}^{l}$
q Nonlinear activation function $f(\cdot)$, e.g., ReLU, sigmoid

$\mathbf{h}^{l} = f(\mathbf{W}^{l}\mathbf{h}^{l-1} + \mathbf{b}^{l})$

q Convolutional neural networks (CNN):
§ Convolutional layers
§ Suitable for image input
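The layer recursion $\mathbf{h}^{l} = f(\mathbf{W}^{l}\mathbf{h}^{l-1} + \mathbf{b}^{l})$ can be sketched directly. This is a minimal fully connected stack with ReLU activations and illustrative shapes, not the VGG network used later in the talk.

```python
import numpy as np

def relu(x):
    """ReLU activation f(x) = max(0, x)."""
    return np.maximum(0.0, x)

def forward(h, layers):
    """Apply h_l = f(W_l h_{l-1} + b_l) for each (W, b) pair in order."""
    for W, b in layers:
        h = relu(W @ h + b)
    return h

rng = np.random.default_rng(1)
layers = [(rng.standard_normal((8, 4)), np.zeros(8)),   # 4 -> 8
          (rng.standard_normal((3, 8)), np.zeros(3))]   # 8 -> 3
y = forward(rng.standard_normal(4), layers)
```

A convolutional layer follows the same pattern, with the matrix multiply replaced by a convolution that shares weights across spatial positions.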
Neural Style Transfer Algorithm
$L_{content}(O, C) = \sum_{l} \frac{1}{N_l D_l}\left\|F^{l}(O) - F^{l}(C)\right\|_F^2$

$L_{style}(O, S) = \sum_{l} \left\|G^{l}(O) - G^{l}(S)\right\|_F^2$

q Limitations of the neural style transfer algorithm:
§ Need to solve an optimization online
§ Derivatives of the output image w.r.t. input images are not clear
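The two losses can be sketched for a single layer's feature matrix $F^l$ of shape ($N_l$ channels, $D_l$ spatial positions). The Gram normalization follows the definition $G^l = F^l (F^l)^T / (N_l D_l)$ given in the backup slides; any remaining per-layer weights are omitted here.

```python
import numpy as np

def content_loss(F_O, F_C):
    """Normalized squared Frobenius distance between feature matrices."""
    N_l, D_l = F_O.shape
    return np.sum((F_O - F_C) ** 2) / (N_l * D_l)

def gram(F):
    """Gram matrix G = F F^T / (N_l D_l): channel-channel correlations."""
    N_l, D_l = F.shape
    return F @ F.T / (N_l * D_l)

def style_loss(F_O, F_S):
    """Squared Frobenius distance between Gram matrices."""
    return np.sum((gram(F_O) - gram(F_S)) ** 2)

rng = np.random.default_rng(2)
F = rng.standard_normal((16, 64))   # 16 channels, 64 spatial positions
```

Because the Gram matrix discards spatial arrangement, the style loss captures texture statistics rather than object placement, which is exactly what the content loss retains.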
Fast Neural Style Transfer Algorithm
q Train a transform net, Johnson et al. (2016)
§ Hourglass-shaped deep CNN with the same input and output size
q Same loss function; optimize parameters in the transform net
Fast Neural Style Transfer Algorithm

q Construct PCA with 1000 SGeMS realizations
q Train the model transform net on 3000 random PCA models
q Training takes 3 minutes on 1 GPU (NVIDIA Tesla K80)
q Final step: threshold at 0.5
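The key idea, training a transform once offline instead of optimizing each output online, can be shown with a toy. Here the "net" is a single linear map trained by gradient descent on a plain reconstruction loss; the real method uses a deep CNN with the content and style losses, so everything below is an assumption-laden stand-in except the final thresholding step, which is from the slide.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 16
W = rng.standard_normal((n, n)) * 0.01   # toy "transform net" parameters
inputs = rng.standard_normal((200, n))   # stands in for random PCA models

# Offline training: minimize mean ||W x - x||^2 over the training inputs
lr = 0.1
for _ in range(500):
    R = inputs @ W.T - inputs            # residuals, one row per sample
    grad = 2.0 * R.T @ inputs / len(inputs)
    W -= lr * grad

# Online application is now a single forward pass, no optimization
out = inputs @ W.T
# Final step from the slide: threshold at 0.5 to recover a binary facies model
binary = (out >= 0.5).astype(float)
```

After training, generating a new post-processed model costs one matrix multiply, which is the speed advantage of the fast variant.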
Outline
q Background
q Deep-learning-based neural style transfer
q Convolutional neural network PCA (CNN-PCA) for geological parameterization
q Parameterization results
q History matching results
q Conclusions and future work
Unconditional Binary System
q Binary facies model
q Training image size: 250 x 250
q Model size: 60 x 60, $N_c = 3600$
q No hard data
q Goal: low-dimensional representation
q Reduced dimension: $l = 70$

(Figure: training image and one model realization)
Unconditional Binary System
(Figure: PCA, O-PCA, and CNN-PCA realizations 1 and 2)
Conditional Binary System
q Binary facies model
q Training image size: 250 x 250
q Model size: 60 x 60, $N_c = 3600$
q Hard data at 16 well locations
q Reduced dimension: $l = 70$
q 200 new random realizations
q Additional hard-data loss

(Figure: training image and one model realization)
$\underset{O}{\text{argmin}}\ \{L_{content}(O, C) + \gamma_1 L_{style}(O, S) + \gamma_2 L_{data}(O)\}$
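The additional hard-data term penalizes mismatch at the conditioning cells only. A minimal sketch, with the mask indices and values chosen for illustration:

```python
import numpy as np

def hard_data_loss(model, hd_idx, hd_vals):
    """Squared mismatch at hard-data (conditioning) cells only."""
    return np.sum((model[hd_idx] - hd_vals) ** 2)

model = np.zeros(3600)                  # flattened 60 x 60 model
hd_idx = np.array([10, 500, 1200])      # e.g., well-location cells
hd_vals = np.array([1.0, 0.0, 1.0])     # facies values observed at wells
model[hd_idx] = hd_vals                 # a model that honors the hard data
```

Driving this term to zero is what makes all 200 CNN-PCA realizations on the next slide match the well data.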
Conditional Binary System
(Figure: PCA, O-PCA, and CNN-PCA realizations 1 and 2)
q All 200 CNN-PCA models match all hard data
Outline
q Background
q Deep-learning-based neural style transfer
q Convolutional neural network PCA (CNN-PCA) for geological parameterization
q Parameterization results
q History matching results
q Conclusions and future work
History Match Conditional Model

q Oil-water, 60 x 60 grid
q 2 injectors, 2 producers, BHP controlled
q k_sand = 2000 md, k_mud = 0.02 md
q Data: production and injection rates for 1000 days, every 100 days
q Number of data: $N_d = 80$
q Goal: 30 RML posterior models
q Optimizer: PSO-MADS
(Figure: injector and producer well locations)
Permeability Estimation
(Figure: true model; one O-PCA prior model and O-PCA posterior model; one CNN-PCA prior model and CNN-PCA posterior model)
PROD-1 Water Rate
(Figure: O-PCA prior and posterior models; CNN-PCA prior and posterior models)
Infill Well Prediction
infill well locations
q Two infill wells, P3 and P4
q Drilled at 1000 days
q Prediction to 2000 days
Permeability Estimation
(Figure: true model; O-PCA models 1-3; CNN-PCA models 1-3)
Infill Well PROD-4 Prediction
(Figure: O-PCA water rate, 24 curves; CNN-PCA water rate, 13 curves)
Field Prediction
(Figure: O-PCA oil rate, 15 curves; CNN-PCA oil rate, 6 curves)
Conclusions
q Developed CNN-PCA by combining the deep-learning-based neural style transfer algorithm with PCA
q CNN-PCA preserves channel geometry better than O-PCA
q CNN-PCA history matching solutions provide more accurate predictions for infill wells
Future Work
q Extend CNN-PCA to bimodal, three-facies, and three-dimensional reservoir models
q Apply CNN-PCA with gradient-based history matching
q Implement CNN-PCA treatment with ensemble-based history matching methods
Acknowledgments

q Stanford CS 231N course
§ Instructors: Justin Johnson, Serena Yeung, Fei-Fei Li
§ Course project teammate: Yuanlin Wen (Google)
q Open source code:
§ O-PCA: Hai Xuan Vo (Chevron)
§ Neural style transfer (PyTorch): Hang Zhang
§ Fast neural style transfer (PyTorch): Abhishek Kadian
q PSO-MADS: Obi Isebor
q Stanford CEES
Thank You!
Q & A
Backup Slides
Convolutional Layers

q Each neuron connects to a local region in the previous layer
q Convolution: $N_l$ filters $w_l$ sliding over the previous layer
q Convolutional layer with $N_l$ channels:
§ Each channel is called a feature map of $D_l$ neurons
§ All channels form a feature matrix $F^{l} \in \mathbb{R}^{N_l \times D_l}$
Convolutional Neural Network

q Neural network consisting mainly of convolutional layers
q Nonlinear layers:
§ Activation layer, e.g., ReLU $f(x) = \max(0, x)$
§ Pooling layer: downsampling to reduce dimension
q Image classification:
§ Input: image of size $H \times W \times 3$
§ Output: scores for predefined image classes
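The pooling step above can be sketched as a 2x2 max pool with stride 2 on a single-channel feature map (shapes chosen for illustration; real nets apply this per channel):

```python
import numpy as np

def max_pool_2x2(x):
    """2x2 max pooling with stride 2: keep the max of each 2x2 block."""
    H, W = x.shape
    return x.reshape(H // 2, 2, W // 2, 2).max(axis=(1, 3))

x = np.array([[1., 2., 0., 1.],
              [3., 4., 1., 0.],
              [0., 1., 5., 2.],
              [1., 0., 2., 3.]])
y = max_pool_2x2(np.maximum(0.0, x))   # ReLU, then pool: 4x4 -> 2x2
```

Each pooling layer halves both spatial dimensions, which is how deep CNNs build up increasingly coarse, high-level features.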
VGG-16 Deep CNN (Simonyan and Zisserman, 2015)
Feature Matrix and Gram Matrix
q Content representation: feature matrix $F^{l} \in \mathbb{R}^{N_l \times D_l}$
q Style representation: Gram matrix $G^{l} = F^{l}(F^{l})^{T}/(N_l D_l)$
* Images modified from Gatys et al. (2016)
CNN-PCA for Reparameterization

q Construct PCA with $N_r$ SGeMS realizations
q Post-process $\mathbf{m}_{pca}$ with the fast neural style transfer algorithm

$\mathbf{m}_{pca} = \mathbf{U}_l\boldsymbol{\Sigma}_l\boldsymbol{\xi}_l + \bar{\mathbf{m}}, \quad \boldsymbol{\xi}_l \sim N(\mathbf{0}, \mathbf{I})$