Nonlinear parametrization of geological fields using Generative Adversarial Networks (GANs)
Shing Chan (PhD student), Ahmed H. Elsheikh
School of Energy, Geoscience, Infrastructure and Society, Heriot-Watt University, United Kingdom.
MASCOT-NUM 2019 annual conference, IFPEN Rueil-Malmaison, France,
18–20 March 2019
Ahmed Elsheikh (HWU, UK) parametrization using GANs 20 March, 2019 1 / 46
Outline
1 Introduction and background
2 The unreasonable effectiveness of deep neural networks
3 Generative adversarial networks
4 Numerical evaluation for subsurface flow problems
5 Conclusions and outlook
1 Introduction and background
Uncertainty propagation for subsurface reservoir models
Subsurface reservoir models and reservoir management:
Computationally expensive models
Predictive modeling for decision support – challenges
Monte-Carlo approach for uncertainty propagation
Background: Uncertainty Quantification
[Diagram: model settings sampled and propagated through the forward model.]
Forward model – Two-phase porous media flow
Combining mass conservation and Darcy's law:

  −∇ · (K λ_t(S_w) ∇p) = q    (pressure equation)

Water saturation equation (with S_o + S_w = 1):

  φ ∂S_w/∂t + ∇ · (f(S_w) v_t) = Q_w/ρ_w    (saturation equation)

  λ_w(S_w) = S_{nw}^2 / μ_w,   λ_o(S_w) = (1 − S_{nw})^2 / μ_o,   S_{nw} = (S_w − S_{wc}) / (1 − S_{or} − S_{wc})

where:
S_{wc}, S_{or} are the irreducible saturations; μ_w, μ_o are the fluid viscosities; ρ_w, ρ_o are the fluid densities;
f(S_w) = λ_w/λ_t is the fractional flow function; K is the permeability tensor; φ is the porosity;
p = p_o = p_w is the pressure; q = Q_o/ρ_o + Q_w/ρ_w is the normalized source or sink term;
λ_t(S_w) = λ_w(S_w) + λ_o(S_w) is the total mobility
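The mobility and fractional-flow formulas above can be sketched directly in code; the Corey-type parameter values (S_wc, S_or, μ_w, μ_o) below are illustrative choices, not values from the talk.

```python
import numpy as np

def mobilities(Sw, Swc=0.2, Sor=0.2, mu_w=1.0, mu_o=5.0):
    """Relative mobilities from the slide's formulas (parameter values illustrative)."""
    Snw = np.clip((Sw - Swc) / (1.0 - Sor - Swc), 0.0, 1.0)  # normalized saturation
    lam_w = Snw ** 2 / mu_w                                  # water mobility
    lam_o = (1.0 - Snw) ** 2 / mu_o                          # oil mobility
    return lam_w, lam_o

def fractional_flow(Sw, **kwargs):
    """f(Sw) = lambda_w / lambda_t: the fraction of total flow carried by water."""
    lam_w, lam_o = mobilities(Sw, **kwargs)
    return lam_w / (lam_w + lam_o)

Sw = np.linspace(0.2, 0.8, 7)
f = fractional_flow(Sw)  # monotone S-shaped curve from 0 (at Swc) to 1 (at 1 - Sor)
```

The fractional flow is 0 at the irreducible water saturation and 1 at the residual oil saturation, and increases monotonically in between.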
How to efficiently solve UQ, IUQ and robust optimization problems?
Learn an efficient emulator I
Regression models: commonly one starts with dimension reduction, then builds a regressor (non-intrusive PC, GPR, NN, RF)
  - Elsheikh, Ahmed H; Hoteit, I; Wheeler, Mary F; Efficient Bayesian inference of subsurface flow models using nested sampling and sparse polynomial chaos surrogates, CMAME 2014.
Learn a map from low-fidelity models (fast to run) to high-fidelity models (slow to run)
  - Josset, Laureline; Demyanov, Vasily; Elsheikh, Ahmed H; Lunati, Ivan; Accelerating Monte Carlo Markov chains with proxy and error models, Computers and Geosciences 2015.
  - Kopke, Corrina; Irving, James; Elsheikh, Ahmed H; Accounting for model error in Bayesian solutions to hydrogeophysical inverse problems using a local basis approach, ADWR 2018.
Learn an efficient emulator II
Learn to upscale, e.g. exploit locality properties of some multi-scale methods
  - Chan, Shing; Elsheikh, Ahmed H; A machine learning approach for efficient uncertainty quantification using multiscale methods, JCP 2018.
Learn reduced order models, i.e. simplified dynamical systems using global basis functions (e.g. POD, DEIM, etc.)
  - Kani, Nagoor J; Elsheikh, Ahmed H; Reduced-order modeling of subsurface multi-phase flow models using deep residual recurrent neural networks, TIPM 2019.
Learn a compact representation of stochastic fields
Parameterization using PCA
Given a set of realizations y_1, y_2, ..., y_N (y_i ∈ R^M), let Y = [y_1; y_2; ...; y_N] and C = (1/N) Y Y^T (the covariance matrix).
The PCA parametrization is:

  y = U Λ^{1/2} ξ = ξ_1 √λ_1 u_1 + ... + ξ_M √λ_M u_M

where:
U = [u_1; ...; u_M] is the matrix of eigenvectors of C
Λ = diag(λ_1, ..., λ_M) is the diagonal matrix of eigenvalues of C
ξ = (ξ_1, ..., ξ_M) is a noise vector
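The PCA construction above can be sketched in a few lines. The data here are random stand-ins and the sizes N, M are illustrative; a production implementation would also subtract the sample mean, which the slide's formula omits.

```python
import numpy as np

rng = np.random.default_rng(0)
N, M = 200, 64                       # illustrative sizes: N realizations in R^M
Y = rng.standard_normal((M, N))      # stand-in data; columns are realizations y_1..y_N

C = (Y @ Y.T) / N                    # C = (1/N) Y Y^T, the covariance matrix
lam, U = np.linalg.eigh(C)           # eigenpairs of C (eigh returns ascending order)
lam, U = lam[::-1], U[:, ::-1]       # reorder so lambda_1 >= lambda_2 >= ...

def pca_sample(m, rng):
    """Draw y = xi_1 sqrt(lam_1) u_1 + ... + xi_m sqrt(lam_m) u_m, truncated to m modes."""
    xi = rng.standard_normal(m)                          # noise vector
    return U[:, :m] @ (np.sqrt(np.maximum(lam[:m], 0.0)) * xi)

y_new = pca_sample(10, rng)          # a new "realization" from the leading 10 modes
```

Truncating to the leading m modes is exactly the low-dimensional parametrization y = G(z) with z = ξ discussed next.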
Parameterization using PCA
First 8 eigen modes of the search space
Parametrization – General form
Let y ∈ R^M be the random vector representing an unknown, and z ∈ R^m, z ∼ p_z, a noise vector with a known distribution p_z (e.g. uniform, normal, etc.). We want:

A functional relationship y = G(z)
m ≪ M
G fast to evaluate
G differentiable w.r.t. z

Standard parametrization approaches:

G(z) := Az + b (e.g. PCA)
G(z) := Φ^{-1}(Az + b) (e.g. kPCA)
PCA is so great: do we need other parametrization techniques?
MultiPoint Geostatistics representation
(a) Conceptual images
Geological model parametrization
[Workflow diagram: well data, analogs, TIs, etc. → MPS → realizations 1, 2, ..., N → build G;
then z ∼ p_z (low dimension) → G(z) (high dimension) → "synthetic" samples.]
2 The unreasonable effectiveness of deep neural networks
Generation using Convolutional Neural Networks (CNN)
G(z) := f_n(f_{n−1}(··· (f_1(z)))), where f_l(x) = σ_l(W_l x + b_l)

Figure from "Unsupervised representation learning with deep convolutional generative adversarial networks", Radford et al., 2016
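The layer composition G(z) = f_n(··· f_1(z)) can be sketched with fully connected layers; the actual model (DCGAN-style, as in the figure) uses transposed convolutions, and all layer sizes and activations here are illustrative.

```python
import numpy as np

def relu(s):
    return np.maximum(s, 0.0)

def make_layers(sizes, rng):
    """Random weights (W_l, b_l) for each layer f_l(x) = sigma_l(W_l x + b_l)."""
    return [(rng.standard_normal((n, m)) * np.sqrt(2.0 / m), np.zeros(n))
            for m, n in zip(sizes[:-1], sizes[1:])]

def G(layers, z):
    """G(z) = f_n(f_{n-1}(... f_1(z))): tanh on the last layer, ReLU elsewhere."""
    x = z
    for i, (W, b) in enumerate(layers):
        s = W @ x + b
        x = np.tanh(s) if i == len(layers) - 1 else relu(s)
    return x

rng = np.random.default_rng(0)
layers = make_layers([30, 128, 512, 64 * 64], rng)   # z in R^30 -> a 64x64 "field"
y = G(layers, rng.standard_normal(30)).reshape(64, 64)
```

The final tanh bounds the output field to [−1, 1], matching the usual normalization of training images.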
Likelihood as a classification problem (another CNN)
D(y) := f_n(f_{n−1}(··· (f_1(y)))), where f_l(x) = σ_l(U_l x + c_l)
NN concepts: Convolutional Neural Networks
f_l(x) = σ_l(W_l x + b_l)

(a) A fully connected layer (inputs u_1..u_4, outputs v_1..v_3): every weight w_ij is free:

  W = [ w11 w12 w13 w14
        w21 w22 w23 w24
        w31 w32 w33 w34 ]

(b) A convolutional layer: the two kernel weights w_1, w_2 are shared across positions:

  W = [ w1 w2  0  0
         0 w1 w2  0
         0  0 w1 w2 ]
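The equivalence between panel (b) and an ordinary convolution can be checked numerically; the kernel values w1, w2 and inputs below are arbitrary.

```python
import numpy as np

u = np.array([1.0, 2.0, 3.0, 4.0])      # inputs u1..u4
w1, w2 = 0.5, -1.0                      # the two shared kernel weights

# The banded, weight-sharing matrix from panel (b):
W = np.array([[w1, w2, 0.0, 0.0],
              [0.0, w1, w2, 0.0],
              [0.0, 0.0, w1, w2]])

v_matrix = W @ u                                     # matrix view of the layer
v_conv = np.convolve(u, [w2, w1], mode="valid")      # same outputs as a convolution
```

Note that np.convolve flips its kernel, hence the reversed order [w2, w1]; both views give the same three outputs.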
NN concepts: Loss function for classification
In binary classification, where the number of classes equals 2, the mean square error can simply be defined as:

  (1/n) Σ_{i=1}^{n} (y_i − p_i)^2

  - y_i is a binary indicator (0 or 1) depending on the class label c
  - p_i is the predicted probability that observation i is of class c

Another, smoother loss function is the cross-entropy loss:

  − Σ_{i=1}^{n} y_i log(p_i)

  - log is the natural log
NN concepts: Cross-entropy loss function
In binary classification, where the number of classes equals 2, the cross-entropy loss for a single observation can be calculated as:

  −( y log(p) + (1 − y) log(1 − p) )

where:
  - log is the natural log
  - y is a binary indicator (0 or 1) depending on the class label c
  - p is the predicted probability that the observation is of class c
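Both losses can be sketched as follows; unlike the slide's sums, these return the mean over the batch, and the example labels and probabilities are illustrative.

```python
import numpy as np

def mse(y, p):
    """Mean square error (1/n) sum (y_i - p_i)^2."""
    y, p = np.asarray(y, float), np.asarray(p, float)
    return np.mean((y - p) ** 2)

def binary_cross_entropy(y, p, eps=1e-12):
    """-mean( y log p + (1 - y) log(1 - p) ), clipping p to avoid log(0)."""
    y, p = np.asarray(y, float), np.asarray(p, float)
    p = np.clip(p, eps, 1.0 - eps)
    return -np.mean(y * np.log(p) + (1.0 - y) * np.log(1.0 - p))

y = np.array([1, 0, 1, 1])          # true labels
p = np.array([0.9, 0.2, 0.8, 0.6])  # predicted probabilities of class 1
```

Cross-entropy penalizes confident wrong predictions much more sharply than MSE, which is one reason it is preferred for training classifiers such as the discriminator.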
3 Generative adversarial networks
Generative adversarial networks (GAN)
A non-cooperative game between two players, the generator G and the discriminator D
[Diagram: noise z ∼ p_z feeds the generator G(z); the discriminator D(y) receives both generated samples and samples from the training dataset, outputs "real or fake", and its feedback is used to train G.]
Solving the minmax game
Let D_ψ : Y → [0, 1] be the discriminator network parametrized by weights ψ to be determined. The training of the generator and discriminator uses the following loss function:

  L(ψ, θ) := E_{y∼P_y} [log D_ψ(y)] + E_{ŷ∼P_θ} [log(1 − D_ψ(ŷ))]    (1)

where ŷ = G_θ(z) ∼ P_θ. In effect, this loss is the classification score of the discriminator; therefore we train D_ψ to maximize L, and G_θ to minimize L:

  min_θ max_ψ L(ψ, θ)    (2)
Solving the minmax game
We know how to solve a min problem and a max problem, so we alternate between:

Step A (update the discriminator, generator fixed):

  max_ψ { E_{y∼P_y} [log D_ψ(y)] + E_{ŷ∼P_θ} [log(1 − D_ψ(ŷ))] }

Step B (update the generator, discriminator fixed):

  min_θ { E_{y∼P_y} [log D_ψ(y)] + E_{ŷ∼P_θ} [log(1 − D_ψ(ŷ))] }
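Steps A and B can be illustrated with a deliberately tiny one-dimensional model; the affine generator, logistic discriminator, learning rate, and target distribution N(3, 0.5^2) are all toy choices, with the gradients written out by hand rather than by autodiff.

```python
import numpy as np

def sigmoid(s):
    return 1.0 / (1.0 + np.exp(-s))

rng = np.random.default_rng(1)
a, b = 1.0, 0.0          # generator G(z) = a*z + b          (toy model)
w, c = 0.1, 0.0          # discriminator D(y) = sigmoid(w*y + c)
lr = 0.05

for _ in range(2000):
    z = rng.standard_normal()
    y_real = 3.0 + 0.5 * rng.standard_normal()   # "data" sample from N(3, 0.5^2)
    y_fake = a * z + b

    # Step A: gradient ASCENT on log D(y_real) + log(1 - D(y_fake)) w.r.t. (w, c)
    d_real, d_fake = sigmoid(w * y_real + c), sigmoid(w * y_fake + c)
    w += lr * ((1.0 - d_real) * y_real - d_fake * y_fake)
    c += lr * ((1.0 - d_real) - d_fake)

    # Step B: gradient DESCENT on log(1 - D(G(z))) w.r.t. (a, b)
    d_fake = sigmoid(w * (a * z + b) + c)
    a += lr * d_fake * w * z    # since -d/da log(1 - D) = D * w * z
    b += lr * d_fake * w        # since -d/db log(1 - D) = D * w
```

Even in this toy setting the dynamics can oscillate rather than converge, which previews the stability issues discussed on the next slide.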
Wasserstein GAN
Optimization of the GAN minmax game is unstable (the objective function is not monotonically decreasing)

The Wasserstein formulation of GAN (WGAN) optimizes a different loss function,

  L(ψ, θ) := E_{y∼P_y} [D_ψ(y)] − E_{ŷ∼P_θ} [D_ψ(ŷ)]    (3)

with a constraint on the search space of D_ψ. The training goal is to solve the following minmax problem:

  min_θ max_{ψ : D_ψ ∈ D} L(ψ, θ)    (4)

where now D_ψ : Y → R and D is the set of 1-Lipschitz functions (loosely enforced by constraining the weights ψ to a compact space, e.g. by clipping the values of the weights to an interval [−c, c]).
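The WGAN objective (3) and the weight-clipping constraint can be sketched in a few lines; the clipping threshold c = 0.01 is the common WGAN default, and the sample critic outputs are illustrative.

```python
import numpy as np

def clip_weights(weights, c=0.01):
    """Weight clipping: confine every weight to [-c, c] (loose 1-Lipschitz constraint)."""
    return [np.clip(W, -c, c) for W in weights]

def wgan_critic_objective(d_real, d_fake):
    """L = E[D(y)] - E[D(y_hat)]: the critic ascends this, the generator ascends E[D(y_hat)]."""
    return np.mean(d_real) - np.mean(d_fake)

weights = [np.array([[0.5, -2.0], [0.003, 1.0]])]
clipped = clip_weights(weights, c=0.01)   # every entry now lies in [-0.01, 0.01]
L = wgan_critic_objective(np.array([1.0, 3.0]), np.array([0.0, 2.0]))
```

Clipping is applied after every critic update; unlike the standard GAN loss, the resulting objective correlates with sample quality, as the next figure shows.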
Dataset
(a) Semi-straight channels
(b) Meandering channels
(c) Conceptual images
Generated realizations: visual comparison
(a) Original realizations
(b) Realizations generated using GAN
(c) Realizations generated using PCA
Generated realizations: visual comparison
(a) Original realizations
(b) Realizations generated using GAN
(c) Realizations generated using PCA
Generated realizations: permeability histogram
[Log-permeability histograms for Data, GAN, and PCA realizations: (a) Semi-straight pattern; (b) Meandering pattern. Data and GAN values lie in [0, 1], while PCA values spread over roughly [−2, 2].]
Practical advantages of WGAN – Stability
[Plots: WGAN loss vs. iteration (top) and standard GAN loss vs. iteration (bottom), over 30,000 iterations, with numbered sample snapshots taken along training.]
Convergence curves of a WGAN model (top) and a standard GAN model (bottom). On the right, we show samples along the training of the corresponding models. We see that GAN loss is uninformative regarding sample quality.
4 Numerical evaluation for subsurface flow problems
Forward uncertainty propagation study
Water injection in oil-filled reservoir
Quarter-five spot problem
Using 5,000 permeability realizations, estimate:
saturation statistics
water breakthrough times
Saturation statistics: mean, variance, skewness, kurtosis
[Maps of the saturation mean, variance, skewness, and kurtosis fields, shown on three consecutive slides, each comparing (a) statistics based on original realizations, (b) statistics based on GAN realizations, and (c) statistics based on PCA realizations.]
Saturation histogram
[Saturation histograms for Data, GAN, and PCA realizations. (a) Semi-straight pattern; (b) Meandering pattern.]
Water breakthrough times
[Density of water breakthrough time (in PVI, range 0.30–0.70) estimated from Data, GAN, and PCA realizations.]
Water breakthrough times
[Density of water breakthrough time (in PVI, range 0.30–0.70) estimated from Data, GAN, and PCA realizations, for the second test case.]
5 Conclusions and outlook
Conclusions
Positive findings
WGANs generate visually plausible realizations

Generated realizations preserve the flow statistics

WGAN training is far more stable than standard GAN training

Introduced a powerful point-based conditioning by learning an inference network (arXiv:1807.05207v1)

Warnings!

Finding the equilibrium of the min-max game is challenging (see arXiv preprint arXiv:1809.07748)

Fast-evolving field; new methods are proposed every day (GLO, VAE-GAN, etc.)
Thank you
Main References
Ian Goodfellow, Jean Pouget-Abadie, Mehdi Mirza, Bing Xu, David Warde-Farley, Sherjil Ozair, Aaron Courville, and Yoshua Bengio. Generative adversarial nets. In Advances in neural information processing systems, pages 2672–2680, 2014.
Martin Arjovsky, Soumith Chintala, and Leon Bottou. Wasserstein GAN.arXiv preprint arXiv:1701.07875, 2017.
Shing Chan and Ahmed H Elsheikh. Parametrization and generation of geological models with generative adversarial networks. arXiv preprint arXiv:1708.01810, 2017.

Shing Chan and Ahmed H Elsheikh. Parametric generation of conditional geological realizations using generative neural networks. arXiv preprint arXiv:1807.05207v1, 2018.

Shing Chan and Ahmed H Elsheikh. Exemplar-based synthesis of geology using kernel discrepancies and generative neural networks. arXiv preprint arXiv:1809.07748, 2018.
Main References (continued)
Lukas Mosser, Olivier Dubrule, and Martin J Blunt. Reconstruction of three-dimensional porous media using generative adversarial neural networks. arXiv preprint arXiv:1704.03225, 2017.

Lukas Mosser, Olivier Dubrule, and Martin J Blunt. Conditioning of three-dimensional generative adversarial networks for pore and reservoir-scale models. arXiv preprint arXiv:1802.05622, 2018.

Eric Laloy, Romain Herault, Diederik Jacques, and Niklas Linde. Efficient training-image based geostatistical simulation and inversion using a spatial generative adversarial neural network. arXiv preprint arXiv:1708.04975, 2017.

Emilien Dupont, Tuanfeng Zhang, Peter Tilke, Lin Liang, and William Bailey. Generating realistic geology conditioned on physical measurements with generative adversarial networks. arXiv preprint arXiv:1802.03065, 2018.