Sequential Simulations of Mixed Discrete-Continuous Properties: Sequential Gaussian Mixture Simulation
Dario Grana, Tapan Mukerji, Laura Dovera, and Ernesto Della Rossa
Abstract We present here a method for generating realizations of the posterior probability density function of a Gaussian Mixture linear inverse problem in the combined discrete-continuous case. This task is achieved by extending the sequential simulation method to the mixed discrete-continuous problem. The sequential approach allows us to generate a Gaussian Mixture random field that honors the covariance functions of the continuous property and the available observed data. The traditional inverse theory results, well known for the Gaussian case, are first summarized for Gaussian Mixture models: in particular, the analytical expressions for the means, covariance matrices, and weights of the conditional probability density function are derived. However, the computation of the weights of the conditional distribution requires the evaluation of the probability density function values of a multivariate Gaussian distribution at each conditioning point. As an alternative solution of the Bayesian inverse Gaussian Mixture problem, we then introduce the sequential approach to inverse problems and extend it to the Gaussian Mixture case. The Sequential Gaussian Mixture Simulation (SGMixSim) approach is presented as a particular case of the linear inverse Gaussian Mixture problem, where the linear operator is the identity. Similar to the Gaussian case, in Sequential Gaussian Mixture Simulation the means and the covariance matrices of the conditional distribution at a given point correspond to the kriging estimate, component by component, of the mixture. Furthermore, Sequential Gaussian Mixture Simulation can be conditioned by secondary information to account for non-stationarity. Examples of applications
D. Grana · T. Mukerji
Stanford University, 397 Panama Mall, Stanford, CA 94305, USA
e-mail: [email protected]
T. Mukerji
e-mail: [email protected]
L. Dovera · E. Della Rossa
Eni E&P, Via Emilia 1, Milan 20097, Italy
L. Dovera
e-mail: [email protected]
E. Della Rossa
e-mail: [email protected]
with synthetic and real data are presented in the reservoir modeling domain, where realizations of facies distributions and reservoir properties, such as porosity or net-to-gross, are obtained using the Sequential Gaussian Mixture Simulation approach. In these examples, reservoir properties are assumed to be distributed according to a Gaussian Mixture model. In particular, reservoir properties are Gaussian within each facies, and the weights of the mixture are identified with the point-wise probability of the facies.

P. Abrahamsen et al. (eds.), Geostatistics Oslo 2012, Quantitative Geology and Geostatistics 17, DOI 10.1007/978-94-007-4153-9_19, © Springer Science+Business Media Dordrecht 2012
1 Introduction
Inverse problems are common in many different domains such as physics, engineering, and earth sciences. In general, solving an inverse problem consists of estimating the model parameters given a set of observed data. The operator that links the model and the data can be linear or nonlinear. In the linear case, estimation techniques generally provide smoothed solutions. Kriging, for example, provides the best estimate of the model in the least-squares sense. Simple kriging is in fact identical to a linear Gaussian inverse problem in which the linear operator is the identity, with the posterior mean and covariance matrices estimated from direct observations of the model space. Monte Carlo methods can be applied as well to solve inverse problems [12] in a Bayesian framework by sampling from the posterior; but standard sampling methodologies can be inefficient in practical applications. Sequential simulations have been introduced in geostatistics to generate high-resolution models and provide a number of realizations of the posterior probability function honoring both the prior information and the observed values. References [3] and [6] give detailed descriptions of kriging and sequential simulation methods. Reference [8] proposes a methodology that applies sequential simulations to linear Gaussian inverse problems to incorporate the prior information on the model and honor the observed data.

We propose here to extend the approach of [8] to the Gaussian Mixture case. Gaussian Mixture models are convex combinations of Gaussian components that can be used to describe the multi-modal behavior of the model and the data. Reference [14], for instance, introduces Gaussian Mixture distributions in multivariate nonlinear regression modeling, while [10] proposes a mixture discriminant analysis as an extension of linear discriminant analysis by using Gaussian Mixtures and the Expectation-Maximization algorithm [11].
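The equivalence between simple kriging and a linear Gaussian inverse problem with identity operator can be illustrated numerically. The following sketch (not from the paper; the grid size, covariance model, and values are hypothetical) conditions a Gaussian prior on direct observations at two locations and recovers the data at those points, exactly as a simple kriging estimate would:

```python
import numpy as np

# Hypothetical 1-D example: prior m ~ N(mu_m, C_m) on 5 grid locations,
# direct (noisy) observations at two of them, i.e. G built from rows of I.
n = 5
mu_m = np.full(n, 0.20)                 # prior mean (e.g. porosity)
x = np.arange(n, dtype=float)
# exponential covariance with variance 0.01 and range parameter 2
C_m = 0.01 * np.exp(-np.abs(x[:, None] - x[None, :]) / 2.0)

obs_idx = [0, 4]                        # observed locations
G = np.eye(n)[obs_idx]                  # identity rows -> direct observations
C_eps = 1e-6 * np.eye(len(obs_idx))     # small observation error
d = np.array([0.25, 0.15])              # observed values

# Posterior mean and covariance of the linear Gaussian inverse problem;
# K plays the role of the simple kriging weights.
K = C_m @ G.T @ np.linalg.inv(G @ C_m @ G.T + C_eps)
mu_post = mu_m + K @ (d - G @ mu_m)
C_post = C_m - K @ G @ C_m

print(mu_post[obs_idx])   # close to the observed values [0.25, 0.15]
```

The posterior mean honors the data at the conditioning points, while the posterior variance shrinks below the prior variance everywhere, most strongly near the observations.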
Gaussian Mixture models are common in statistics (see, for example, [9] and [2]) and have been used in different domains: digital signal processing [13] and [5], engineering [1], geophysics [7], and reservoir history matching [4]. In this paper we first present the extension of the traditional results valid in the Gaussian case to the Gaussian Mixture case; we then propose the sequential approach to linear inverse problems under the assumption of a Gaussian Mixture distribution; and we finally show some examples of applications in reservoir modeling. If the linear operator is the identity, then the methodology provides an extension of the traditional Sequential Gaussian Simulation (SGSim, see [3] and [6]) to a new methodology that we call Sequential Gaussian Mixture Simulation (SGMixSim). The applications we propose refer to mixed discrete-continuous problems of reservoir modeling and provide, as their main result, sets of models of reservoir facies and porosity. The key point of the application is that we identify the weights of the Gaussian Mixture describing the continuous random variable (porosity) with the probabilities of the reservoir facies (discrete variable).
2 Theory: Linearized Gaussian Mixture Inversion
In this section we provide the main propositions of linear inverse problems with Gaussian Mixtures (GMs). We first recap the well-known analytical result for posterior distributions of linear inverse problems with a Gaussian prior; then we extend the result to the Gaussian Mixture case.

In the Gaussian case, the solution of the linear inverse problem is well known [15]. If $m$ is a Gaussian random vector, $m \sim N(\mu_m, C_m)$, with mean $\mu_m$ and covariance $C_m$, and $G$ is a linear operator that transforms the model $m$ into the observable data $d$,
$$d = Gm + \varepsilon, \tag{1}$$

where $\varepsilon$ is a random vector that represents an error with Gaussian distribution $N(0, C_\varepsilon)$, independent of the model $m$, then the posterior conditional distribution of $m|d$ is Gaussian with mean and covariance given by

$$\mu_{m|d} = \mu_m + C_m G^T \left( G C_m G^T + C_\varepsilon \right)^{-1} (d - G\mu_m), \tag{2}$$
$$C_{m|d} = C_m - C_m G^T \left( G C_m G^T + C_\varepsilon \right)^{-1} G C_m. \tag{3}$$

This result is based on two well-known properties of Gaussian distributions: (A) the linear transform of a Gaussian distribution is again Gaussian; (B) if the joint distribution of $(m, d)$ is Gaussian, then the conditional distribution $m|d$ is again Gaussian.

These two properties can be extended to the Gaussian Mixture case. We assume that $x$ is a random vector distributed according to a Gaussian Mixture with $N_c$ components,

$$f(x) = \sum_{k=1}^{N_c} \pi_k N(x; \mu_x^k, C_x^k),$$

where the $\pi_k$ are the weights and the distributions $N(x; \mu_x^k, C_x^k)$ represent the Gaussian components with means $\mu_x^k$ and covariances $C_x^k$ evaluated in $x$. By applying property (A) to the Gaussian components of the mixture, we can conclude that, if $L$ is a linear operator, then $y = Lx$ is distributed according to a Gaussian Mixture. Moreover, the pdf of $y$ is given by

$$f(y) = \sum_{k=1}^{N_c} \pi_k N(y; L\mu_x^k, L C_x^k L^T).$$

Similarly, we can extend property (B) to conditional Gaussian Mixture distributions. The well-known result for the conditional multivariate Gaussian distribution has already been extended to multivariate Gaussian Mixture models (see, for example, [1]). In particular, if $(x_1, x_2)$ is a random vector whose joint distribution is a Gaussian Mixture
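The two properties above can be combined into a numerical sketch of the Gaussian Mixture posterior: each component is updated with Eqs. (2)-(3), and, as noted in the abstract, the posterior weights require evaluating a multivariate Gaussian density at the datum, namely $\pi_k N(d; G\mu^k, G C^k G^T + C_\varepsilon)$ before normalization. The example below is illustrative only; the two-component mixture, its means, and all numbers are hypothetical, not taken from the paper:

```python
import numpy as np

def gauss_pdf(d, mu, C):
    """Multivariate normal density N(d; mu, C)."""
    k = len(d)
    r = d - mu
    return np.exp(-0.5 * r @ np.linalg.solve(C, r)) / np.sqrt(
        (2.0 * np.pi) ** k * np.linalg.det(C))

def gm_posterior(d, G, C_eps, weights, means, covs):
    """Posterior weights and (mean, cov) of each component of a GM prior
    under the linear model d = G m + eps."""
    comps, new_w = [], []
    for pi_k, mu_k, C_k in zip(weights, means, covs):
        S = G @ C_k @ G.T + C_eps              # data covariance, component k
        K = C_k @ G.T @ np.linalg.inv(S)       # gain, as in Eq. (2)
        comps.append((mu_k + K @ (d - G @ mu_k),   # posterior mean
                      C_k - K @ G @ C_k))          # posterior cov, Eq. (3)
        new_w.append(pi_k * gauss_pdf(d, G @ mu_k, S))  # component evidence
    new_w = np.array(new_w)
    return new_w / new_w.sum(), comps

# two-component 1-D mixture, e.g. two facies with distinct porosity means
w, comps = gm_posterior(
    d=np.array([0.26]),
    G=np.eye(1), C_eps=np.array([[1e-4]]),
    weights=[0.5, 0.5],
    means=[np.array([0.25]), np.array([0.10])],
    covs=[np.array([[0.002]]), np.array([[0.002]])])
print(w)   # the component whose mean is closer to the datum dominates
```

This makes concrete why the weight update is the costly step: it calls for a full multivariate Gaussian density evaluation per component at every conditioning point, which motivates the sequential alternative developed in the paper.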