Uncertainty quantification in linear inverse problems with dimension reduction

Leandro Passos de Figueiredo, LTrace and UFSC; Dario Grana, UW; Leonardo Azevedo, IST; Mauro Roisenberg, UFSC; Bruno B. Rodrigues, PETROBRAS S/A - CENPES/PDGP/CMR

Copyright 2019, SBGf - Sociedade Brasileira de Geofísica. This paper was prepared for presentation at the 16th International Congress of the Brazilian Geophysical Society, held in Rio de Janeiro, Brazil, August 19-22, 2019. Contents of this paper were reviewed by the Technical Committee of the 16th International Congress of the Brazilian Geophysical Society and do not necessarily represent any position of the SBGf, its officers or members. Electronic reproduction or storage of any part of this paper for commercial purposes without the written consent of the Brazilian Geophysical Society is prohibited.

Abstract

Dimension reduction is a process used to simplify high-dimensional inverse problems in geophysics. In this work, we analyze the impact of model and data reduction on the uncertainty quantification of the inversion results. In the Bayesian approach to inverse problems, the solution is given by the posterior distribution and the uncertainty is interpreted as the posterior variance. Under the Gaussian assumption for the noise and the prior distribution, and if the forward model and the reduction operator are linear, we obtain the posterior distribution analytically. We use the analytical result to compute the solution of the acoustic inversion problem with different reduction levels of both data and model parameters. The method allows us to analyze the impact of linear reduction on the uncertainty quantification of linear inverse problems.

Introduction

Several geophysical applications can be cast as inverse problems, such as seismic inversion (Sen, 2006), tomography (Nolet, 1987) and inversion of CSEM data (Gunning et al., 2010). In all of these applications, we aim to estimate the subsurface properties (model parameters) from geophysical measurements (observed data) and theoretical relations (forward model) between the observed and the unobserved parameters (Tarantola, 2005).

The common techniques to solve an inverse problem include stochastic optimization methods, where the best subsurface model is sought by minimizing an error/misfit function (Sen, 2006; Azevedo et al., 2015; Bordignon et al., 2017), and Bayesian sampling algorithms, where the solution of the inverse problem is the posterior distribution of the subsurface properties (Buland and Omre, 2003; Tarantola, 2005; Bosch et al., 2010). In the Bayesian approach, the uncertainty is interpreted as the posterior variance, which is an inherent part of the solution (Gelman et al., 2004; Tarantola, 2005; Doyen, 2007). Although Bayesian algorithms might be inefficient for general inverse problems, they have been successfully applied to geophysical problems during the last decades (Gunning and Glinsky, 2004; Bosch et al., 2010; Grana and Della Rossa, 2010; Grana et al., 2017; de Figueiredo et al., 2018b). In the particular case where the forward model is linear and under the Gaussian assumption for the prior and the noise, the posterior distribution is obtained analytically, leading to very fast algorithms (Buland and Omre, 2003; de Figueiredo et al., 2014).

Dimension reduction, or compression, is a mathematical process used to reduce the number of variables of a problem (Roweis and Saul, 2000; Tompkins et al., 2011). In inverse problems, the reduction can be applied to both the model parameters and the observed data. In methods based on the ensemble Kalman filter, the reduction of the data can improve the efficiency of the algorithm. On the other hand, the efficiency of stochastic optimization methods depends on the dimension of the model parameters (Marzouk and Najm, 2009; Lieberman et al., 2010).

With the objective of analyzing the impact of data/model reduction on the uncertainty of the inversion results, we discuss a general linear inverse problem assuming a linear operator for dimension reduction. With these assumptions, the posterior can be treated analytically by applying results from multivariate statistics (Anderson, 1984). In the application section, we discuss the methodology applied to acoustic seismic inversion, where the forward model can be linearized.

Methodology

Bayesian linear inversion

Assuming a general linear inverse problem, the objective is to estimate the model parameters m from the linear forward operator G and the observed data d:

d = Gm + e_d,    (1)

where e_d represents the seismic noise and modeling errors.

In the Bayesian approach to inverse problems, the solution is given by the posterior distribution

p(m|d) \propto p(d|m) p(m),    (2)

where p(d|m) is the likelihood distribution (relating the model parameters to the observed data) and p(m) is the prior distribution (relating the model parameters to prior information).

If the noise e_d of Equation 1 is normally distributed with zero mean, we obtain the following multivariate Gaussian likelihood distribution:

p(d|m) = (2\pi)^{-n_d/2} |\Sigma_d|^{-1/2} \exp\left( -\tfrac{1}{2} (d - Gm)^T \Sigma_d^{-1} (d - Gm) \right) \equiv N(d; Gm, \Sigma_d),    (3)

where \Sigma_d is the covariance matrix of the seismic noise (Buland and Omre, 2003) and n_d is the dimension of d.

Assuming prior knowledge about the model parameters, such as a background model, we can also propose a Gaussian prior distribution:

p(m) = N(m; \mu_m, \Sigma_m),    (4)

where \mu_m is the background model and \Sigma_m is the prior covariance matrix, which can include spatial correlations (Buland and Omre, 2003; de Figueiredo et al., 2018a).

Based on the statistical model presented in Equations 3 and 4 and applying standard results of multivariate statistical analysis (Anderson, 1984), we can analytically calculate the posterior distribution (Buland and Omre, 2003) of the Bayesian linear inversion (BLI):

p(m|d) = N(m; \mu_{m|d}, \Sigma_{m|d}),    (5)

where the expected value \mu_{m|d} and the covariance matrix \Sigma_{m|d} are

\mu_{m|d} = \mu_m + \Sigma_m G^T (G \Sigma_m G^T + \Sigma_d)^{-1} (d - G \mu_m)    (6)

and

\Sigma_{m|d} = \Sigma_m - \Sigma_m G^T (G \Sigma_m G^T + \Sigma_d)^{-1} G \Sigma_m.    (7)

Linear dimension reduction

Principal component analysis (PCA) is a mathematical method that obtains a set of uncorrelated variables from observed correlated variables while maintaining the full observed variance (Pearson, 1901). PCA can be applied to the covariance matrix to reduce the dimension of the problem by selecting only the first components, which capture most of the variance.

For example, given a set of observations of the (n \times 1)-vector x with mean \mu and (n \times n)-covariance matrix \Sigma, we can apply the eigendecomposition theorem to \Sigma:

\Sigma = V \Lambda V^T,    (8)

where V is the matrix of orthogonal eigenvectors (principal components) and \Lambda is the diagonal matrix with the respective eigenvalues (variances).

By arranging the eigenvalues in decreasing order, we can choose a subset of \tilde{n} eigenvectors (with \tilde{n} \leq n) that represents a fraction f_v of the total variance:

\sum_{i=1}^{\tilde{n}} \Lambda_{i,i} \Big/ \sum_{j=1}^{n} \Lambda_{j,j} \geq f_v.    (9)

Then, any observation x can be approximated by a linear combination of the \tilde{n} principal eigenvectors:

x = \mu + \tilde{V} \tilde{x},    (10)

where \tilde{V} is the (n \times \tilde{n})-matrix with the subset of \tilde{n} principal eigenvectors (Tompkins et al., 2011) and \tilde{x} is the (\tilde{n} \times 1)-vector that represents the sample x in a reduced linear space.

To find the coefficients \tilde{x} that represent a given sample x in the reduced space, we can use the generalized inverse of matrices:

\tilde{x} = \tilde{V}^{-1} (x - \mu),    (11)

where \tilde{V}^{-1} = (\tilde{V}^T \tilde{V})^{-1} \tilde{V}^T is the generalized inverse of the non-square operator \tilde{V}. Equations 10 and 11 define the linear operators to transform and back-transform from/to the reduced space. The scheme in Figure 1 illustrates the discussed framework for linear dimension reduction.

Figure 1: Framework for linear dimension reduction based on PCA.

Dimension reduction in inverse problems

• Data compression

Using the presented framework for data reduction, we obtain

\tilde{d} = \tilde{V}_d^{-1} (d - \mu_d),    (12)

where \tilde{V}_d is the matrix of the principal eigenvectors for a given fraction of total variance f_v, defined according to Equations 8 and 9.

By applying Equation 12 to Equation 1 of a linear inverse problem, we obtain

\tilde{d} + \tilde{V}_d^{-1} \mu_d = H m + e_{\tilde{d}},    (13)

where H = \tilde{V}_d^{-1} G and e_{\tilde{d}} = \tilde{V}_d^{-1} e_d is the reduced noise vector with covariance \Sigma_{\tilde{d}} = \tilde{V}_d^{-1} \Sigma_d (\tilde{V}_d^{-1})^T.

Because all the operators are linear, we can derive the multivariate Gaussian posterior distribution by using the same results of multivariate statistical analysis cited in the section on Bayesian linear inversion (Equations 5-7):

p(m|\tilde{d}) = N(m; \mu_{m|\tilde{d}}, \Sigma_{m|\tilde{d}}),    (14)

where the expected value \mu_{m|\tilde{d}} and the covariance matrix \Sigma_{m|\tilde{d}} are

\mu_{m|\tilde{d}} = \mu_m + \Sigma_m H^T (H \Sigma_m H^T + \Sigma_{\tilde{d}})^{-1} (\tilde{d} + \tilde{V}_d^{-1} \mu_d - H \mu_m)    (15)

and

\Sigma_{m|\tilde{d}} = \Sigma_m - \Sigma_m H^T (H \Sigma_m H^T + \Sigma_{\tilde{d}})^{-1} H \Sigma_m.    (16)

Equation 14 is the solution of the inverse problem for the model parameters m given a dimension reduction of the observed data d.

• Model compression

Applying Equation 10 to the model parameters, we obtain

m = \mu_m + \tilde{V}_m \tilde{m},    (17)

where \tilde{V}_m is the matrix of the principal eigenvectors for a given fraction of total variance f_v, defined according to Equations 8 and 9.
From Equations 1 and 17, we can write the forward model as

d - G \mu_m = F \tilde{m} + e_d,    (18)

where F = G \tilde{V}_m and \tilde{m} is the reduced model parameter with Gaussian prior distribution

p(\tilde{m}) = N(\tilde{m}; 0, \Sigma_{\tilde{m}}),    (19)

where the covariance \Sigma_{\tilde{m}} is a diagonal matrix with the principal eigenvalues of the prior covariance \Sigma_m. Given Equations 18 and 19, we can derive the posterior distribution of the reduced model parameter \tilde{m} by using the standard results of multivariate statistical analysis (Equations 5-7).
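The full-dimension posterior of Equations 5-7 and the reduced-data posterior of Equations 12-16 can be checked numerically. The Python sketch below is illustrative and not part of the paper: the exponential prior correlation, the Gaussian smoothing operator G, the noise level and the 99% variance fraction f_v are all assumptions of this example.

```python
import numpy as np

rng = np.random.default_rng(0)
n_m, n_d = 50, 50
idx = np.arange(n_m)

# Prior (Eq. 4): zero background model, exponential spatial correlation.
mu_m = np.zeros(n_m)
Sigma_m = np.exp(-np.abs(idx[:, None] - idx[None, :]) / 5.0)

# Linear forward operator (Eq. 1): a Gaussian smoothing matrix (assumed).
G = np.exp(-0.5 * ((idx[:, None] - idx[None, :]) / 2.0) ** 2)
G /= G.sum(axis=1, keepdims=True)
Sigma_d = 0.01 * np.eye(n_d)  # noise covariance

# Synthetic data from a prior realization.
m_true = rng.multivariate_normal(mu_m, Sigma_m)
d = G @ m_true + rng.multivariate_normal(np.zeros(n_d), Sigma_d)

def bli(A, Sigma_e, y, mu, Sigma):
    """Bayesian linear inversion (Eqs. 6-7) for the model y = A m + e."""
    K = Sigma @ A.T @ np.linalg.inv(A @ Sigma @ A.T + Sigma_e)
    return mu + K @ (y - A @ mu), Sigma - K @ A @ Sigma

# Full-dimension posterior (Eqs. 5-7).
mu_full, Sig_full = bli(G, Sigma_d, d, mu_m, Sigma_m)

# PCA of the data covariance G Sigma_m G^T + Sigma_d (Eqs. 8-9),
# keeping the fraction f_v = 0.99 of the total variance.
Sigma_dd = G @ Sigma_m @ G.T + Sigma_d
lam, V = np.linalg.eigh(Sigma_dd)
order = np.argsort(lam)[::-1]
lam, V = lam[order], V[:, order]
n_keep = int(np.searchsorted(np.cumsum(lam) / lam.sum(), 0.99) + 1)
Vd = V[:, :n_keep]      # (n_d x n~) principal eigenvectors
Vd_inv = Vd.T           # orthonormal columns, so the generalized inverse is V^T

mu_d = G @ mu_m                  # data mean under the prior
d_tilde = Vd_inv @ (d - mu_d)    # reduced data (Eq. 12)
H = Vd_inv @ G                   # reduced forward operator (Eq. 13)
Sigma_dt = Vd_inv @ Sigma_d @ Vd_inv.T

# Reduced-data posterior (Eqs. 14-16); the effective data is d~ + Vd^-1 mu_d.
mu_red, Sig_red = bli(H, Sigma_dt, d_tilde + Vd_inv @ mu_d, mu_m, Sigma_m)

print("components kept:", n_keep, "of", n_d)
```

Because the reduced data is a linear function of the full data, conditioning on it can only discard information, so the reduced posterior variance never falls below the full posterior variance; the comparison of the two diagonals makes this visible for any chosen f_v.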

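The model-compression route of Equations 17-19 can be sketched in the same spirit. The prior, the toy operator G (a running-sum matrix) and the 95% variance fraction below are assumptions of this illustration, not the paper's acoustic example.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 40
idx = np.arange(n)

# Prior covariance with spatial correlation and its eigendecomposition (Eq. 8).
mu_m = np.zeros(n)
Sigma_m = np.exp(-np.abs(idx[:, None] - idx[None, :]) / 4.0)
lam, V = np.linalg.eigh(Sigma_m)
order = np.argsort(lam)[::-1]
lam, V = lam[order], V[:, order]

# Keep the eigenvectors explaining 95% of the prior variance (Eq. 9).
n_keep = int(np.searchsorted(np.cumsum(lam) / lam.sum(), 0.95) + 1)
Vm = V[:, :n_keep]                 # (n x n~) principal eigenvectors

# Reduced forward model (Eq. 18): d - G mu_m = F m~ + e_d, with F = G Vm.
G = np.tril(np.ones((n, n))) / n   # toy linear operator (assumed)
Sigma_d = 0.05 * np.eye(n)
F = G @ Vm
Sigma_mt = np.diag(lam[:n_keep])   # diagonal prior covariance of m~ (Eq. 19)

# Synthetic data from a prior realization.
m_true = rng.multivariate_normal(mu_m, Sigma_m)
d = G @ m_true + rng.multivariate_normal(np.zeros(n), Sigma_d)

# Posterior of the reduced parameter m~ (Eqs. 5-7 applied to Eq. 18).
K = Sigma_mt @ F.T @ np.linalg.inv(F @ Sigma_mt @ F.T + Sigma_d)
mu_mt = K @ (d - G @ mu_m)
Sig_mt = Sigma_mt - K @ F @ Sigma_mt

# Back-transform to the original model space (Eq. 17).
m_post = mu_m + Vm @ mu_mt
Sig_post = Vm @ Sig_mt @ Vm.T

print("reduced model dimension:", n_keep, "of", n)
```

Here the uncertainty of the back-transformed model lives entirely in the span of the retained eigenvectors, so the posterior variance is bounded by the prior variance at every sample.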