Generalization of Coloring Linear Transformation
Lukas NOVAK¹, Miroslav VORECHOVSKY¹

¹Institute of Structural Mechanics, Faculty of Civil Engineering, Brno University of Technology, Veveri 331/95, Brno, Czech Republic

[email protected], [email protected]

DOI: 10.31490/tces-2018-0013

Section Building Structures & Structural Mechanics, Volume 18, Number 2, December 2018

Abstract. The paper is focused on the technique of linear transformation between correlated and uncorrelated Gaussian random vectors, which is more or less commonly used in the reliability analysis of structures. These linear transformations are frequently needed to transform uncorrelated random vectors into correlated vectors with a prescribed covariance matrix (coloring transformation), and also to perform the inverse (whitening) transformation, i.e. to decorrelate a random vector with a non-identity covariance matrix. Two well-known linear transformation techniques, namely Cholesky decomposition and eigen-decomposition (also known as principal component analysis, or the orthogonal transformation of a covariance matrix), are shown to be special cases of the generalized linear transformation presented in the paper. The proposed generalized linear transformation is able to rotate the transformation randomly, which may be desired in order to remove unwanted directional bias. The conclusions presented herein may be useful for structural reliability analysis with correlated random variables or random fields.

Keywords

Linear transformation, correlation, Cholesky decomposition, eigen-decomposition, structural reliability, uncertainty quantification, random fields.

1. Motivation

Uncertainty quantification and probabilistic engineering analyses such as statistical, sensitivity or reliability analyses must often consider statistical dependencies among random variables. Whatever joint distribution function may be considered, the only analytically tractable random vector with correlated marginal distributions is the multivariate Gaussian distribution, and so real problems with possibly non-Gaussian marginals must often be transformed into Gaussian random vectors (we recall transformation methods such as the Nataf or Rosenblatt transformation [1]).

An analyst using reliability methods must often make decisions based on incomplete probability information. Indeed, in many practical cases, only the estimated correlation or covariance matrix is specified along with marginal distributions for individual random variables. In such a case, the information about the joint probability distribution function is not complete and one has to construct a density that fulfills this incomplete information. A solution is to have an underlying Gaussian random vector. This is a typical example where the normal-to-anything transformation (NORTA), sometimes known as the Nataf transformation, which is a special case of the Rosenblatt transformation that considers the Gaussian copula, is used to construct the joint probability density of the random vector.

We also note that mathematical operations involving multivariate Gaussian random vectors are almost the only generally applicable tools currently available. Most of the methods from reliability engineering require two-way access from/to the transformed space of uncorrelated Gaussian random variables [2]. One can mention methods such as Importance sampling, Asymptotic sampling, the First Order Reliability Method (FORM) and the Second Order Reliability Method (SORM). Last but not least, most of the methods for the generation and analysis of random fields are based on linear transformations between uncorrelated and correlated Gaussian random vectors. One of the simplest applications of linear transformation is the generation of realizations of a correlated random vector using Monte Carlo type simulation techniques, which is commonly used in practical applications of structural reliability analysis.

The aim of this study is to review and compare available transformation techniques and to develop a generalized linear transformation technique. In the following section, the basic mathematical principle behind the linear transformation of Gaussian random vectors is presented.

2. Linear Transformation

Let Z ∼ N(0, I) be a Gaussian random column vector for which the expected value E[ZᵢZⱼ] = δᵢⱼ, so the components of Z are uncorrelated standard variables. We would like to transform Z into a Gaussian random vector X ∼ N(0, Σ), where Σ = E[XXᵀ] is a target covariance matrix, which is symmetric and positive definite.

In general, a linear transformation maps from n-dimensional space to m-dimensional space. For simplicity, we shall start with a linear transformation between spaces of the same dimension:

\[ T:\ \mathbb{R}^{n} \to \mathbb{R}^{n}. \tag{1} \]

The linear transformation of Z into X (coloring) using a regular square transformation matrix A of order n then reads:

\[ \mathbf{X} = \mathbf{A}\,\mathbf{Z}. \tag{2} \]

The inverse transformation (whitening) is simply Z = A⁻¹X. In order to establish the transformation, one has to determine a suitable transformation matrix A [3], based on the covariance matrix Σ.

There are two well-known choices for A that will be reviewed next: a transformation based on the Cholesky decomposition of Σ, or one based on the eigen-decomposition of Σ.

The geometric properties of the transformation are given by the matrix A, as can be seen from Eq. (2). It is clear that vectors can be rotated, stretched or flipped.
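To make the principle of Eq. (2) concrete, the following minimal NumPy sketch (our illustration, not code from the paper) colors an uncorrelated standard Gaussian sample with a matrix A satisfying AAᵀ = Σ and then whitens it back. The symmetric matrix square root used here is just one admissible choice of A, and the target matrix, sample size and random seed are chosen arbitrarily for the demonstration; the two standard choices of A are reviewed in the following sections.

```python
# Minimal sketch (not from the paper): coloring X = A Z and whitening Z = A^{-1} X.
# Any matrix A with A A^T = Sigma works; here A is a symmetric square root of Sigma,
# built from its eigen-decomposition purely for illustration.
import numpy as np

rng = np.random.default_rng(0)

Sigma = np.array([[1.0, 0.6],
                  [0.6, 1.0]])           # target covariance (correlation) matrix

# one admissible A: symmetric square root of Sigma, so that A A^T = Sigma
w, V = np.linalg.eigh(Sigma)
A = V @ np.diag(np.sqrt(w)) @ V.T

Z = rng.standard_normal((2, 100_000))    # columns are uncorrelated standard Gaussian vectors
X = A @ Z                                # coloring, Eq. (2)
Z_back = np.linalg.inv(A) @ X            # whitening (inverse transformation)

print(np.cov(X))                         # sample covariance, approximately Sigma
print(np.allclose(Z, Z_back))            # True up to round-off
```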
3. Cholesky Decomposition

The target covariance matrix can be decomposed using a lower triangular (Cholesky) matrix L as:

\[ \boldsymbol{\Sigma} = \mathbf{L}\,\mathbf{L}^{\mathrm{T}}. \tag{3} \]

The coloring linear transformation with A_chol = L obtained by the Cholesky decomposition of Σ then reads:

\[ \mathbf{X} = \mathbf{A}_{\mathrm{chol}}\,\mathbf{Z} = \mathbf{L}\,\mathbf{Z}. \tag{4} \]

This transformation guarantees that the uncorrelated vector Z gets transformed into X with the desired covariance matrix:

\[ \mathrm{E}\!\left[\mathbf{X}\mathbf{X}^{\mathrm{T}}\right] = \mathrm{E}\!\left[\mathbf{L}\mathbf{Z}\left(\mathbf{L}\mathbf{Z}\right)^{\mathrm{T}}\right] = \mathbf{L}\,\mathrm{E}\!\left[\mathbf{Z}\mathbf{Z}^{\mathrm{T}}\right]\mathbf{L}^{\mathrm{T}} = \mathbf{L}\,\mathbf{L}^{\mathrm{T}} = \boldsymbol{\Sigma}. \tag{5} \]

The backward transformation is obtained easily as

\[ \mathbf{Z} = \mathbf{L}^{-1}\,\mathbf{X}. \tag{6} \]

In order to visualize the transformation, a numerical example is presented that involves two independent standard Gaussian random variables (vector Z) that are transformed into standard Gaussian random variables with covariance matrix Σ (in fact a correlation matrix, because the random variables are standardized). Given the target correlation matrix

\[ \boldsymbol{\Sigma} = \begin{pmatrix} 1 & \rho \\ \rho & 1 \end{pmatrix}, \tag{7} \]

the Cholesky matrix reads

\[ \mathbf{L} = \begin{pmatrix} 1 & 0 \\ \rho & \sqrt{1-\rho^{2}} \end{pmatrix}. \tag{8} \]

An explanatory illustration of the Cholesky linear transformation for the two-dimensional case is depicted in Fig. 1 for correlation coefficient ρ = 0.6. Note that the transformation matrix L is a lower triangular matrix with a unit first entry on the main diagonal, and therefore the first coordinate x₁ remains unchanged.

Fig. 1: Linear transformations of the original sample from Z (left), selected as an orthogonal grid of points. The sample of X is obtained from Z using Cholesky decomposition (middle) and eigen-decomposition (right).
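The two-dimensional example of Eqs. (7) and (8) can be reproduced with a few lines of NumPy. The sketch below is our illustration rather than code from the paper (the sample size and seed are arbitrary); note that numpy.linalg.cholesky returns exactly the lower triangular factor L of Eq. (3).

```python
# Sketch of the Cholesky coloring transformation, Eqs. (3)-(8), for rho = 0.6.
import numpy as np

rho = 0.6
Sigma = np.array([[1.0, rho],
                  [rho, 1.0]])             # target correlation matrix, Eq. (7)

L = np.linalg.cholesky(Sigma)              # lower triangular factor, Eq. (8):
                                           # [[1, 0], [rho, sqrt(1 - rho**2)]]

rng = np.random.default_rng(1)
Z = rng.standard_normal((2, 100_000))      # uncorrelated standard Gaussian vectors
X = L @ Z                                  # coloring, Eq. (4)
Z_back = np.linalg.solve(L, X)             # whitening, Eq. (6), without forming L^{-1}

print(np.cov(X))                           # sample covariance approximates Sigma, cf. Eq. (5)
print(L)                                   # first row (1, 0): coordinate x1 stays unchanged
```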
4. Eigen-decomposition

The decomposition of a symmetric positive definite matrix into eigenvectors and eigenvalues is usually called eigen-decomposition. The method is applied under many different names, such as proper orthogonal decomposition, principal component analysis, the orthogonal transformation of a covariance matrix, or the main ingredient of the Karhunen-Loève expansion. Any positive definite matrix can be decomposed as

\[ \boldsymbol{\Sigma} = \boldsymbol{\Phi}\,\boldsymbol{\lambda}\,\boldsymbol{\Phi}^{\mathrm{T}} = \boldsymbol{\Phi}\,\boldsymbol{\lambda}^{1/2}\,\boldsymbol{\lambda}^{1/2}\,\boldsymbol{\Phi}^{\mathrm{T}} = \mathbf{A}_{\mathrm{eig}}\,\mathbf{A}_{\mathrm{eig}}^{\mathrm{T}}, \tag{9} \]

where λ is the diagonal matrix of (positive) eigenvalues of Σ (best if ordered from largest to smallest) and Φ is the eigenvector matrix associated with the eigenvalues. Individual eigenvectors of Σ are orthogonal (recall the symmetry of Σ) and they form the columns of Φ.

The coloring linear transformation with A_eig obtained by the eigen-decomposition of Σ reads:

\[ \mathbf{X} = \mathbf{A}_{\mathrm{eig}}\,\mathbf{Z} = \boldsymbol{\Phi}\,\boldsymbol{\lambda}^{1/2}\,\mathbf{Z}. \tag{10} \]

The backward transformation is obtained easily as

\[ \mathbf{Z} = \left(\boldsymbol{\Phi}\,\boldsymbol{\lambda}^{1/2}\right)^{-1}\mathbf{X} = \boldsymbol{\lambda}^{-1/2}\,\boldsymbol{\Phi}^{\mathrm{T}}\,\mathbf{X}. \tag{11} \]

In this equation, the fact that the inverse of an orthogonal matrix is its transpose, Φ⁻¹ = Φᵀ, can be used. Also, the diagonal matrix λ^(−1/2) has the inverse square roots of the eigenvalues on the main diagonal. Having Φ and λ at hand, the backward transformation is simple.

To show that transformation (10) yields the required covariance matrix, one can substitute for Σ as follows:

\[ \mathrm{E}\!\left[\mathbf{X}\mathbf{X}^{\mathrm{T}}\right] = \mathrm{E}\!\left[\boldsymbol{\Phi}\boldsymbol{\lambda}^{1/2}\mathbf{Z}\left(\boldsymbol{\Phi}\boldsymbol{\lambda}^{1/2}\mathbf{Z}\right)^{\mathrm{T}}\right] = \boldsymbol{\Phi}\boldsymbol{\lambda}^{1/2}\,\mathrm{E}\!\left[\mathbf{Z}\mathbf{Z}^{\mathrm{T}}\right]\boldsymbol{\lambda}^{1/2}\boldsymbol{\Phi}^{\mathrm{T}} = \boldsymbol{\Phi}\boldsymbol{\lambda}\boldsymbol{\Phi}^{\mathrm{T}} = \boldsymbol{\Sigma}. \tag{12} \]

The eigen-decomposition has several advantages. First of all, the transformation of realizations to the correlated space X keeps the original pattern from the uncorrelated space Z. This is illustrated using realizations of 2D vectors in Fig. 1, right. The solution for the eigenvalues and the associated orthonormal eigenvectors of the 2D correlation matrix in Eq. (7) is particularly simple:

\[ \boldsymbol{\lambda} = \begin{pmatrix} 1+\rho & 0 \\ 0 & 1-\rho \end{pmatrix}, \qquad \boldsymbol{\Phi} = \begin{pmatrix} 1/\sqrt{2} & -1/\sqrt{2} \\ 1/\sqrt{2} & 1/\sqrt{2} \end{pmatrix}. \tag{13} \]

For computational purposes, a significant advantage of this method is the possibility of dimensionality reduction.

5. Generalized Linear Transformation

The transformation matrix obtained by the eigen-decomposition can be generalized by post-multiplication with an orthogonal matrix:

\[ \mathbf{A}_{\mathrm{g}} = \mathbf{A}_{\mathrm{eig}}\,\mathbf{K} = \boldsymbol{\Phi}\,\boldsymbol{\lambda}^{1/2}\,\mathbf{K}, \tag{14} \]

where K is an arbitrary orthogonal matrix. To show that such a transformation yields the required covariance matrix is simple:

\[ \mathrm{E}\!\left[\mathbf{X}\mathbf{X}^{\mathrm{T}}\right] = \mathrm{E}\!\left[\mathbf{A}_{\mathrm{g}}\mathbf{Z}\left(\mathbf{A}_{\mathrm{g}}\mathbf{Z}\right)^{\mathrm{T}}\right] = \boldsymbol{\Phi}\boldsymbol{\lambda}^{1/2}\mathbf{K}\,\mathrm{E}\!\left[\mathbf{Z}\mathbf{Z}^{\mathrm{T}}\right]\mathbf{K}^{\mathrm{T}}\boldsymbol{\lambda}^{1/2}\boldsymbol{\Phi}^{\mathrm{T}} = \boldsymbol{\Phi}\boldsymbol{\lambda}\boldsymbol{\Phi}^{\mathrm{T}} = \boldsymbol{\Sigma}, \tag{15} \]

since both E[ZZᵀ] and KKᵀ equal the identity matrix I.

In other words, the generalized transformation matrix A_g is obtained from A_eig by post-multiplication with an arbitrary orthogonal matrix, and it therefore establishes infinitely many possible transformation matrices between the correlated space X and the uncorrelated Z-space:

\[ \mathbf{X} = \mathbf{A}_{\mathrm{g}}\,\mathbf{Z} = \boldsymbol{\Phi}\,\boldsymbol{\lambda}^{1/2}\,\mathbf{K}\,\mathbf{Z}. \tag{16} \]
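A possible numerical realization of the generalized transformation (16) is sketched below. This is our illustration rather than code from the paper; in particular, the recipe for drawing the random orthogonal matrix K (QR factorization of a Gaussian matrix with a sign correction, a common way to obtain a uniformly, i.e. Haar, distributed rotation) is an assumption, since the excerpt does not specify how K is generated.

```python
# Sketch of the generalized coloring transformation A_g = Phi lambda^{1/2} K,
# Eqs. (14)-(16), with a randomly drawn orthogonal matrix K (our illustration).
import numpy as np

rho = 0.6
Sigma = np.array([[1.0, rho],
                  [rho, 1.0]])

# eigen-decomposition of Sigma, Eq. (9); numpy.linalg.eigh returns eigenvalues
# in ascending order (1 - rho, 1 + rho), the reverse of the paper's preferred
# ordering, which is immaterial for the covariance check below
lam, Phi = np.linalg.eigh(Sigma)
A_eig = Phi @ np.diag(np.sqrt(lam))       # A_eig = Phi lambda^{1/2}

# random orthogonal K: QR of a Gaussian matrix, with a sign fix on the
# diagonal of R so that the rotation is uniformly (Haar) distributed
rng = np.random.default_rng(2)
Q, R = np.linalg.qr(rng.standard_normal((2, 2)))
K = Q * np.sign(np.diag(R))

A_g = A_eig @ K                           # Eq. (14): post-multiplication by K
Z = rng.standard_normal((2, 100_000))
X = A_g @ Z                               # Eq. (16)

print(np.cov(X))                          # approximately Sigma for any orthogonal K, Eq. (15)
print(K @ K.T)                            # approximately the identity: K is orthogonal
```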