MAT 610: Numerical Linear Algebra

James V. Lambers
January 16, 2017

Contents

1 Matrix Multiplication Problems
  1.1 Introduction
    1.1.1 Systems of Linear Equations
    1.1.2 The Eigenvalue Problem
  1.2 Basic Concepts, Algorithms, and Notation
    1.2.1 Vector Spaces
    1.2.2 Subspaces and Bases
    1.2.3 Matrix Representation of Linear Transformations
    1.2.4 Matrix Multiplication
    1.2.5 Vector Operations for Matrices
    1.2.6 The Transpose of a Matrix
    1.2.7 Other Fundamental Matrix Computations
    1.2.8 Understanding Matrix-Matrix Multiplication
  1.3 Other Essential Ideas from Linear Algebra
    1.3.1 Special Subspaces
    1.3.2 The Identity Matrix
    1.3.3 The Inverse of a Matrix
    1.3.4 Triangular and Diagonal Matrices
    1.3.5 The Sherman-Morrison-Woodbury Formula
    1.3.6 The Determinant of a Matrix
    1.3.7 Differentiation of Matrices
2 Matrix Analysis
  2.1 Vector Norms
  2.2 Matrix Norms
  2.3 Eigenvalues and Eigenvectors
  2.4 Perturbations and the Inverse
  2.5 Roundoff Errors and Computer Arithmetic
    2.5.1 Floating-Point Numbers
    2.5.2 Properties of Floating-Point Systems
    2.5.3 Rounding
    2.5.4 Floating-Point Arithmetic
  2.6 Basics of Error Analysis
  2.7 Orthogonality
  2.8 The Singular Value Decomposition
  2.9 Projections
  2.10 Projections and the SVD
  2.11 Sensitivity of Linear Systems
3 General Linear Systems
  3.1 Gaussian Elimination and Back Substitution
  3.2 The LU Factorization
    3.2.1 Unit Lower Triangular Matrices
    3.2.2 Solution of Ax = b
    3.2.3 Implementation Details
    3.2.4 Existence and Uniqueness
    3.2.5 Practical Computation of Determinants
  3.3 Roundoff Analysis of Gaussian Elimination
    3.3.1 Error in the LU Factorization
    3.3.2 Bounding the Perturbation in A
    3.3.3 Bounding the Error in the Solution
  3.4 Pivoting
    3.4.1 Partial Pivoting
    3.4.2 Complete Pivoting
    3.4.3 The LU Decomposition with Pivoting
    3.4.4 Practical Computation of Determinants, Revisited
  3.5 Improving and Estimating Accuracy
    3.5.1 Iterative Refinement
    3.5.2 Estimating the Condition Number
    3.5.3 Scaling and Equilibration
4 Special Linear Systems
  4.1 The LDM^T and LDL^T Factorizations
  4.2 Positive Definite Systems
  4.3 Banded Systems
5 Iterative Methods for Linear Systems
  5.1 Stationary Iterative Methods
    5.1.1 Convergence
    5.1.2 The Jacobi Method
    5.1.3 The Gauss-Seidel Method
    5.1.4 Successive Overrelaxation
    5.1.5 Positive Definite Systems
    5.1.6 Stopping Criteria
  5.2 Solving Differential Equations
  5.3 The Conjugate Gradient Method
  5.4 Minimum Residual Methods
  5.5 The Biconjugate Gradient Method
6 Orthogonalization and Least Squares
  6.1 The Full-Rank Linear Least Squares Problem
    6.1.1 Minimizing the Residual
    6.1.2 The Condition Number of A^T A
    6.1.3 Comparison of Approaches
    6.1.4 Solving the Normal Equations
    6.1.5 The Pseudo-Inverse
    6.1.6 Perturbation Theory
  6.2 The QR Factorization
    6.2.1 Givens Rotations
    6.2.2 Householder Reflections
    6.2.3 Givens Rotations vs. Householder Reflections
  6.3 Block Algorithms
  6.4 Gram-Schmidt Orthogonalization
    6.4.1 Classical Gram-Schmidt
    6.4.2 Modified Gram-Schmidt
  6.5 The Rank-Deficient Least Squares Problem
    6.5.1 QR with Column Pivoting
    6.5.2 Complete Orthogonal Decomposition
  6.6 Applications of the SVD
    6.6.1 Minimum-Norm Least Squares Solution
    6.6.2 Closest Orthogonal Matrix
    6.6.3 Low-Rank Approximations
7 The Unsymmetric Eigenvalue Problem
  7.1 Properties and Decompositions
  7.2 Perturbation Theory
  7.3 Power Iterations
  7.4 The Hessenberg and Real Schur Forms
  7.5 The Practical QR Algorithm
8 The Symmetric Eigenvalue Problem
  8.1 Properties and Decompositions
  8.2 Perturbation Theory
  8.3 Power Iterations
  8.4 Reduction to Tridiagonal Form
  8.5 The Practical QR Algorithm
  8.6 Jacobi Methods
    8.6.1 The Jacobi Idea
    8.6.2 The 2-by-2 Symmetric Schur Decomposition
    8.6.3 The Classical Jacobi Algorithm
    8.6.4 The Cyclic-by-Row Algorithm
    8.6.5 Error Analysis
    8.6.6 Parallel Jacobi
  8.7 The SVD Algorithm
    8.7.1 Jacobi SVD Procedures
9 Matrices, Moments and Quadrature
  9.1 Approximation of Quadratic Forms
    9.1.1 Gaussian Quadrature
    9.1.2 Roots of Orthogonal Polynomials
    9.1.3 Connection to Hankel Matrices
    9.1.4 Derivation of the Lanczos Algorithm
  9.2 Generalizations
    9.2.1 Prescribing Nodes
    9.2.2 The Case of u ≠ v
    9.2.3 Unsymmetric Matrices
    9.2.4 Modification of Weight Functions
  9.3 Applications of Moments
    9.3.1 Approximating the Scattering Amplitude
    9.3.2 Estimation of the Regularization Parameter
    9.3.3 Estimation of the Trace of the Inverse and Determinant

Chapter 1: Matrix Multiplication Problems

1.1 Introduction

This course is about numerical linear algebra: the study of the approximate solution of fundamental problems from linear algebra by numerical methods that can be implemented on a computer. We begin with a brief discussion of the problems that will be treated in this course, and then establish some fundamental concepts that will serve as the foundation for our discussion of the numerical methods used to solve them.

1.1.1 Systems of Linear Equations

One of the most fundamental problems in computational mathematics is to solve a system of m linear equations

    a11 x1 + a12 x2 + ··· + a1n xn = b1
    a21 x1 + a22 x2 + ··· + a2n xn = b2
                  ⋮
    am1 x1 + am2 x2 + ··· + amn xn = bm

for the n unknowns x1, x2, ..., xn. Many data fitting problems, such as polynomial interpolation or least-squares approximation, involve the solution of such a system.
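Such a system can be written as a single matrix equation Ax = b and solved numerically. Below is a minimal sketch using NumPy; the 3 × 3 coefficient matrix and right-hand side are invented for illustration, not taken from the text:

```python
import numpy as np

# Coefficient matrix A = [aij] and right-hand side b (illustrative values)
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
b = np.array([3.0, 5.0, 3.0])

# Solve Ax = b; np.linalg.solve uses an LU factorization with partial
# pivoting, a method studied in Chapter 3
x = np.linalg.solve(A, b)
print(x)  # solution of Ax = b; here x ≈ [1, 1, 1]
```

Forming the inverse of A explicitly and computing A⁻¹b is both slower and less accurate than solving the factored system, which is why library routines work with factorizations instead.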
Discretization of partial differential equations often yields systems of linear equations that must be solved, and systems of nonlinear equations are typically solved using iterative methods that solve a system of linear equations during each iteration. In this course, we will study the solution of this type of problem in detail.

1.1.2 The Eigenvalue Problem

As we will see, a system of linear equations can be described in terms of a single vector equation, in which the product of a known matrix of the coefficients aij and an unknown vector containing the unknowns xj is equated to a known vector containing the right-hand side values bi. Unlike multiplication of numbers, multiplication of a matrix and a vector is an operation that is difficult to understand intuitively. However, given a matrix A, there are certain vectors for which multiplication by A is equivalent to "ordinary" multiplication. This leads to the eigenvalue problem, which consists of finding a nonzero vector x and a number λ such that

    Ax = λx.

The number λ is called an eigenvalue of A, and the vector x is called an eigenvector corresponding to λ. This is another problem whose solution we will discuss in this course.

1.2 Basic Concepts, Algorithms, and Notation

Writing out a system of equations can be quite tedious. Therefore, we instead represent a system of linear equations using a matrix, which is a rectangular array of elements, or entries. We say that a matrix A is m × n if it has m rows and n columns, and we denote the element in row i and column j by aij. We also denote the matrix A by [aij]. With this notation, a general system of m equations in n unknowns can be represented using a matrix A that contains the coefficients of the equations, a vector x that contains the unknowns, and a vector b that contains the right-hand side values.
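The eigenvalue relation Ax = λx can be checked numerically for each computed eigenpair. Below is a minimal sketch using NumPy; the symmetric 2 × 2 matrix is an invented example (its eigenvalues are 2 ± 1 = 3 and 1), not one from the text:

```python
import numpy as np

# A symmetric 2x2 matrix with known eigenvalues 3 and 1
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# np.linalg.eig returns the eigenvalues and a matrix whose columns are
# the corresponding eigenvectors
eigenvalues, eigenvectors = np.linalg.eig(A)

# Verify Ax = lambda * x for each eigenpair
for lam, x in zip(eigenvalues, eigenvectors.T):
    assert np.allclose(A @ x, lam * x)

print(np.sort(eigenvalues))  # approximately [1., 3.]
```

Note that an eigenvector is only determined up to a scalar multiple; NumPy normalizes each returned eigenvector to unit length, but any nonzero multiple satisfies Ax = λx equally well.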
