
LU-factorization and Positive Definite Matrices

Tom Lyche

University of Oslo Norway

Topics Today

- Block multiplication of matrices
- Basics on triangular matrices
- LU factorization of matrices
- Positive definite matrices: examples, criteria for positive definiteness
- LU factorization of positive definite matrices

Partitioned matrices

A rectangular matrix $A$ can be partitioned into submatrices by drawing horizontal lines between selected rows and vertical lines between selected columns. For example, $A = \begin{bmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \\ 7 & 8 & 9 \end{bmatrix}$ can be partitioned as
\[
\text{(i)}\ \begin{bmatrix} A_{11} & A_{12} \\ A_{21} & A_{22} \end{bmatrix},\qquad
\text{(ii)}\ \begin{bmatrix} a_{\cdot 1} & a_{\cdot 2} & a_{\cdot 3} \end{bmatrix},\qquad
\text{(iii)}\ \begin{bmatrix} a_{1\cdot}^T \\ a_{2\cdot}^T \\ a_{3\cdot}^T \end{bmatrix},\qquad
\text{(iv)}\ \begin{bmatrix} A_{11} & A_{12} \end{bmatrix},
\]
each partition referring to the same $3\times 3$ matrix. The submatrices in a partition are often referred to as blocks, and a partitioned matrix is sometimes called a block matrix.

Column partition

Suppose $A \in \mathbb{R}^{m,p}$ and $B \in \mathbb{R}^{p,n}$. If $B = [b_{\cdot 1}, \ldots, b_{\cdot n}]$ is partitioned into columns, then the partition of the product $AB$ into columns is

\[
AB = [Ab_{\cdot 1}, Ab_{\cdot 2}, \ldots, Ab_{\cdot n}].
\]
In particular, if $I$ is the identity matrix of order $p$, then

\[
A = AI = A[e_1, e_2, \ldots, e_p] = [Ae_1, Ae_2, \ldots, Ae_p],
\]
and we see that column $j$ of $A$ can be written $Ae_j$ for $j = 1, \ldots, p$.

Row partition

If A is partitioned into rows then

\[
AB = \begin{bmatrix} a_{1\cdot}^T \\ a_{2\cdot}^T \\ \vdots \\ a_{m\cdot}^T \end{bmatrix} B
   = \begin{bmatrix} a_{1\cdot}^T B \\ a_{2\cdot}^T B \\ \vdots \\ a_{m\cdot}^T B \end{bmatrix},
\]
and taking $A = I_p$ it follows that row $i$ of $B$ can be written $e_i^T B$. It is often useful to write the matrix-vector product $Ax$ as a linear combination of the columns of $A$:

\[
Ax = x_1 a_{\cdot 1} + x_2 a_{\cdot 2} + \cdots + x_p a_{\cdot p}.
\]
One way to see that this is correct is to partition $A$ into columns and $x$ into rows.
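This is easy to check numerically. The following minimal numpy sketch (the matrix and vector are arbitrary illustrative data) forms $Ax$ both ways:

import numpy as np

# Arbitrary example data, for illustration only.
A = np.array([[1., 2., 3.],
              [4., 5., 6.],
              [7., 8., 9.]])
x = np.array([2., -1., 3.])

# Ax as a linear combination of the columns of A: sum_j x_j * a_{.j}
combo = sum(x[j] * A[:, j] for j in range(A.shape[1]))

print(np.allclose(A @ x, combo))  # True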

Rules for 2×2 blocks

If $B = [B_1, B_2]$, where $B_1 \in \mathbb{R}^{p,r}$ and $B_2 \in \mathbb{R}^{p,n-r}$, then
\[
A[B_1, B_2] = [AB_1, AB_2].
\]
If $A = \begin{bmatrix} A_1 \\ A_2 \end{bmatrix}$, where $A_1 \in \mathbb{R}^{k,p}$ and $A_2 \in \mathbb{R}^{m-k,p}$, then

\[
\begin{bmatrix} A_1 \\ A_2 \end{bmatrix} B = \begin{bmatrix} A_1 B \\ A_2 B \end{bmatrix}.
\]
If $A = [A_1, A_2]$ and $B = \begin{bmatrix} B_1 \\ B_2 \end{bmatrix}$, where $A_1 \in \mathbb{R}^{m,s}$, $A_2 \in \mathbb{R}^{m,p-s}$, $B_1 \in \mathbb{R}^{s,n}$ and $B_2 \in \mathbb{R}^{p-s,n}$, then

\[
[A_1, A_2] \begin{bmatrix} B_1 \\ B_2 \end{bmatrix} = A_1 B_1 + A_2 B_2.
\]
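The block rules can be verified numerically; the sketch below uses arbitrary block sizes and random data, purely for illustration:

import numpy as np

rng = np.random.default_rng(0)
m, p, n, s = 4, 5, 3, 2  # arbitrary sizes; s splits the p columns/rows

A = rng.standard_normal((m, p))
B = rng.standard_normal((p, n))

A1, A2 = A[:, :s], A[:, s:]   # column partition of A
B1, B2 = B[:s, :], B[s:, :]   # matching row partition of B

# [A1, A2] [B1; B2] = A1 B1 + A2 B2
print(np.allclose(A @ B, A1 @ B1 + A2 @ B2))  # True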

The general rule for 2×2 blocks

If $A = \begin{bmatrix} A_{11} & A_{12} \\ A_{21} & A_{22} \end{bmatrix}$ and $B = \begin{bmatrix} B_{11} & B_{12} \\ B_{21} & B_{22} \end{bmatrix}$, then

\[
\begin{bmatrix} A_{11} & A_{12} \\ A_{21} & A_{22} \end{bmatrix}
\begin{bmatrix} B_{11} & B_{12} \\ B_{21} & B_{22} \end{bmatrix}
=
\begin{bmatrix} A_{11}B_{11} + A_{12}B_{21} & A_{11}B_{12} + A_{12}B_{22} \\ A_{21}B_{11} + A_{22}B_{21} & A_{21}B_{12} + A_{22}B_{22} \end{bmatrix},
\]
provided the vertical partition line in $A$ matches the horizontal line in $B$, i.e. the number of columns in $A_{11}$ and $A_{21}$ equals the number of rows in $B_{11}$ and $B_{12}$.

The general case

If

\[
A = \begin{bmatrix} A_{11} & \cdots & A_{1s} \\ \vdots & & \vdots \\ A_{p1} & \cdots & A_{ps} \end{bmatrix}, \qquad
B = \begin{bmatrix} B_{11} & \cdots & B_{1q} \\ \vdots & & \vdots \\ B_{s1} & \cdots & B_{sq} \end{bmatrix},
\]
and if all the matrix products in
\[
C_{ij} = \sum_{k=1}^{s} A_{ik} B_{kj}, \qquad i = 1, \ldots, p,\ j = 1, \ldots, q,
\]
are well defined, then
\[
AB = \begin{bmatrix} C_{11} & \cdots & C_{1q} \\ \vdots & & \vdots \\ C_{p1} & \cdots & C_{pq} \end{bmatrix}.
\]

Block-Triangular Matrices

Lemma 1. Suppose

\[
A = \begin{bmatrix} A_{11} & A_{12} \\ 0 & A_{22} \end{bmatrix},
\]
where $A$, $A_{11}$ and $A_{22}$ are square matrices. Then $A$ is nonsingular if and only if both $A_{11}$ and $A_{22}$ are nonsingular. In that case
\[
A^{-1} = \begin{bmatrix} A_{11}^{-1} & -A_{11}^{-1} A_{12} A_{22}^{-1} \\ 0 & A_{22}^{-1} \end{bmatrix}. \tag{1}
\]
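As a sanity check of formula (1), the following numpy sketch (with arbitrary, well-conditioned diagonal blocks) compares it with the directly computed inverse:

import numpy as np

rng = np.random.default_rng(1)
k, m = 3, 2  # arbitrary sizes of the diagonal blocks

A11 = rng.standard_normal((k, k)) + 5 * np.eye(k)  # well-conditioned diagonal blocks
A22 = rng.standard_normal((m, m)) + 5 * np.eye(m)
A12 = rng.standard_normal((k, m))

A = np.block([[A11, A12],
              [np.zeros((m, k)), A22]])

A11inv = np.linalg.inv(A11)
A22inv = np.linalg.inv(A22)

# Formula (1) for the inverse of a block upper triangular matrix.
Ainv_formula = np.block([[A11inv, -A11inv @ A12 @ A22inv],
                         [np.zeros((m, k)), A22inv]])

print(np.allclose(Ainv_formula, np.linalg.inv(A)))  # True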

Proof ⇐

If $A_{11}$ and $A_{22}$ are nonsingular, then

\[
\begin{bmatrix} A_{11} & A_{12} \\ 0 & A_{22} \end{bmatrix}
\begin{bmatrix} A_{11}^{-1} & -A_{11}^{-1} A_{12} A_{22}^{-1} \\ 0 & A_{22}^{-1} \end{bmatrix}
= \begin{bmatrix} I & 0 \\ 0 & I \end{bmatrix} = I,
\]
and $A$ is nonsingular with the indicated inverse.

Proof ⇒

Conversely, let $B$ be the inverse of the nonsingular matrix $A$. We partition $B$ conformally with $A$ and have

\[
BA = \begin{bmatrix} B_{11} & B_{12} \\ B_{21} & B_{22} \end{bmatrix}
\begin{bmatrix} A_{11} & A_{12} \\ 0 & A_{22} \end{bmatrix}
= \begin{bmatrix} I & 0 \\ 0 & I \end{bmatrix} = I.
\]
Using block-multiplication we find

\[
B_{11}A_{11} = I, \qquad B_{21}A_{11} = 0, \qquad B_{21}A_{12} + B_{22}A_{22} = I.
\]

The first equation implies that $A_{11}$ is invertible; this in turn implies that $B_{21} = 0$ in the second equation, and then the third equation simplifies to $B_{22}A_{22} = I$. We conclude that $A_{22}$ is also invertible.

The inverse

Consider now a triangular matrix.

Lemma 2. An upper (lower) triangular matrix $A = [a_{ij}] \in \mathbb{R}^{n,n}$ is nonsingular if and only if the diagonal entries $a_{ii}$, $i = 1, \ldots, n$, are nonzero. In that case the inverse is upper (lower) triangular with diagonal entries $a_{ii}^{-1}$, $i = 1, \ldots, n$.

Proof: We use induction on $n$. The result holds for $n = 1$: the 1-by-1 matrix $A = (a_{11})$ is invertible if and only if $a_{11} \neq 0$, and in that case $A^{-1} = (a_{11}^{-1})$. Suppose the result holds for $n = k$ and let $A \in \mathbb{R}^{k+1,k+1}$ be upper triangular.

Proof

We partition A in the form

\[
A = \begin{bmatrix} A_k & a_k \\ 0 & a_{k+1,k+1} \end{bmatrix}
\]
and note that $A_k \in \mathbb{R}^{k,k}$ is upper triangular. By Lemma 1, $A$ is nonsingular if and only if $A_k$ and $(a_{k+1,k+1})$ are nonsingular, and in that case
\[
A^{-1} = \begin{bmatrix} A_k^{-1} & -A_k^{-1} a_k a_{k+1,k+1}^{-1} \\ 0 & a_{k+1,k+1}^{-1} \end{bmatrix}.
\]
By the induction hypothesis $A_k$ is nonsingular if and only if the diagonal entries $a_{11}, \ldots, a_{kk}$ of $A_k$ are nonzero, and in that case $A_k^{-1}$ is upper triangular with diagonal entries $a_{ii}^{-1}$, $i = 1, \ldots, k$. The result for $A$ follows.

Unit Triangular Matrices

Lemma 3. The product $C = AB = (c_{ij})$ of two upper (lower) triangular matrices $A = (a_{ij})$ and $B = (b_{ij})$ is upper (lower) triangular with diagonal entries $c_{ii} = a_{ii} b_{ii}$ for all $i$.

Proof. Exercise.

A matrix is unit triangular if it is triangular with 1's on the diagonal.

Lemma 4. For a unit upper (lower) triangular matrix $A \in \mathbb{R}^{n,n}$:

1. $A$ is invertible and the inverse is unit upper (lower) triangular.

2. The product of two unit upper(lower) triangular matrices is unit upper(lower) triangular.

Proof. Part 1 follows from Lemma 2, while Lemma 3 implies part 2.

LU-factorization

We say that $A = LU$ is an LU-factorization of $A \in \mathbb{R}^{n,n}$ if $L \in \mathbb{R}^{n,n}$ is lower triangular and $U \in \mathbb{R}^{n,n}$ is upper triangular. In addition we will assume that $L$ is unit triangular.

Example 1. The equation

\[
A = \begin{bmatrix} 2 & -1 \\ -1 & 2 \end{bmatrix}
  = \begin{bmatrix} 1 & 0 \\ -1/2 & 1 \end{bmatrix}
    \begin{bmatrix} 2 & -1 \\ 0 & 3/2 \end{bmatrix}
\]
gives an LU-factorization of the 2-by-2 matrix $A$.
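A minimal numerical check of Example 1, simply multiplying the stated factors back together:

import numpy as np

A = np.array([[2., -1.],
              [-1., 2.]])
L = np.array([[1., 0.],
              [-1/2, 1.]])
U = np.array([[2., -1.],
              [0., 3/2]])

print(np.allclose(L @ U, A))  # True: A = LU as in Example 1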

Example

Every nonsingular matrix has a PLU-factorization, but not necessarily an LU-factorization.

Example 2. An LU-factorization of $A = \begin{bmatrix} 0 & 1 \\ 1 & 1 \end{bmatrix}$ must satisfy the equation
\[
\begin{bmatrix} 0 & 1 \\ 1 & 1 \end{bmatrix}
= \begin{bmatrix} 1 & 0 \\ l_1 & 1 \end{bmatrix}
  \begin{bmatrix} u_1 & u_3 \\ 0 & u_2 \end{bmatrix}
= \begin{bmatrix} u_1 & u_3 \\ l_1 u_1 & l_1 u_3 + u_2 \end{bmatrix}
\]
for the unknowns $l_1$ in $L$ and $u_1, u_2, u_3$ in $U$. Comparing (1,1)-elements we see that $u_1 = 0$, which makes it impossible to satisfy the condition $1 = l_1 u_1$ for the (2,1) element. We conclude that $A$ has no LU-factorization.

Uniqueness

Theorem 5. The LU-factorization of a nonsingular matrix is unique whenever it exists.

Proof.

Suppose $A = L_1 U_1 = L_2 U_2$ are two LU-factorizations of the nonsingular matrix $A$. The equation $L_1 U_1 = L_2 U_2$ can be written in the form $L_2^{-1} L_1 = U_2 U_1^{-1}$, where by Lemmas 2-4 $L_2^{-1} L_1$ is unit lower triangular and $U_2 U_1^{-1}$ is upper triangular.

But then both matrices must be diagonal with ones on the diagonal. We conclude that $L_2^{-1} L_1 = I = U_2 U_1^{-1}$, which means that $L_1 = L_2$ and $U_1 = U_2$.

Leading Principal Submatrices

Suppose $A \in \mathbb{C}^{n,n}$. The upper left $k \times k$ corners
\[
A_k = \begin{bmatrix} a_{11} & \cdots & a_{1k} \\ \vdots & & \vdots \\ a_{k1} & \cdots & a_{kk} \end{bmatrix}, \qquad k = 1, \ldots, n,
\]
of $A$ are called the leading principal submatrices of $A$.

A Lemma

The following lemma will be used for existence.

Lemma 6. Suppose $A = LU$ is the LU-factorization of $A \in \mathbb{R}^{n,n}$. For $k = 1, \ldots, n$ let $A_k$, $L_k$, $U_k$ be the leading principal submatrices of $A$, $L$, $U$, respectively. Then $A_k = L_k U_k$ is the LU-factorization of $A_k$ for $k = 1, \ldots, n$.

Proof: We partition $A = LU$ as follows:

\[
A = \begin{bmatrix} A_k & B_k \\ C_k & D_k \end{bmatrix}
  = \begin{bmatrix} L_k & 0 \\ M_k & N_k \end{bmatrix}
    \begin{bmatrix} U_k & V_k \\ 0 & W_k \end{bmatrix}
  = LU, \tag{2}
\]
where $D_k, N_k, W_k \in \mathbb{R}^{n-k,n-k}$.

Proof

Using block-multiplication we find the equations

\[
A_k = L_k U_k \tag{3}
\]
\[
B_k = L_k V_k \tag{4}
\]
\[
C_k = M_k U_k \tag{5}
\]
\[
D_k = M_k V_k + N_k W_k \tag{6}
\]
Since $L_k$ is unit lower triangular and $U_k$ is upper triangular, we see that (3) gives the LU-factorization of $A_k$.

Existence

Theorem 7. Suppose $A \in \mathbb{R}^{n,n}$ is nonsingular. Then $A$ has an LU-factorization if and only if the leading principal submatrices $A_k$ are nonsingular for $k = 1, \ldots, n-1$.

Proof: Suppose $A$ is nonsingular with the LU-factorization $A = LU$. Since $A$ is nonsingular it follows that $L$ and $U$ are nonsingular. By Lemma 6 we have $A_k = L_k U_k$. Since $L_k$ is unit lower triangular it is nonsingular. Moreover, $U_k$ is nonsingular since its diagonal entries are among the nonzero diagonal entries of $U$. But then $A_k$ is nonsingular.

Proof continued

Conversely, suppose $A = A_n$ is nonsingular and $A_k$ is nonsingular for $k = 1, \ldots, n-1$. We use induction on $n$ to show that $A$ has an LU-factorization. The result is clearly true for $n = 1$, since the LU-factorization of a 1-by-1 matrix is $(a_{11}) = (1)(a_{11})$. Suppose, as induction hypothesis, that nonsingularity of $A_1, \ldots, A_{n-1}$ implies that $A_{n-1}$ has an LU-factorization, and suppose that $A_1, \ldots, A_n$ are nonsingular. To show that $A = A_n$ has an LU-factorization we consider (3)-(6) with $k = n-1$. In this case $C_k$ and $M_k$ are row vectors, $B_k$ and $V_k$ are column vectors, and $D_k = (a_{nn})$, $N_k = (1)$, and $W_k = (u_{nn})$ are 1-by-1 matrices, i.e. scalars. The LU-factorization of $A_{n-1}$ is given by (3), and since $A_{n-1}$ is nonsingular we see that $L_{n-1}$ and $U_{n-1}$ are nonsingular. But then (4) has a unique solution $V_{n-1}$, (5) has a unique solution $M_{n-1}$, and setting $N_{n-1} = (1)$ in (6) we obtain $u_{nn} = a_{nn} - M_{n-1} V_{n-1}$. Thus we have constructed an LU-factorization of $A$.
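The proof is constructive. The sketch below is a minimal Python transcription of the bordering construction in (3)-(6); it assumes no pivoting is needed, i.e. all leading principal submatrices are nonsingular, and the function name lu_bordered is just illustrative:

import numpy as np

def lu_bordered(A):
    # LU-factorization without pivoting by the bordering construction of the proof.
    # Assumes all leading principal submatrices of A are nonsingular.
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    L = np.eye(n)
    U = np.zeros((n, n))
    U[0, 0] = A[0, 0]
    for k in range(1, n):
        L_k, U_k = L[:k, :k], U[:k, :k]
        V = np.linalg.solve(L_k, A[:k, k])      # (4): B_k = L_k V_k gives new column of U
        M = np.linalg.solve(U_k.T, A[k, :k])    # (5): M_k U_k = C_k gives new row of L
        U[:k, k] = V
        L[k, :k] = M
        U[k, k] = A[k, k] - M @ V               # (6) with N_k = (1): u_kk = a_kk - M_k V_k
    return L, U

# Usage: reproduce Example 1.
A = np.array([[2., -1.], [-1., 2.]])
L, U = lu_bordered(A)
print(np.allclose(L @ U, A))  # True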

$LDL^T$-factorization

For a symmetric matrix the LU-factorization can be written in a special form.

Definition 8. Suppose $A \in \mathbb{R}^{n,n}$. A factorization $A = LDL^T$, where $L$ is unit lower triangular and $D$ is diagonal, is called an $LDL^T$-factorization.

Theorem 9. Suppose $A \in \mathbb{R}^{n,n}$ is nonsingular. Then $A$ has an $LDL^T$-factorization if and only if $A = A^T$ and $A_k$ is nonsingular for $k = 1, \ldots, n-1$.

Proof

Proof: If $A_1, \ldots, A_{n-1}$ are nonsingular then Theorem 7 implies that $A$ has an LU-factorization $A = LU$. Since $A$ is nonsingular it follows that $U$ is nonsingular, and since $U$ is triangular the diagonal matrix $D = \mathrm{diag}(u_{11}, \ldots, u_{nn})$ is nonsingular (cf. Lemma 2). We can then factor $A$ further as $A = LDM^T$, where $M^T = D^{-1}U$. It is easy to see that $M^T$ is unit upper triangular, and since $A^T = A$ we find $A^T = (LDM^T)^T = MDL^T = LU = A$. Now $M(DL^T)$ and $LU$ are two LU-factorizations of $A$, and by the uniqueness of the LU-factorization we must have $M = L$. Thus $A = LDM^T = LDL^T$ is an $LDL^T$-factorization of $A$.

Conversely, if $A = LDL^T$ is an $LDL^T$-factorization of $A$, then $A$ is symmetric since $LDL^T$ is symmetric, and $A$ has an LU-factorization with $U = DL^T$. By Theorem 7 we conclude that $A_1, \ldots, A_{n-1}$ are nonsingular.
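For illustration, a minimal self-contained routine (the name ldlt is illustrative) that computes the $LDL^T$-factorization column by column for a symmetric matrix with nonsingular leading principal submatrices, checked on a symmetric tridiagonal example:

import numpy as np

def ldlt(A):
    # LDL^T-factorization of a symmetric matrix whose leading principal
    # submatrices are nonsingular (no pivoting). Minimal sketch.
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    L = np.eye(n)
    d = np.zeros(n)
    for k in range(n):
        # d_k = a_kk - sum_j L_kj^2 d_j
        d[k] = A[k, k] - L[k, :k] ** 2 @ d[:k]
        for i in range(k + 1, n):
            # L_ik = (a_ik - sum_j L_ij L_kj d_j) / d_k
            L[i, k] = (A[i, k] - (L[i, :k] * L[k, :k]) @ d[:k]) / d[k]
    return L, d

A = np.array([[2., -1., 0.],
              [-1., 2., -1.],
              [0., -1., 2.]])
L, d = ldlt(A)
print(np.allclose(L @ np.diag(d) @ L.T, A))  # True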

Quadratic Forms

Suppose $A \in \mathbb{R}^{n,n}$ is a square matrix. The function $f : \mathbb{R}^n \to \mathbb{R}$ given by
\[
f(x) = x^T A x = \sum_{i=1}^{n} \sum_{j=1}^{n} a_{ij} x_i x_j
\]
is called a quadratic form. We say that $A$ is

(i) positive definite if $A^T = A$ and $x^T A x > 0$ for all nonzero $x \in \mathbb{R}^n$;

(ii) positive semidefinite if $A^T = A$ and $x^T A x \geq 0$ for all $x \in \mathbb{R}^n$;

(iii) negative (semi-)definite if $-A$ is positive (semi-)definite.

Some Observations

A matrix is positive definite if it is positive semidefinite and in addition

\[
x^T A x = 0 \implies x = 0. \tag{7}
\]
The zero matrix is positive semidefinite; a positive definite matrix must be nonsingular. Indeed, if $Ax = 0$ for some $x \in \mathbb{R}^n$ then $x^T A x = 0$, which by (7) implies that $x = 0$.

Example T

Example 3. The 3-by-3 matrix
\[
T = T_3 = \begin{bmatrix} 2 & -1 & 0 \\ -1 & 2 & -1 \\ 0 & -1 & 2 \end{bmatrix} = \mathrm{tridiag}_3(-1, 2, -1) \tag{8}
\]
is positive definite. Clearly $T$ is symmetric. Now it can be shown that
\[
x^T T x = x_1^2 + x_3^2 + (x_2 - x_1)^2 + (x_3 - x_2)^2.
\]
Thus $x^T T x \geq 0$, and if $x^T T x = 0$ then $x_1 = x_3 = 0$ and $x_1 = x_2 = x_3$, which implies that also $x_2 = 0$. Hence $T$ is positive definite.

Example B

Example 4. Let $A = B^T B$, where $B \in \mathbb{R}^{m,n}$ and $m, n$ are positive integers. (Note that $B$ can be a rectangular matrix.) Since $A^T = (B^T B)^T = B^T B$ we see that $A$ is symmetric. Moreover, for any $x \in \mathbb{R}^n$
\[
x^T A x = x^T B^T B x = (Bx)^T (Bx) = \|Bx\|_2^2. \tag{9}
\]
Since the Euclidean norm $\|\cdot\|_2$ of a vector is nonnegative, this shows that $A$ is positive semidefinite, and that $A$ is positive definite if and only if $B$ has linearly independent columns.

Note that A and B have the same null-space and the same rank.
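A brief numerical illustration of Example 4 and identity (9), using a random rectangular $B$, which with probability one has linearly independent columns:

import numpy as np

rng = np.random.default_rng(2)
m, n = 6, 3                      # arbitrary sizes, m >= n
B = rng.standard_normal((m, n))  # random B; full column rank with probability 1
A = B.T @ B                      # symmetric and positive (semi)definite

x = rng.standard_normal(n)
print(np.isclose(x @ A @ x, np.linalg.norm(B @ x) ** 2))  # identity (9)
print(np.all(np.linalg.eigvalsh(A) > 0))                  # positive definite here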

The Hessian

Example 5. Suppose $F(t) = F(t_1, \ldots, t_n)$ is a real valued function of $n$ variables which has continuous first and second order partial derivatives for $t$ in some domain $\Omega$. For each $t \in \Omega$ the gradient and Hessian of $F$ are given by
\[
\nabla F(t) = \begin{bmatrix} \frac{\partial F(t)}{\partial t_1} \\ \vdots \\ \frac{\partial F(t)}{\partial t_n} \end{bmatrix} \in \mathbb{R}^n, \qquad
H(t) = \begin{bmatrix} \frac{\partial^2 F(t)}{\partial t_1 \partial t_1} & \cdots & \frac{\partial^2 F(t)}{\partial t_1 \partial t_n} \\ \vdots & & \vdots \\ \frac{\partial^2 F(t)}{\partial t_n \partial t_1} & \cdots & \frac{\partial^2 F(t)}{\partial t_n \partial t_n} \end{bmatrix} \in \mathbb{R}^{n,n}.
\]
It is shown in advanced calculus texts that under suitable conditions on the domain $\Omega$ the matrix $H(t)$ is symmetric for each $t \in \Omega$. Moreover, if $\nabla F(t^*) = 0$ and $H(t^*)$ is positive definite, then $t^*$ is a local minimum for $F$. This can be shown using the second-order Taylor approximation of $F$. Moreover, $t^*$ is a local maximum if $\nabla F(t^*) = 0$ and $H(t^*)$ is negative definite.

When is a Matrix Positive Definite?

Not all symmetric matrices are positive definite, and sometimes we can tell just by glancing at the matrix that it cannot be positive definite. Examples:

\[
A_1 = \begin{bmatrix} 0 & 1 \\ 1 & 1 \end{bmatrix}, \qquad
A_2 = \begin{bmatrix} 1 & 2 \\ 2 & 2 \end{bmatrix}, \qquad
A_3 = \begin{bmatrix} 2 & -1 \\ -1 & 2 \end{bmatrix}.
\]
$A_1$: If a diagonal entry $a_{ii} \leq 0$ for some $i$, then $e_i^T A e_i = a_{ii} \leq 0$ and $A$ is not positive definite.

$A_2$: If the absolute value of the largest entry of $A$ is not (only) on the diagonal, then $A$ is not positive definite. To show this, suppose $a_{ij} \geq a_{ii}$ and $a_{ij} \geq a_{jj}$ for some $i \neq j$. Since $A$ is symmetric we obtain $(e_i - e_j)^T A (e_i - e_j) = a_{ii} + a_{jj} - 2a_{ij} \leq 0$, which implies that $x^T A x \leq 0$ for some $x \neq 0$.

Submatrices

To give criteria for positive definiteness we start with two lemmas.

Lemma 10. The leading principal submatrices of a positive definite matrix are positive definite and hence nonsingular.

Proof. Consider a leading principal submatrix $A_k$ of the positive definite matrix $A \in \mathbb{R}^{n,n}$. Clearly $A_k$ is symmetric. Let $x \in \mathbb{R}^k$ be nonzero, set $y = \begin{bmatrix} x \\ 0 \end{bmatrix} \in \mathbb{R}^n$, and partition $A$ conformally with $y$ as $A = \begin{bmatrix} A_k & B_k \\ C_k & D_k \end{bmatrix}$, where $D_k \in \mathbb{R}^{n-k,n-k}$. Then
\[
0 < y^T A y = \begin{bmatrix} x^T & 0^T \end{bmatrix} \begin{bmatrix} A_k & B_k \\ C_k & D_k \end{bmatrix} \begin{bmatrix} x \\ 0 \end{bmatrix} = x^T A_k x.
\]

LU-factorization

Lemma 11. A matrix is positive definite if and only if it has an $LDL^T$-factorization with positive diagonal entries in $D$.

Proof: It follows from Lemma 10 and Theorem 9 that $A$ has an $LDL^T$-factorization $A = LDL^T$. We need to show that the diagonal entries $d_{ii}$ of $D$ are positive. With $e_i$ the $i$th unit vector we find

\[
d_{ii} = e_i^T D e_i = e_i^T L^{-1} A L^{-T} e_i = x_i^T A x_i,
\]

where $x_i = L^{-T} e_i$ is nonzero since $L^{-T}$ is nonsingular. Since $A$ is positive definite we see that $d_{ii} = x_i^T A x_i > 0$ for $i = 1, \ldots, n$.

Proof of the converse

Conversely, if $A$ has an $LDL^T$-factorization with positive diagonal entries in $D$ then we can write

\[
A = R^T R, \qquad R^T := LD^{1/2}, \tag{10}
\]

where $D^{1/2} = \mathrm{diag}(d_{11}^{1/2}, \ldots, d_{nn}^{1/2})$. Since $L$ and $D^{1/2}$ are nonsingular, it follows that $R$ is nonsingular and $A$ is positive definite by Example 4. $\square$

Cholesky Factorization

The factorization $A = R^T R$ in (10) is quite useful and has a special name.

Definition 12. A factorization $A = R^T R$, where $R$ is upper triangular with positive diagonal entries, is called a Cholesky-factorization.

Example 6. The matrix $A = \begin{bmatrix} 2 & -1 \\ -1 & 2 \end{bmatrix}$ has an $LDL^T$- and Cholesky-factorization given by

\[
\begin{bmatrix} 2 & -1 \\ -1 & 2 \end{bmatrix}
= \begin{bmatrix} 1 & 0 \\ -1/2 & 1 \end{bmatrix}
  \begin{bmatrix} 2 & 0 \\ 0 & 3/2 \end{bmatrix}
  \begin{bmatrix} 1 & -1/2 \\ 0 & 1 \end{bmatrix}
= \begin{bmatrix} \sqrt{2} & 0 \\ -1/\sqrt{2} & \sqrt{3/2} \end{bmatrix}
  \begin{bmatrix} \sqrt{2} & -1/\sqrt{2} \\ 0 & \sqrt{3/2} \end{bmatrix}.
\]
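Example 6 can be reproduced with numpy; note that numpy.linalg.cholesky returns the lower triangular factor, which is $R^T$ in the notation used here:

import numpy as np

A = np.array([[2., -1.],
              [-1., 2.]])

Rt = np.linalg.cholesky(A)  # lower triangular, A = Rt @ Rt.T, so R = Rt.T
R = Rt.T

print(R)                           # [[sqrt(2), -1/sqrt(2)], [0, sqrt(3/2)]]
print(np.allclose(R.T @ R, A))     # True
print(np.all(np.diag(R) > 0))      # positive diagonal entries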

Positive Eigenvalues

Lemma 13. A matrix is positive definite if and only if it is symmetric and has positive eigenvalues.

Proof.

If A is positive definite then by definition A is symmetric.

Suppose $Ax = \lambda x$ with $x \neq 0$. Multiplying both sides by $x^T$ and solving for $\lambda$ we find $\lambda = \frac{x^T A x}{x^T x} > 0$.

Suppose conversely that $A \in \mathbb{R}^{n,n}$ is symmetric with positive eigenvalues $\lambda_1, \ldots, \lambda_n$. From the spectral theorem we have $U^T A U = D$, where $U^T U = U U^T = I$ and $D = \mathrm{diag}(\lambda_1, \ldots, \lambda_n)$. Let $x \in \mathbb{R}^n$ be nonzero and define $c := U^T x = [c_1, \ldots, c_n]^T$. Then $c^T c = x^T U U^T x = x^T x$, so $c$ is nonzero. Since $x = Uc$ we find
\[
x^T A x = (Uc)^T A Uc = c^T U^T A U c = c^T D c = \sum_{j=1}^{n} \lambda_j c_j^2 > 0,
\]
and it follows that $A$ is positive definite.

Necessary and Sufficient Conditions

Theorem 14. The following statements are equivalent for a symmetric matrix $A \in \mathbb{R}^{n,n}$:

1. $A$ is positive definite.

2. A has only positive eigenvalues.

3. $\det(A_k) = \begin{vmatrix} a_{11} & \cdots & a_{1k} \\ \vdots & & \vdots \\ a_{k1} & \cdots & a_{kk} \end{vmatrix} > 0$ for $k = 1, \ldots, n$.

4. $A = B^T B$ for a nonsingular $B \in \mathbb{R}^{n,n}$.

Proof: By Lemma 13 we know that $1 \Leftrightarrow 2$. We show that $1 \Rightarrow 3 \Rightarrow 4 \Rightarrow 1$.

Proof

$1 \Rightarrow 3$: By Lemma 10 the leading principal submatrix $A_k$ of $A$ is positive definite, and hence has positive eigenvalues by Lemma 13. Since the determinant of a matrix equals the product of its eigenvalues, we conclude that $\det(A_k) > 0$ for $k = 1, \ldots, n$.

$3 \Rightarrow 4$: The condition $\det(A_k) > 0$ implies that $A_k$ is nonsingular for $k = 1, \ldots, n$. By Theorem 9, $A$ has an $LDL^T$-factorization $A = LDL^T$. Let $L_k$ and $D_k$ be the leading principal submatrices of order $k$ of $L$ and $D$, respectively. By partitioning $A$, $L$ and $D$ similarly to the proof of Lemma 6 we see that $A_k = L_k D_k L_k^T$ is the $LDL^T$-factorization of $A_k$ for $k = 1, \ldots, n$. Using properties of determinants we find
\[
\det(A_k) = \det(L_k)\det(D_k)\det(L_k^T) = \det(D_k) = d_{11} \cdots d_{kk} > 0.
\]
Since this holds for $k = 1, \ldots, n$ we conclude that $D$ has positive diagonal entries, and we have $A = B^T B$ with $B := R$ as in (10).

$4 \Rightarrow 1$: This follows from the discussion in Example 4.
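The equivalent conditions of Theorem 14 are easy to probe numerically; the sketch below checks conditions 2-4 for the matrix $T$ of Example 3:

import numpy as np

A = np.array([[2., -1., 0.],
              [-1., 2., -1.],
              [0., -1., 2.]])

# Condition 2: all eigenvalues positive (eigvalsh assumes symmetry).
print(np.all(np.linalg.eigvalsh(A) > 0))                       # True

# Condition 3: all leading principal minors positive.
print(all(np.linalg.det(A[:k, :k]) > 0 for k in range(1, 4)))  # True

# Condition 4: A = B^T B for a nonsingular B, e.g. B = R from the Cholesky factor.
R = np.linalg.cholesky(A).T
print(np.allclose(R.T @ R, A))                                 # True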

Useful facts

Suppose $A \in \mathbb{R}^{n,n}$ is positive definite and let $\langle \cdot, \cdot \rangle$ be the usual inner product on $\mathbb{R}^n$.

1. $A$ has a set of eigenvectors $u_1, \ldots, u_n$ that form an orthonormal basis for $\mathbb{R}^n$.

2. For any $x \in \mathbb{R}^n$ we have $x = \sum_{j=1}^{n} c_j u_j$, where $c_j = \langle x, u_j \rangle := x^T u_j$.

3. For any $x \in \mathbb{R}^n$ we have $Ax = \sum_{j=1}^{n} \lambda_j c_j u_j$, where $\lambda_1, \ldots, \lambda_n$ are the eigenvalues of $A$.

4. Furthermore $\langle Ax, x \rangle = \sum_{j=1}^{n} \lambda_j c_j^2$.

5. $\mathrm{cond}_2(A) := \|A\|_2 \|A^{-1}\|_2 = \frac{\lambda_{\max}}{\lambda_{\min}}$, where $\lambda_{\max}$ and $\lambda_{\min}$ are the largest and smallest eigenvalue of $A$.

Proof

1. By the spectral theorem $A$ has a set of $n$ orthonormal eigenvectors $\{u_1, \ldots, u_n\}$ that form an orthonormal basis for $\mathbb{R}^n$.

2. Since $\{u_1, \ldots, u_n\}$ form a basis for $\mathbb{R}^n$ we have $x = \sum_{j=1}^{n} c_j u_j$ for some $c_1, \ldots, c_n$. Taking the inner product with $u_i$ we find by orthonormality $\langle x, u_i \rangle = \sum_{j=1}^{n} c_j \langle u_j, u_i \rangle = c_i$ for $i = 1, \ldots, n$.

3. We apply $A$ to the equation $x = \sum_{j=1}^{n} c_j u_j$.

4. $\langle Ax, x \rangle = \big\langle \sum_{i=1}^{n} \lambda_i c_i u_i, \sum_{j=1}^{n} c_j u_j \big\rangle = \sum_{j=1}^{n} \lambda_j c_j^2$.

5. The eigenvalues of $A$ are positive, and since $A$ is symmetric it is normal; hence $\mathrm{cond}_2(A) = \frac{|\lambda_{\max}|}{|\lambda_{\min}|} = \frac{\lambda_{\max}}{\lambda_{\min}}$.

Example

For a positive integer m and real numbers a, b we consider the m-by-m tridiagonal matrix given by

\[
C := \mathrm{tridiag}_m(a, b, a) = \begin{bmatrix}
b & a & 0 & \cdots & 0 \\
a & b & a & \cdots & 0 \\
  & \ddots & \ddots & \ddots &  \\
0 & \cdots & a & b & a \\
0 & \cdots &  & a & b
\end{bmatrix}. \tag{11}
\]
Note that $C$ is symmetric, $C^T = C$. We obtain the second derivative matrix when $a = -1$ and $b = 2$.

$C$ is positive definite if $b > 0$ and $b \geq 2|a|$

Since $C$ is symmetric it is enough to show that the smallest eigenvalue $\lambda_{\min}$ is positive.

The eigenvalues are $\lambda_j = b + 2a \cos(j\pi h)$ for $j = 1, \ldots, m$, where $h = 1/(m+1)$. For $C$ to be positive definite it is necessary that the diagonal entry $b > 0$. If $b > 0$ then
\[
\lambda_{\min} = b - 2|a| \cos(\pi h) > b - 2|a| \geq 0.
\]
Thus $C$ is positive definite.
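The eigenvalue formula can be checked numerically; the sketch below compares it with numpy's eigenvalues for the second derivative matrix ($a = -1$, $b = 2$) of modest size:

import numpy as np

m, a, b = 6, -1.0, 2.0
h = 1.0 / (m + 1)

# Build C = tridiag(a, b, a).
C = b * np.eye(m) + a * (np.eye(m, k=1) + np.eye(m, k=-1))

lam_formula = np.sort(b + 2 * a * np.cos(np.arange(1, m + 1) * np.pi * h))
lam_numeric = np.linalg.eigvalsh(C)

print(np.allclose(lam_formula, lam_numeric))   # True
print(lam_numeric.min() > 0)                   # C is positive definite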

Finding the Cholesky factor R

To solve a linear system $Ax = b$ where $A$ is positive definite, we first compute the Cholesky-factorization $A = R^T R$ of $A$ and then solve the two triangular systems $R^T y = b$ and $Rx = y$ by forward and backward substitution. Consider finding the Cholesky factorization of $A$. Since $A = R^T R$ and $R$ is upper triangular we find

\[
a_{kj} = \sum_{i=1}^{n} r_{ik} r_{ij} = \sum_{i=1}^{\min(j,k)} r_{ik} r_{ij}, \qquad j, k = 1, \ldots, n. \tag{12}
\]

Cholesky Factorization Algorithm

Solving for rkj we find

\[
r_{kk} = \Big( a_{kk} - \sum_{i=1}^{k-1} r_{ik}^2 \Big)^{1/2},
\qquad
r_{kj} = \Big( a_{kj} - \sum_{i=1}^{k-1} r_{ik} r_{ij} \Big) / r_{kk}, \quad j = k+1, \ldots, n. \tag{13}
\]

for k = 1, 2, ..., n
    s = R(1:k-1, k);
    R(k,k) = (A(k,k) - s^T * s)^(1/2);
    R(k, k+1:n) = (A(k, k+1:n) - s^T * R(1:k-1, k+1:n)) / R(k,k);
end
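A direct Python transcription of the algorithm above (a minimal sketch assuming $A$ is symmetric positive definite; no checks or pivoting, and the function name cholesky_upper is illustrative):

import numpy as np

def cholesky_upper(A):
    # Compute upper triangular R with A = R^T R, following (13).
    # Assumes A is symmetric positive definite.
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    R = np.zeros((n, n))
    for k in range(n):
        s = R[:k, k]                                   # s = R(1:k-1, k)
        R[k, k] = np.sqrt(A[k, k] - s @ s)             # diagonal entry r_kk
        R[k, k+1:] = (A[k, k+1:] - s @ R[:k, k+1:]) / R[k, k]
    return R

A = np.array([[2., -1., 0.],
              [-1., 2., -1.],
              [0., -1., 2.]])
R = cholesky_upper(A)
print(np.allclose(R.T @ R, A))                  # True
print(np.allclose(R, np.linalg.cholesky(A).T))  # matches numpy's factor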

#flops

The number of flops needed for the Cholesky-factorization is given by

\[
\sum_{k=1}^{n} \big( 2k - 2 + (2k-1)(n-k) \big) \approx \sum_{k=0}^{n} 2k(n-k) \approx \int_0^n 2x(n-x)\,dx = n^3/3.
\]
In addition we need to take $n$ square roots. This is half the number of flops needed for the LU-factorization of an arbitrary matrix. We obtain this reduction since the Cholesky factorization takes advantage of the symmetry of $A$.

Band matrices

In many applications the matrix A has a banded structure, and the number of flops can be reduced.

We say that $A$ has lower bandwidth $p$ if $a_{ij} = 0$ whenever $i > j + p$, and upper bandwidth $q$ if $a_{ij} = 0$ whenever $j > i + q$. A diagonal matrix has upper and lower bandwidth zero, and a matrix with upper and lower bandwidth one is tridiagonal. If $A$ is symmetric then $p = q$. It is easy to extend the algorithm to band matrices.

A Lemma

We first show that if $A = R^T R$ then $R$ has the same upper bandwidth as $A$.

Lemma 15. Suppose $A$ is positive definite with Cholesky-factorization $A = R^T R$. If $a_{kj} = 0$ for $j > k + d$, then also $r_{kj} = 0$ for $j > k + d$.

Proof. We show that if $R$ has upper bandwidth $d$ in its first $k-1$ rows, then row $k$ also has upper bandwidth $d$; the lemma then follows by induction on $k$. Now, if $j > k + d$, then $a_{kj} = 0$, and if $R$ has upper bandwidth $d$ in its first $k-1$ rows, then $r_{ij} = 0$ for $i = 1, \ldots, k-1$, since $j > k + d > i + d$. From (13) it then follows that $r_{kj} = 0$ for $j > k + d$.

Banded Cholesky Algorithm

Full version:

for k = 1, 2, ..., n
    s = R(1:k-1, k);
    R(k,k) = (A(k,k) - s^T * s)^(1/2);
    R(k, k+1:n) = (A(k, k+1:n) - s^T * R(1:k-1, k+1:n)) / R(k,k);
end

Banded version:

for k = 1, 2, ..., n
    km = max(1, k-d); kp = min(n, k+d);
    s = R(km:k-1, k);
    R(k,k) = sqrt(A(k,k) - s^T * s);
    R(k, k+1:kp) = (A(k, k+1:kp) - s^T * R(km:k-1, k+1:kp)) / R(k,k);
end
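A Python rendering of the banded version (a minimal sketch; for simplicity $R$ is stored as a dense matrix rather than in a packed band format, and only entries inside the band are computed):

import numpy as np

def banded_cholesky(A, d):
    # Upper triangular R with A = R^T R for a positive definite A of bandwidth d.
    # Entries outside the band stay zero by Lemma 15.
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    R = np.zeros((n, n))
    for k in range(n):
        km, kp = max(0, k - d), min(n, k + d + 1)   # band limits (0-based, kp exclusive)
        s = R[km:k, k]
        R[k, k] = np.sqrt(A[k, k] - s @ s)
        R[k, k+1:kp] = (A[k, k+1:kp] - s @ R[km:k, k+1:kp]) / R[k, k]
    return R

# Usage on the tridiagonal second derivative matrix (bandwidth d = 1).
n, d = 6, 1
A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
R = banded_cholesky(A, d)
print(np.allclose(R.T @ R, A))  # True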

$R^T x = b$ with $R$ upper bandwidth $d$

To solve $Ax = b$ where $A \in \mathbb{R}^{n,n}$ is positive definite with bandwidth $d$, we can use the banded Cholesky Algorithm followed by a simple modification of the forward and backward substitution algorithms. In the forward substitution we use $L = R^T$, but do the calculations using the entries in $R$:

Algorithm 16.

for k = 1 : n
    km = max(1, k-d);
    x(k) = (b(k) - x(km:k-1)^T * R(km:k-1, k)) / R(k,k);
end

$Rx = y$ with $R$ upper bandwidth $d$

Algorithm 17.

for k = n : -1 : 1
    kp = min(n, k+d);
    x(k) = (y(k) - R(k, k+1:kp) * x(k+1:kp)) / R(k,k);
end
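A minimal Python sketch of the two banded substitution algorithms (the function names are illustrative); the usage solves $Ax = b$ for the tridiagonal second derivative matrix with $d = 1$, obtaining $R$ from numpy's Cholesky routine:

import numpy as np

def band_forward(R, b, d):
    # Solve R^T x = b (Algorithm 16), using only entries of the upper
    # triangular band factor R of bandwidth d.
    n = len(b)
    x = np.zeros(n)
    for k in range(n):
        km = max(0, k - d)
        x[k] = (b[k] - x[km:k] @ R[km:k, k]) / R[k, k]
    return x

def band_backward(R, y, d):
    # Solve R x = y (Algorithm 17) for an upper triangular R of bandwidth d.
    n = len(y)
    x = np.zeros(n)
    for k in range(n - 1, -1, -1):
        kp = min(n, k + d + 1)
        x[k] = (y[k] - R[k, k+1:kp] @ x[k+1:kp]) / R[k, k]
    return x

# Usage: solve Ax = b for the tridiagonal second derivative matrix (d = 1).
n, d = 6, 1
A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
b = np.ones(n)
R = np.linalg.cholesky(A).T          # upper triangular Cholesky factor, same band
y = band_forward(R, b, d)            # R^T y = b
x = band_backward(R, y, d)           # R x = y
print(np.allclose(A @ x, b))         # True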

The number of flops for these algorithms is $O(2nd^2)$ for the banded Cholesky factorization and $O(4nd)$ for the forward and backward substitutions. When $d$ is small compared to $n$, these numbers are considerably smaller than the $O(n^3/3)$ factorization and $O(2n^2)$ substitution counts for a full matrix.
