
Generalized Eigenvectors and Systems of Linear Differential Equations

Math 422

Consider a system of linear first order differential equations $x'(t) = Ax(t)$ with initial value $x(0) = x_0$, and suppose that $A$ is similar to a Hessenberg matrix $H = S^{-1}AS$. Let

$y(t) = S^{-1}x(t)$ and $y_0 = S^{-1}x(0) = y(0)$; then
\[
y'(t) = S^{-1}x'(t) = S^{-1}Ax(t) = S^{-1}AS\,S^{-1}x(t) = Hy(t).
\]
Thus the change of variables $y(t) = S^{-1}x(t)$ transforms the given initial value problem (IVP) into the equivalent one
\[
y'(t) = Hy(t) \quad\text{with}\quad y(0) = y_0. \tag{1}
\]
For example, the matrix $A = \begin{bmatrix} 1 & 8 & 1 \\ 1 & 3 & 2 \\ -4 & -16 & -7 \end{bmatrix}$ is similar to
\[
H = \begin{bmatrix} 1 & 4 & 1 \\ 1 & -5 & 2 \\ 0 & -8 & 1 \end{bmatrix}
  = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 4 & 1 \end{bmatrix}
    \begin{bmatrix} 1 & 8 & 1 \\ 1 & 3 & 2 \\ -4 & -16 & -7 \end{bmatrix}
    \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & -4 & 1 \end{bmatrix}.
\]
Thus the change of variables above transforms the system

\[
\begin{bmatrix} x_1'(t) \\ x_2'(t) \\ x_3'(t) \end{bmatrix}
= \begin{bmatrix} 1 & 8 & 1 \\ 1 & 3 & 2 \\ -4 & -16 & -7 \end{bmatrix}
  \begin{bmatrix} x_1(t) \\ x_2(t) \\ x_3(t) \end{bmatrix}
\quad\text{with}\quad
\begin{bmatrix} x_1(0) \\ x_2(0) \\ x_3(0) \end{bmatrix}
= \begin{bmatrix} 7 \\ 1 \\ 5 \end{bmatrix}
\]
into
\[
\begin{bmatrix} y_1'(t) \\ y_2'(t) \\ y_3'(t) \end{bmatrix}
= \begin{bmatrix} 1 & 4 & 1 \\ 1 & -5 & 2 \\ 0 & -8 & 1 \end{bmatrix}
  \begin{bmatrix} y_1(t) \\ y_2(t) \\ y_3(t) \end{bmatrix}
\quad\text{with}\quad
\begin{bmatrix} y_1(0) \\ y_2(0) \\ y_3(0) \end{bmatrix}
= \begin{bmatrix} 7 \\ 1 \\ 9 \end{bmatrix}.
\]
When $H$ is reduced, Jordan Canonical Form (the final topic in this course) is required to solve the IVP in (1). When $H$ is unreduced, however, we can solve the IVP in (1) by following the steps outlined below. But first we need some important theoretical results.
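
The reduction above is easy to verify numerically. The following is a minimal numpy check, not part of the original handout, that $S^{-1}AS = H$ and that $y_0 = S^{-1}x_0$ for the matrices of this example; the variable names simply mirror the notation above.

\begin{verbatim}
import numpy as np

# Matrices from the example above: A, the similarity S, and the Hessenberg form H.
A = np.array([[ 1.0,   8.0,  1.0],
              [ 1.0,   3.0,  2.0],
              [-4.0, -16.0, -7.0]])
S = np.array([[1.0,  0.0, 0.0],
              [0.0,  1.0, 0.0],
              [0.0, -4.0, 1.0]])        # S^{-1} has +4 in the (3,2) position
H = np.array([[1.0,  4.0, 1.0],
              [1.0, -5.0, 2.0],
              [0.0, -8.0, 1.0]])

S_inv = np.linalg.inv(S)
assert np.allclose(S_inv @ A @ S, H)    # H = S^{-1} A S

x0 = np.array([7.0, 1.0, 5.0])
y0 = S_inv @ x0                         # change of variables y = S^{-1} x
print(y0)                               # [7. 1. 9.], as claimed above
\end{verbatim}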

Theorem 1 Let $H = (h_{ij})$ be an unreduced $n \times n$ Hessenberg matrix. Then $\{e_1, He_1, \ldots, H^{n-1}e_1\}$ is linearly independent.

Proof. First note that

\[
He_1 = \begin{bmatrix} h_{11} \\ h_{21} \\ 0 \\ \vdots \\ 0 \end{bmatrix}
\quad\text{and}\quad
H^2e_1 = \begin{bmatrix} h_{11}^2 + h_{12}h_{21} \\ h_{21}h_{11} + h_{22}h_{21} \\ h_{32}h_{21} \\ 0 \\ \vdots \\ 0 \end{bmatrix}
= \begin{bmatrix} b_1 \\ b_2 \\ b_3 \\ 0 \\ \vdots \\ 0 \end{bmatrix};
\]
since $H$ is unreduced, $h_{21} \neq 0$ and $b_3 = h_{32}h_{21} \neq 0$. Inductively, assume that $H^{i-1}e_1 = [\,c_1 \;\cdots\; c_i \;\; 0 \;\cdots\; 0\,]^T$ with $c_i \neq 0$. Then

\[
H^i e_1 = H\bigl(H^{i-1}e_1\bigr) =
\begin{bmatrix}
h_{11} & h_{12} & \cdots & h_{1,i} & \cdots \\
h_{21} & h_{22} & \cdots & h_{2,i} & \cdots \\
0 & h_{32} & \cdots & h_{3,i} & \cdots \\
\vdots & \ddots & \ddots & \vdots & \\
0 & \cdots & 0 & h_{i+1,i} & \cdots \\
\vdots & & & \vdots & \ddots \\
0 & \cdots & \cdots & 0 & \cdots
\end{bmatrix}
\begin{bmatrix} c_1 \\ \vdots \\ c_i \\ 0 \\ \vdots \\ 0 \end{bmatrix}
=
\begin{bmatrix}
h_{11}c_1 + \cdots + h_{1,i}c_i \\
h_{21}c_1 + \cdots + h_{2,i}c_i \\
h_{32}c_2 + \cdots + h_{3,i}c_i \\
\vdots \\
h_{i+1,i}c_i \\
0 \\
\vdots \\
0
\end{bmatrix},
\]
and $h_{i+1,i}c_i \neq 0$. Thus for $1 \le i \le n-1$ the $(i+1)^{\text{st}}$ component of $H^i e_1$ is non-zero, and for $1 \le i \le n-2$ and $i+2 \le k \le n$, the $k^{\text{th}}$ component of $H^i e_1$ is zero. Hence the matrix $\left[\, e_1 \mid He_1 \mid \cdots \mid H^{n-1}e_1 \,\right]$ is upper triangular with non-zero diagonal entries, and it follows that $\{e_1, He_1, \ldots, H^{n-1}e_1\}$ is linearly independent.

Corollary 2 Let $H$ be an unreduced $n \times n$ Hessenberg matrix. If $[\,a_0 \;\cdots\; a_{n-1}\,]^T \neq 0$, then
\[
a_0e_1 + a_1He_1 + \cdots + a_{n-1}H^{n-1}e_1 \neq 0.
\]

Definition 3 Let $A$ be an $n \times n$ matrix and let $\lambda$ be an eigenvalue of $A$. A vector $v \in \mathbb{R}^n$ is a generalized eigenvector of order $r$ associated with $\lambda$ if $(A - \lambda I)^r v = 0$ and $(A - \lambda I)^{r-1} v \neq 0$. An ordinary eigenvector associated with $\lambda$ is a generalized eigenvector of order 1.
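
The triangular structure used in the proof of Theorem 1 is easy to see numerically. The following sketch, not part of the handout, builds the matrix $\left[\, e_1 \mid He_1 \mid \cdots \mid H^{n-1}e_1 \,\right]$ for the Hessenberg matrix of the opening example and confirms that it is upper triangular and nonsingular.

\begin{verbatim}
import numpy as np

def krylov_matrix(H):
    """Columns e1, H e1, ..., H^{n-1} e1 for a square matrix H."""
    n = H.shape[0]
    cols = [np.zeros(n)]
    cols[0][0] = 1.0                      # e1
    for _ in range(n - 1):
        cols.append(H @ cols[-1])         # next power of H applied to e1
    return np.column_stack(cols)

# Unreduced Hessenberg matrix from the opening example.
H = np.array([[1.0,  4.0, 1.0],
              [1.0, -5.0, 2.0],
              [0.0, -8.0, 1.0]])
K = krylov_matrix(H)
print(K)
# K is upper triangular with nonzero diagonal, hence nonsingular:
print(np.allclose(K, np.triu(K)), np.linalg.matrix_rank(K) == 3)
\end{verbatim}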

Definition 4 Let $A$ be an $n \times n$ matrix with characteristic polynomial $p(t) = (t - \lambda_1)^{r_1} \cdots (t - \lambda_k)^{r_k}$, where $r_1 + \cdots + r_k = n$ and $\lambda_i \in \mathbb{C}$. Then $r_i$ is the algebraic multiplicity of $\lambda_i$, and the dimension of the eigenspace associated with $\lambda_i$ is the geometric multiplicity of $\lambda_i$.

The geometric multiplicity of an eigenvalue is at least 1 and at most its algebraic multiplicity.

Definition 5 An $n \times n$ matrix $A$ is defective if $A$ has an eigenvalue whose geometric multiplicity is less than its algebraic multiplicity.

Example 6 The characteristic polynomial of the unreduced Hessenberg matrix

\[
H = \begin{bmatrix} 1 & -1 \\ 1 & 3 \end{bmatrix}
\quad\text{is}\quad
p(t) = \det \begin{bmatrix} t-1 & 1 \\ -1 & t-3 \end{bmatrix} = (t-2)^2;
\]
the algebraic multiplicity of the eigenvalue $\lambda = 2$ is therefore 2. To find a basis for the corresponding eigenspace, solve the system $(H - 2I)x = 0$:
\[
\begin{bmatrix} -1 & -1 \\ 1 & 1 \end{bmatrix}
\xrightarrow{\text{row reduce}}
\begin{bmatrix} 1 & 1 \\ 0 & 0 \end{bmatrix};
\]
then $v_1 = \begin{bmatrix} -1 \\ 1 \end{bmatrix}$ is a basis and the geometric multiplicity of the eigenvalue $\lambda = 2$ is 1. Consequently $H$ is defective and is not diagonalizable.
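
Definition 5 and Example 6 translate directly into a numerical check. The sketch below, not part of the handout, compares the algebraic and geometric multiplicities for the matrix of Example 6; the same two lines can be reused on the matrices of Exercise 7.

\begin{verbatim}
import numpy as np

# H from Example 6; lambda = 2 has algebraic multiplicity 2.
H = np.array([[1.0, -1.0],
              [1.0,  3.0]])
lam = 2.0

print(np.linalg.eigvals(H))               # both eigenvalues equal 2 (up to roundoff)
n = H.shape[0]
geom_mult = n - np.linalg.matrix_rank(H - lam * np.eye(n))
print(geom_mult)                          # 1 < 2, so H is defective
\end{verbatim}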

Exercise 7 Show that the following matrices are defective:

\[
\text{a. } \begin{bmatrix} 1 & 4 \\ 0 & 1 \end{bmatrix}
\qquad\qquad
\text{b. } \begin{bmatrix} 4 & 1 & 1 & 1 \\ 16 & 3 & 4 & 4 \\ 7 & 2 & 2 & 1 \\ 11 & 1 & 3 & 4 \end{bmatrix}
\]

Theorem 8 Let $\lambda$ be an eigenvalue of an unreduced $n \times n$ Hessenberg matrix $H$. If the algebraic multiplicity of $\lambda$ is $m$, then $H$ has a generalized eigenvector $v_r$ of order $r$ associated with $\lambda$ for each $r = 1, 2, \ldots, m$.

Proof. Let $p(t)$ be the characteristic polynomial of $H$. Repeatedly divide $p(t)$ by $t - \lambda$ until $p(t) = (t - \lambda)^m q(t)$ with $q(\lambda) \neq 0$. Consider the polynomial

\[
(t - \lambda)^{m-1} q(t) = a_0 + a_1t + \cdots + a_{n-2}t^{n-2} + t^{n-1},
\]
whose vector of coefficients $[\,a_0 \;\cdots\; a_{n-2} \;\; 1\,]^T \neq 0$. Thus
\[
(H - \lambda I)^{m-1} q(H)\, e_1 = a_0e_1 + a_1He_1 + \cdots + a_{n-2}H^{n-2}e_1 + H^{n-1}e_1 \neq 0
\]

by Corollary 2. Set $v_m = q(H)\,e_1$; then

\[
(H - \lambda I)^{m-1} v_m \neq 0.
\]
On the other hand, $p(H) = (H - \lambda I)^m q(H)$ implies
\[
(H - \lambda I)^m v_m = (H - \lambda I)^m q(H)\, e_1 = p(H)\, e_1.
\]
But $p(H) = 0$ by the Cayley-Hamilton Theorem, and we conclude that

\[
(H - \lambda I)^m v_m = 0.
\]
Therefore $v_m$ is a generalized eigenvector of order $m$ associated with $\lambda$. Finally, since $(H - \lambda I)^{m-1} q(H)\, e_1 \neq 0$ and $(H - \lambda I)^m q(H)\, e_1 = 0$, let
\[
v_r = (H - \lambda I)^{m-r} q(H)\, e_1;
\]
then $(H - \lambda I)^r v_r = 0$ and $(H - \lambda I)^{r-1} v_r \neq 0$. Thus $H$ has a generalized eigenvector of order $r$ associated with $\lambda$ for each $r = 1, 2, \ldots, m$.
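
To make the construction in the proof concrete, here is a small numpy sketch, not from the handout, that builds $v_r = (H - \lambda I)^{m-r} q(H)\, e_1$ for the matrix of Example 6, where $p(t) = (t-2)^2$, so $\lambda = 2$, $m = 2$ and $q(t) = 1$, and checks the two defining conditions of Definition 3.

\begin{verbatim}
import numpy as np

# Example 6: H has p(t) = (t - 2)^2, so lambda = 2, m = 2 and q(t) = 1.
H = np.array([[1.0, -1.0],
              [1.0,  3.0]])
lam, m = 2.0, 2
N = H - lam * np.eye(2)                  # H - lambda*I
e1 = np.array([1.0, 0.0])

# v_r = (H - lambda*I)^{m-r} q(H) e1, here with q(H) = I.
for r in range(1, m + 1):
    v = np.linalg.matrix_power(N, m - r) @ e1
    order_r_zero = np.allclose(np.linalg.matrix_power(N, r) @ v, 0)
    lower_nonzero = not np.allclose(np.linalg.matrix_power(N, r - 1) @ v, 0)
    print(r, v, order_r_zero, lower_nonzero)
# r = 1 gives the eigenvector [-1, 1]; r = 2 gives e1 = [1, 0], a generalized
# eigenvector of order 2 (compare Example 10 below).
\end{verbatim}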

Theorem 9 An unreduced $n \times n$ Hessenberg matrix has $n$ linearly independent generalized eigenvectors.

Proof. Let $\lambda_1, \lambda_2, \ldots, \lambda_k$ be the eigenvalues of $H$ and let $m_i$ be the algebraic multiplicity of $\lambda_i$. Then $p(t) = (t - \lambda_1)^{m_1}(t - \lambda_2)^{m_2} \cdots (t - \lambda_k)^{m_k}$ is the characteristic polynomial of $H$. By Theorem 8, each $\lambda_i$ has an associated generalized eigenvector $v_i^r$ of order $r$ for each $r = 1, 2, \ldots, m_i$. Denote these by

\[
\{\, v_1^1, \ldots, v_1^{m_1},\; v_2^1, \ldots, v_2^{m_2},\; \ldots,\; v_k^1, \ldots, v_k^{m_k} \,\}
\]
and suppose that

\[
a_1^1v_1^1 + \cdots + a_1^{m_1}v_1^{m_1} + a_2^1v_2^1 + \cdots + a_2^{m_2}v_2^{m_2} + \cdots + a_k^1v_k^1 + \cdots + a_k^{m_k}v_k^{m_k} = 0. \tag{2}
\]

Let $q(t) = (t - \lambda_2)^{m_2} \cdots (t - \lambda_k)^{m_k}$; since the factors of $q(t)$ commute, $q(H)\, v_i^j = 0$ for all $i \ge 2$ and all $j$. Now multiply both sides of (2) by $q(H)$; then

\[
q(H)\bigl[\,a_1^1v_1^1 + \cdots + a_1^{m_1}v_1^{m_1}\,\bigr]
+ \underbrace{q(H)\bigl[\,a_2^1v_2^1 + \cdots + a_2^{m_2}v_2^{m_2} + \cdots + a_k^1v_k^1 + \cdots + a_k^{m_k}v_k^{m_k}\,\bigr]}_{0} = 0
\]
reduces to
\[
a_1^1\, q(H)v_1^1 + \cdots + a_1^{m_1}\, q(H)v_1^{m_1} = 0. \tag{3}
\]
Keeping in mind that $(t - \lambda_1)^p\, q(t) = q(t)\,(t - \lambda_1)^p$, multiply both sides of (3) by $(H - \lambda_1 I)^{m_1-1}$; then

\[
a_1^{m_1}\underbrace{(H - \lambda_1 I)^{m_1-1} q(H)\, v_1^{m_1}}_{\neq\,0} = 0 \tag{4}
\]
implies $a_1^{m_1} = 0$. Now multiply both sides of (3) by $(H - \lambda_1 I)^{m_1-2}$; then
\[
a_1^{m_1-1}\underbrace{(H - \lambda_1 I)^{m_1-2} q(H)\, v_1^{m_1-1}}_{\neq\,0} = 0
\]
implies $a_1^{m_1-1} = 0$, and so on inductively. Do this for each $\lambda_i$ and conclude that all of the coefficients vanish.

Example 10 Continuing Example 6, let's find a generalized eigenvector $v_2$ associated with $\lambda = 2$ such that $\{v_1, v_2\}$ is linearly independent. Solve the system $(H - 2I)x = v_1$:
\[
\left[\begin{array}{cc|c} -1 & -1 & -1 \\ 1 & 1 & 1 \end{array}\right]
\xrightarrow{\text{row reduce}}
\left[\begin{array}{cc|c} 1 & 1 & 1 \\ 0 & 0 & 0 \end{array}\right];
\]
the general solution is
\[
t\begin{bmatrix} -1 \\ 1 \end{bmatrix} + \begin{bmatrix} 1 \\ 0 \end{bmatrix}.
\]
Set $t = 0$ and obtain the particular solution $v_2 = \begin{bmatrix} 1 \\ 0 \end{bmatrix}$. Note that $(H - 2I)v_2 = v_1 \neq 0$ while
\[
(H - 2I)^2 v_2 = (H - 2I)v_1 = 0;
\]
then $v_2$ is a generalized eigenvector of order 2 associated with $\lambda = 2$. Thus we obtain two linearly independent generalized eigenvectors associated with $\lambda = 2$:
\[
v_1 = \begin{bmatrix} -1 \\ 1 \end{bmatrix}, \qquad v_2 = \begin{bmatrix} 1 \\ 0 \end{bmatrix}.
\]

Problem: Let $H$ be a complex $n \times n$ unreduced Hessenberg matrix. Solve the IVP $y' = Hy$ with $y(0) = y_0$.

Steps in the solution:

1. Using Krylov's method, compute the characteristic polynomial of $H$. Then

\[
p(t) = (t - \lambda_1)^{\alpha_1}(t - \lambda_2)^{\alpha_2} \cdots (t - \lambda_k)^{\alpha_k};
\]
the eigenvalue $\lambda_i$ has algebraic multiplicity $\alpha_i$.

2. Find a maximal linearly independent set of eigenvectors $\{u_1, \ldots, u_r\}$. Then each $u_j$ is associated with some $\lambda_i$. Set $y = e^{\lambda_i t}u_j$; then

\[
y' = e^{\lambda_i t}\lambda_i u_j = e^{\lambda_i t}Hu_j = H\bigl(e^{\lambda_i t}u_j\bigr) = Hy,
\]
so that $y$ is a particular solution.

3. Let $m_i$ be the geometric multiplicity of $\lambda_i$. If $m_i < \alpha_i$, find a chain of generalized eigenvectors $\{v_1, v_2, \ldots, v_{q_i}\}$ of $H$, where $q_i = \alpha_i - m_i + 1$, in the following way: set $v_1 = u_1$ and solve the following successive sequence of linear equations, in which $v_r$ is known and $v_{r+1}$ is unknown:

\[
(H - \lambda_i I)v_2 = v_1
\]

\[
(H - \lambda_i I)v_3 = v_2
\]

\[
(H - \lambda_i I)v_4 = v_3
\]
\[
\vdots
\]

\[
(H - \lambda_i I)v_{q_i} = v_{q_i - 1}.
\]

Then
\[
(H - \lambda_i I)^{q_i - 1}v_{q_i} = (H - \lambda_i I)^{q_i - 2}v_{q_i - 1} = \cdots = (H - \lambda_i I)v_2 = v_1. \tag{5}
\]
Since $(H - \lambda_i I)v_1 = 0$, multiplying each expression in (5) by $H - \lambda_i I$ gives

\[
(H - \lambda_i I)^{q_i}v_{q_i} = (H - \lambda_i I)^{q_i - 1}v_{q_i - 1} = \cdots = (H - \lambda_i I)^2v_2 = (H - \lambda_i I)v_1 = 0.
\]

Then $\{v_1, v_2, \ldots, v_{q_i}\}$ is linearly independent by Theorem 9 and $v_r$ is a generalized eigenvector of order $r$ associated with $\lambda_i$ for each $r = 1, 2, \ldots, q_i$. Let

\[
y_r = e^{\lambda_i t}\left( v_r + tv_{r-1} + \frac{t^2}{2!}v_{r-2} + \cdots + \frac{t^{r-1}}{(r-1)!}v_1 \right);
\]
then
\begin{align*}
y_r' &= \lambda_i e^{\lambda_i t}\left( v_r + tv_{r-1} + \frac{t^2}{2!}v_{r-2} + \cdots + \frac{t^{r-1}}{(r-1)!}v_1 \right)
      + e^{\lambda_i t}\left( v_{r-1} + tv_{r-2} + \cdots + \frac{t^{r-2}}{(r-2)!}v_1 \right) \\
     &= e^{\lambda_i t}\left[ (v_{r-1} + \lambda_i v_r) + t(v_{r-2} + \lambda_i v_{r-1}) + \frac{t^2}{2!}(v_{r-3} + \lambda_i v_{r-2}) + \cdots + \frac{t^{r-2}}{(r-2)!}(v_1 + \lambda_i v_2) + \frac{t^{r-1}}{(r-1)!}\lambda_i v_1 \right] \\
     &= e^{\lambda_i t}\left[ Hv_r + tHv_{r-1} + \frac{t^2}{2!}Hv_{r-2} + \cdots + \frac{t^{r-1}}{(r-1)!}Hv_1 \right] = Hy_r,
\end{align*}
so that $y_r$ is a particular solution.

4. Finally, the general solution of the given system of linear differential equations consists of all linear combinations of the solutions found in steps 2 and 3. Given this, one can solve a system of linear equations and find the particular solution that solves the IVP. A short sketch implementing these four steps appears below.
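
The four steps can be prototyped symbolically in a few lines. The following sketch is not part of the handout; it assumes that every eigenvalue of the unreduced Hessenberg matrix has geometric multiplicity one, so a single chain per eigenvalue suffices, and the function names krylov_charpoly and solve_ivp_unreduced_hessenberg are illustrative only. Running it on the data of Example 11 below reproduces the particular solution found there.

\begin{verbatim}
import sympy as sp

def krylov_charpoly(H, t):
    """Step 1 (Krylov's method): write H^n e1 in the basis e1, He1, ..., H^{n-1} e1
    and read off the characteristic polynomial.  Valid when the Krylov vectors are
    linearly independent, which Theorem 1 guarantees for an unreduced Hessenberg H."""
    n = H.rows
    e1 = sp.eye(n)[:, 0]
    cols = [e1]
    for _ in range(n - 1):
        cols.append(H * cols[-1])
    K = sp.Matrix.hstack(*cols)           # [e1 | He1 | ... | H^{n-1} e1]
    c = K.solve(H * cols[-1])             # H^n e1 = c0 e1 + ... + c_{n-1} H^{n-1} e1
    return sp.expand(t**n - sum(c[j] * t**j for j in range(n)))

def solve_ivp_unreduced_hessenberg(H, y0, t):
    """Steps 2-4: build particular solutions e^{lam t}(v_r + t v_{r-1} + ...) from a
    generalized-eigenvector chain for each eigenvalue, then match y(0) = y0.
    Sketch only: assumes each eigenvalue has geometric multiplicity one."""
    n = H.rows
    p = krylov_charpoly(H, t)
    particular = []                                   # the solutions y_r(t)
    for lam, mult in sp.roots(p, t).items():
        N = H - lam * sp.eye(n)
        chain = [N.nullspace()[0]]                    # v1: an eigenvector (step 2)
        for _ in range(mult - 1):                     # step 3: (H - lam I) v_{r+1} = v_r
            sol, params = N.gauss_jordan_solve(chain[-1])
            chain.append(sol.subs({s: 0 for s in params}))
        for r in range(1, len(chain) + 1):            # y_r = e^{lam t} sum t^j/j! v_{r-j}
            y = sum((t**j / sp.factorial(j) * chain[r - 1 - j] for j in range(r)),
                    sp.zeros(n, 1))
            particular.append(sp.exp(lam * t) * y)
    Y = sp.Matrix.hstack(*particular)                 # step 4: match the initial value
    coeffs = Y.subs(t, 0).solve(y0)
    return sp.simplify(Y * coeffs)

t = sp.symbols('t')
H = sp.Matrix([[1, -1], [1, 3]])                      # coefficient matrix of Example 11
y0 = sp.Matrix([5, -7])
print(krylov_charpoly(H, t))                          # t**2 - 4*t + 4
print(solve_ivp_unreduced_hessenberg(H, y0, t))       # matches the answer of Example 11
\end{verbatim}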

Example 11 Solve the system

\begin{align*}
y_1' &= y_1 - y_2 \\
y_2' &= y_1 + 3y_2
\end{align*}
subject to the initial conditions $y_1(0) = 5$, $y_2(0) = -7$. Written in matrix form $y' = Hy$ the IVP becomes

\[
\begin{bmatrix} y_1' \\ y_2' \end{bmatrix}
= \begin{bmatrix} 1 & -1 \\ 1 & 3 \end{bmatrix}
  \begin{bmatrix} y_1 \\ y_2 \end{bmatrix}
\quad\text{with}\quad
\begin{bmatrix} y_1(0) \\ y_2(0) \end{bmatrix}
= \begin{bmatrix} 5 \\ -7 \end{bmatrix}. \tag{6}
\]
In Examples 6 and 10 we found the linearly independent generalized eigenvectors

\[
v_1 = \begin{bmatrix} -1 \\ 1 \end{bmatrix}
\quad\text{and}\quad
v_2 = \begin{bmatrix} 1 \\ 0 \end{bmatrix}
\]

associated with the eigenvalue $\lambda = 2$ of the unreduced Hessenberg coefficient matrix
\[
H = \begin{bmatrix} 1 & -1 \\ 1 & 3 \end{bmatrix}.
\]
Let
\[
y = e^{2t}v_1;
\]
differentiating we have
\[
y' = e^{2t}(2v_1) = e^{2t}Hv_1 = Hy,
\]
so that $y = e^{2t}v_1$ is a particular solution. Now

\[
(H - 2I)v_2 = v_1 \;\Longrightarrow\; Hv_2 = v_1 + 2v_2.
\]

Let
\[
y = e^{2t}(v_2 + tv_1);
\]
differentiating gives

\begin{align*}
y' &= 2e^{2t}(v_2 + tv_1) + e^{2t}v_1 = e^{2t}\bigl[(v_1 + 2v_2) + t(2v_1)\bigr] \\
   &= e^{2t}(Hv_2 + tHv_1) = H e^{2t}(v_2 + tv_1) = Hy,
\end{align*}

so that $y = e^{2t}(v_2 + tv_1)$ is a particular solution. Thus the general solution of system (6) is

\[
y = ae^{2t}v_1 + be^{2t}(v_2 + tv_1), \qquad a, b \in \mathbb{R}.
\]

To find the desired particular solution, solve $av_1 + bv_2 = y(0)$:
\[
\left[\begin{array}{cc|c} -1 & 1 & 5 \\ 1 & 0 & -7 \end{array}\right]
\xrightarrow{\text{row reduce}}
\left[\begin{array}{cc|c} 1 & 0 & -7 \\ 0 & 1 & -2 \end{array}\right].
\]
The solution of the IVP is therefore

\[
y = -7e^{2t}v_1 - 2e^{2t}(v_2 + tv_1).
\]
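
As a quick cross-check, not part of the handout, the closed-form solution can be compared with the matrix-exponential solution $y(t) = e^{tH}y_0$ using scipy's expm:

\begin{verbatim}
import numpy as np
from scipy.linalg import expm

H = np.array([[1.0, -1.0],
              [1.0,  3.0]])
v1 = np.array([-1.0, 1.0])
v2 = np.array([1.0, 0.0])
y0 = np.array([5.0, -7.0])

def y(t):
    """The solution found above: y = -7 e^{2t} v1 - 2 e^{2t} (v2 + t v1)."""
    return -7 * np.exp(2 * t) * v1 - 2 * np.exp(2 * t) * (v2 + t * v1)

# Compare with the matrix-exponential solution y(t) = e^{tH} y0 at a few times.
for t in (0.0, 0.5, 1.3):
    assert np.allclose(y(t), expm(t * H) @ y0)
print("formula agrees with expm(tH) y0")
\end{verbatim}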

11-10-2010
