1. Review: Eigenvalue Problem for a Symmetric Matrix

1.1. Inner products

For vectors $u, v \in \mathbb{R}^n$ we define the inner product by
$$\langle u, v \rangle := \sum_{i=1}^{n} u_i v_i$$
and the norm by $\|u\| := \langle u, u \rangle^{1/2}$. We call two vectors $u, v$ orthogonal if $\langle u, v \rangle = 0$.

Recall the Cauchy-Schwarz inequality: for any $u, v \in \mathbb{R}^n$
$$|\langle u, v \rangle| \le \|u\| \, \|v\|. \tag{1}$$

For a matrix $A \in \mathbb{R}^{n \times n}$ we have
$$\langle Au, v \rangle = \langle u, A^\top v \rangle,$$
where $A^\top$ denotes the transpose matrix. If $A = A^\top$ we call the matrix symmetric.

Lemma 1.1. For a symmetric $A \in \mathbb{R}^{n \times n}$ we have
$$\max_{\|u\|=1} |\langle Au, u \rangle| = \max_{\|u\|=1} \|Au\|. \tag{2}$$

Proof. Let $M_1$ denote the value of the maximum on the left and $M_2$ the value of the maximum on the right. By Cauchy-Schwarz (1) we have for any $w$
$$|\langle Aw, w \rangle| \le \|Aw\| \, \|w\| \le M_2 \|w\|^2, \tag{3}$$
hence $M_1 \le M_2$.

Next we show $M_2 \le M_1$: for an arbitrary $u$ with $\|u\| = 1$ we let $\lambda := \|Au\|$, $f := \lambda u$, $g := Au$ and consider
$$X := \langle Af, g \rangle + \langle Ag, f \rangle = \lambda \langle Au, Au \rangle + \lambda \langle A^2 u, u \rangle = 2\lambda \|Au\|^2 = 2\lambda^3,$$
using $\langle A^2 u, u \rangle = \langle Au, Au \rangle$ by symmetry. On the other hand, multiplying out and using $|\langle Aw, w \rangle| \le M_1 \|w\|^2$ we see that
$$X = \tfrac{1}{2} \langle A(f+g), f+g \rangle - \tfrac{1}{2} \langle A(f-g), f-g \rangle \le \tfrac{M_1}{2} \langle f+g, f+g \rangle + \tfrac{M_1}{2} \langle f-g, f-g \rangle = M_1 \big( \|f\|^2 + \|g\|^2 \big) = M_1 \cdot 2\lambda^2,$$
hence $X = 2\lambda^3 \le M_1 \cdot 2\lambda^2$. Therefore $\lambda = \|Au\| \le M_1$, and hence $M_2 \le M_1$ (since $u$ with $\|u\| = 1$ was arbitrary). ∎

Now consider an invariant subspace $V$ of $A$; this means that $u \in V$ implies $Au \in V$. Then we obtain by the same argument
$$\max_{u \in V,\ \|u\|=1} |\langle Au, u \rangle| = \max_{u \in V,\ \|u\|=1} \|Au\|. \tag{4}$$

1.2. Eigenvalue problem

Let $A$ be a real $n \times n$ matrix. A number $\lambda \in \mathbb{C}$ is called an eigenvalue of the matrix if there exists a nonzero vector $v \in \mathbb{C}^n$ such that
$$Av = \lambda v. \tag{5}$$

All eigenvalues are real: multiplying (5) from the left by $\bar{v}^\top$ (the bar denotes the complex conjugate) gives $\bar{v}^\top A v = \lambda \bar{v}^\top v$. Taking the complex conjugate of (5) gives $A\bar{v} = \bar{\lambda}\bar{v}$, so since $A = A^\top$ we have $\bar{v}^\top A v = (A\bar{v})^\top v = \bar{\lambda}\, \bar{v}^\top v$, yielding $\lambda = \bar{\lambda}$.
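Lemma 1.1 is easy to check numerically. Below is a minimal sketch, assuming NumPy is available: for a symmetric matrix both maxima in (2) equal the largest $|\lambda_i|$, so we compare sampled values of $|\langle Au, u \rangle|$ and $\|Au\|$ against that exact value (the variable names are ours, purely illustrative).

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
B = rng.standard_normal((n, n))
A = (B + B.T) / 2  # symmetrize so that A = A^T

# For symmetric A, both maxima in (2) equal max_i |lambda_i|.
eigvals = np.linalg.eigvalsh(A)
M_exact = np.max(np.abs(eigvals))

# Sample many random unit vectors u.
U = rng.standard_normal((100_000, n))
U /= np.linalg.norm(U, axis=1, keepdims=True)

AU = U @ A.T                                        # row i is (A u_i)^T
M1 = np.max(np.abs(np.einsum('ij,ij->i', AU, U)))   # sampled max |<Au,u>|
M2 = np.max(np.linalg.norm(AU, axis=1))             # sampled max ||Au||

print(M1, M2, M_exact)  # M1 <= M2 <= M_exact, all close for many samples
```

The ordering `M1 <= M2 <= M_exact` reflects inequality (3) and the fact that the true maxima are attained only at eigenvectors of the largest $|\lambda_i|$.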
Eigenvectors for different eigenvalues are orthogonal:
$$\lambda_j \langle v^{(j)}, v^{(k)} \rangle = \langle A v^{(j)}, v^{(k)} \rangle = \langle v^{(j)}, A v^{(k)} \rangle = \lambda_k \langle v^{(j)}, v^{(k)} \rangle,$$
hence $(\lambda_j - \lambda_k) \langle v^{(j)}, v^{(k)} \rangle = 0$.

The central result is that we have a full set of eigenvectors:

Theorem 1.2. Assume that $A \in \mathbb{R}^{n \times n}$ is symmetric. Then

1. there are $n$ real eigenvalues $\lambda_1, \dots, \lambda_n$;
2. there are corresponding real eigenvectors $v^{(1)}, \dots, v^{(n)}$ which are orthogonal to each other;
3. any vector $u \in \mathbb{R}^n$ can be written as a linear combination of eigenvectors:
$$u = \sum_{j=1}^{n} c_j v^{(j)}, \qquad c_j = \frac{\langle u, v^{(j)} \rangle}{\langle v^{(j)}, v^{(j)} \rangle}.$$

The key idea of the proof is to maximize $|\langle Au, u \rangle|$ over all $u$ with $\|u\| = 1$. The solution of this problem yields an eigenvector:

Lemma 1.3. Assume that $A \in \mathbb{R}^{n \times n}$ is symmetric and $V$ is an invariant subspace, i.e., $v \in V$ implies $Av \in V$.

1. There exists $v \in V$ with $\|v\| = 1$ such that $|\langle Av, v \rangle|$ is maximal:
$$|\langle Av, v \rangle| = M = \max_{u \in V,\ \|u\|=1} |\langle Au, u \rangle|.$$
2. This $v$ is in fact an eigenvector: there is $\lambda \in \mathbb{R}$ with
$$Av = \lambda v, \qquad |\lambda| = M.$$

Proof. Here we are maximizing a continuous function over a compact set, so there exists such a $v$ with $\|v\| = 1$. Let $\lambda := \langle Av, v \rangle$. By (4) we have $\|Av\| \le |\lambda|$ and hence
$$\|Av - \lambda v\|^2 = \|Av\|^2 - 2\lambda \langle Av, v \rangle + \lambda^2 \|v\|^2 \le \lambda^2 - 2\lambda^2 + \lambda^2 = 0. \qquad ∎$$

Now we can give the proof of Theorem 1.2:

(1) First maximize $|\langle Au, u \rangle|$ over all $u \in \mathbb{R}^n$ with $\|u\| = 1$. By Lemma 1.3 this yields a vector $v^{(1)}$ and an eigenvalue $\lambda_1$. Define the space $V_1$ as all vectors which are orthogonal to $v^{(1)}$:
$$V_1 := \{ u \in \mathbb{R}^n \mid \langle u, v^{(1)} \rangle = 0 \}.$$
This is an invariant subspace: if $u \in V_1$, then $Au$ satisfies
$$\langle Au, v^{(1)} \rangle = \langle u, A v^{(1)} \rangle = \lambda_1 \langle u, v^{(1)} \rangle = 0,$$
hence $Au \in V_1$.

(2) Now maximize $|\langle Au, u \rangle|$ over all $u \in V_1$ with $\|u\| = 1$. By Lemma 1.3 this yields a vector $v^{(2)}$ and an eigenvalue $\lambda_2$. Define the space $V_2$ as all vectors which are orthogonal to $v^{(1)}$ and $v^{(2)}$:
$$V_2 := \{ u \in \mathbb{R}^n \mid \langle u, v^{(j)} \rangle = 0 \text{ for } j = 1, 2 \}.$$
This is an invariant subspace: if $u \in V_2$, then $Au$ satisfies for $j = 1, 2$
$$\langle Au, v^{(j)} \rangle = \langle u, A v^{(j)} \rangle = \lambda_j \langle u, v^{(j)} \rangle = 0,$$
hence $Au \in V_2$.
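Theorem 1.2 is what `numpy.linalg.eigh` computes for a symmetric matrix. As a hedged sketch of part 3 (the names `lam`, `V`, `c` are ours): expand a vector in the eigenvector basis via the coefficient formula and reassemble it.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
B = rng.standard_normal((n, n))
A = (B + B.T) / 2               # a symmetric matrix

lam, V = np.linalg.eigh(A)      # columns of V: orthonormal eigenvectors v^(j)
u = rng.standard_normal(n)

# c_j = <u, v^(j)> / <v^(j), v^(j)>; the denominators are 1 here
# because eigh returns unit-norm eigenvectors.
c = V.T @ u
u_rebuilt = V @ c               # sum_j c_j v^(j)

print(np.allclose(u, u_rebuilt))  # True: u is recovered from its expansion
```

The orthogonality of the $v^{(j)}$ is exactly why the coefficients decouple: taking the inner product of the expansion with $v^{(k)}$ kills every term except the $k$-th.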
By continuing in the same way we obtain eigenvalues $\lambda_1, \dots, \lambda_n$ with $|\lambda_1| \ge |\lambda_2| \ge \cdots \ge |\lambda_n|$ and eigenvectors $v^{(1)}, \dots, v^{(n)}$ which are by construction orthogonal to each other.

2. Review: Linear Systems of ODEs

2.1. Initial value problem

We consider a matrix of size $n \times n$ where the entries can depend on $t$: let $A(t)$ be an $n \times n$ matrix with entries $a_{ij}(t)$ for $i, j = 1, \dots, n$. We are given an initial vector $\vec{u}^{(0)}$ with entries $u_i^{(0)}$, $i = 1, \dots, n$, and a right-hand side function $\vec{f}(t)$ with entries $f_i(t)$, $i = 1, \dots, n$. We want to find a function $\vec{u}(t)$ with entries $u_i(t)$, $i = 1, \dots, n$, such that
$$\vec{u}'(t) + A(t)\vec{u}(t) = \vec{f}(t) \quad \text{for } t \ge 0 \tag{6}$$
$$\vec{u}(0) = \vec{u}^{(0)} \tag{7}$$
If the function $\vec{f}(t)$ is zero we call the problem homogeneous. If the functions $a_{ij}(t)$ and $f_i(t)$ are continuous for $t \ge 0$, then one can show that the initial value problem (6), (7) has a unique solution for all $t \ge 0$.

2.2. Homogeneous ODE with constant coefficients, symmetric matrix

We now consider a special case of (6), (7):

• $\vec{f}(t) = \vec{0}$ (homogeneous ODE)
• the matrix $A$ is constant
• the matrix $A$ is symmetric: $A = A^\top$.

Then we can solve the IVP in the following way:

1. Find special solutions of the form $u_i(t) = v_i g(t)$: we want to find solutions of the form $\vec{u}(t) = \vec{v} g(t)$ with a nonzero vector $\vec{v}$ and a function $g(t)$. Plugging this into the ODE gives
$$\vec{v} g'(t) + A \vec{v} g(t) = \vec{0} \iff A \vec{v} = \frac{-g'(t)}{g(t)} \vec{v}$$
(if we assume $g(t) \ne 0$). Since the left-hand side does not depend on $t$ and $\vec{v}$ is not the zero vector, this equation can only hold if $-g'(t)/g(t)$ is equal to a constant $\lambda$.

Therefore we have to solve an eigenvalue problem: find $\lambda$ and a nonzero vector $\vec{v}$ such that
$$A \vec{v} = \lambda \vec{v}.$$
Solving this eigenvalue problem gives real eigenvalues $\lambda_j$ and eigenvectors $\vec{v}^{(j)}$ for $j = 1, \dots, n$ which are orthogonal to each other:
$$\langle \vec{v}^{(j)}, \vec{v}^{(k)} \rangle = 0 \quad \text{if } j \ne k. \tag{8}$$
For each eigenvector $\vec{v}^{(j)}$ we now have a special solution of the ODE: since $-g'(t)/g(t) = \lambda_j$ we have $g(t) = c_j e^{-\lambda_j t}$, and our special solution is
$$c_j \vec{v}^{(j)} e^{-\lambda_j t} \quad \text{for } j = 1, \dots, n.$$
2. Write the solution $\vec{u}(t)$ of the initial value problem as a linear combination of the special solutions: we need to find coefficients $c_1, \dots, c_n$ such that
$$\vec{u}(t) = \sum_{j=1}^{n} c_j \vec{v}^{(j)} e^{-\lambda_j t}. \tag{9}$$
We find the coefficients by setting $t = 0$: the initial condition gives
$$\vec{u}^{(0)} = \sum_{j=1}^{n} c_j \vec{v}^{(j)}. \tag{10}$$
We now take the inner product of this equation with the vector $\vec{v}^{(k)}$: the orthogonality (8) gives $\langle \vec{u}^{(0)}, \vec{v}^{(k)} \rangle = c_k \langle \vec{v}^{(k)}, \vec{v}^{(k)} \rangle$, or
$$c_k = \frac{\langle \vec{u}^{(0)}, \vec{v}^{(k)} \rangle}{\langle \vec{v}^{(k)}, \vec{v}^{(k)} \rangle}. \tag{11}$$

2.3. Fundamental system S(t)

Let $\vec{u}^{(1)}(t)$ denote the solution of the homogeneous ODE with initial condition $\vec{u}^{(1)}(0) = [1, 0, \dots, 0]^\top$, ..., and let $\vec{u}^{(n)}(t)$ denote the solution with initial condition $\vec{u}^{(n)}(0) = [0, \dots, 0, 1]^\top$. We define the $n \times n$ matrix
$$S(t) := \big[ \vec{u}^{(1)}(t), \dots, \vec{u}^{(n)}(t) \big].$$
Now the solution of the homogeneous ODE with initial condition $\vec{u}(0) = \vec{u}^{(0)}$ is obtained as a linear combination of $\vec{u}^{(1)}(t), \dots, \vec{u}^{(n)}(t)$: we have $\vec{u}(t) = \vec{u}^{(1)}(t) \cdot u_1^{(0)} + \cdots + \vec{u}^{(n)}(t) \cdot u_n^{(0)}$, i.e.,
$$\vec{u}(t) = S(t)\, \vec{u}^{(0)}.$$

2.4. Duhamel's principle

2.4.1. Review: derivative of an integral

We consider a function $F(t)$ defined as an integral
$$F(t) = \int_{a(t)}^{b(t)} f(s, t)\, ds.$$
Then the fundamental theorem of calculus and the chain rule imply that for the derivative we have
$$F'(t) = \int_{a(t)}^{b(t)} f_t(s, t)\, ds + f(b(t), t) \cdot b'(t) - f(a(t), t) \cdot a'(t). \tag{12}$$

2.4.2. General case

Duhamel's principle states: if we can solve the homogeneous initial value problem
$$\vec{u}'(t) + A(t)\vec{u}(t) = \vec{0} \quad \text{for } t \ge s \tag{13}$$
$$\vec{u}(s) = \vec{u}^{(0)} \tag{14}$$
(here the initial condition is given at $s$ instead of $0$), then we can obtain a formula for the solution of the inhomogeneous initial value problem (6), (7) as follows: 1.
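The two-step recipe of Section 2.2 and the fundamental matrix of Section 2.3 can be checked numerically. Below is a sketch assuming NumPy (the helper names `u_of_t` and `S_of_t` are ours): since $A$ is constant and symmetric, $S(t) = V \operatorname{diag}(e^{-\lambda_j t}) V^\top$ with the orthonormal eigenvectors as columns of $V$.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 3
B = rng.standard_normal((n, n))
A = (B + B.T) / 2                  # constant symmetric matrix
u0 = rng.standard_normal(n)        # initial vector u^(0)

lam, V = np.linalg.eigh(A)         # A v^(j) = lambda_j v^(j), unit eigenvectors
c = V.T @ u0                       # coefficients c_k from (11)

def u_of_t(t):
    # the expansion (9): u(t) = sum_j c_j v^(j) exp(-lambda_j t)
    return V @ (c * np.exp(-lam * t))

def S_of_t(t):
    # fundamental matrix: S(t) = V diag(exp(-lambda_j t)) V^T
    return V @ np.diag(np.exp(-lam * t)) @ V.T

t = 0.7
print(np.allclose(u_of_t(t), S_of_t(t) @ u0))  # the two formulas agree

# verify the ODE u'(t) + A u(t) = 0 with a centered difference quotient
h = 1e-6
du = (u_of_t(t + h) - u_of_t(t - h)) / (2 * h)
print(np.linalg.norm(du + A @ u_of_t(t)))      # small residual: the ODE holds
```

Note the sign convention: because the ODE is written as $\vec{u}' + A\vec{u} = \vec{0}$, the modes decay like $e^{-\lambda_j t}$, not $e^{+\lambda_j t}$.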
