
MTH 102: Department of Mathematics and Statistics Indian Institute of Technology - Kanpur

Problem Set 5

Problems marked (T) are for discussions in Tutorial sessions.

1. Let S = {e1 + e4, −e1 + 3e2 − e3} ⊂ R^4. Find S^⊥.

Solution: (e1 + e4)^⊥ is the set of all vectors orthogonal to e1 + e4, that is, the set of all x^T = (x1, . . . , x4) such that x1 + x4 = 0. So S^⊥ is the solution space of the homogeneous system

[  1  0  0  1 | 0 ]
[ −1  3 −1  0 | 0 ].

Apply GJE (Gauss-Jordan elimination) to solve it.

Otherwise, apply GS (Gram-Schmidt) to {e1 + e4, −e1 + 3e2 − e3, e1, e2, e3, e4}: the span of the last two vectors of the resulting orthonormal basis is S^⊥.
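As a sanity check, S^⊥ can also be computed numerically as the null space of the matrix whose rows span S. A minimal numpy sketch (variable names are mine):

```python
import numpy as np

# Rows of M are the two spanning vectors of S in R^4.
M = np.array([[ 1.0, 0.0,  0.0, 1.0],   # e1 + e4
              [-1.0, 3.0, -1.0, 0.0]])  # -e1 + 3e2 - e3

# Right singular vectors belonging to zero singular values span N(M) = S-perp.
_, s, Vt = np.linalg.svd(M)
rank = int(np.sum(s > 1e-10))
basis_S_perp = Vt[rank:]  # two orthonormal rows spanning S-perp

print(basis_S_perp.shape)                  # (2, 4)
print(np.allclose(M @ basis_S_perp.T, 0))  # True: orthogonal to both spanning vectors
```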

2. Show that there are infinitely many orthonormal bases of R^2.

Solution: The columns of

[ cos θ  −sin θ ]
[ sin θ   cos θ ],

for 0 ≤ θ < 2π, form orthonormal bases of R^2. The idea is to take {e1, e2} and rotate the set counter-clockwise by an angle θ.
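A quick numerical illustration (numpy sketch; the helper name `rotation` is mine):

```python
import numpy as np

def rotation(theta):
    # Counter-clockwise rotation of R^2 by angle theta.
    return np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])

# Each distinct theta in [0, 2*pi) gives an orthonormal basis:
# the columns Q[:, 0], Q[:, 1] satisfy Q^T Q = I.
for theta in np.linspace(0.0, 2.0 * np.pi, 8, endpoint=False):
    Q = rotation(theta)
    assert np.allclose(Q.T @ Q, np.eye(2))
print("every rotation angle yields an orthonormal basis")
```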

3. (T) What is the projection of v = e1+2e2−3e3 on H := {(x1, x2, x3, x4): x1+2x2+4x4 = 0}?

0  2  4  0 −1 0  Solution: Basis for H:  ,  ,   . 1  0  0    0 0 −1   0  2  4  0 −1 8  Orthonormalize: w =  , w = √1  , w = √1   . 1 1 2 5  0 3 105  0    0 0 −5   0  4  16 0 8 32 The projection is hv, w iw + hv, w iw + hv, w iw =   + 0w + 20   = 1  . 1 1 2 2 3 3 −3 2 105  0 21 −63 0 −5 −20 1 2 Alternately: Let x be the projection. Then v − x is parallel to  , the normal vector of H. 0 4 As ub is the unit vector in the direction of the vector u, we get 1  1 1  16 2 2 2 32 v − x = hv, v − xiv − x = 5  . So x =   − 5   = 1  . \ \ 21 0 −3 21 0 21 −63 4 0 4 −20

4. Let V be a subspace of R^n. Then show that dim V = n − 1 if and only if V = {x : a^T x = 0} for some a ≠ 0.

Solution: Let V = {x : a^T x = 0} = N(A), where A = a^T. Since a ≠ 0, we see that rank(A) = 1 and hence dim V = n − 1 (use dim(N(A)) + dim(col space(A)) = n).

Conversely, suppose that dim V = n − 1. Get an orthonormal basis {u1, . . . , u_{n−1}} of V and extend it to an orthonormal basis {u1, . . . , un} of R^n. Then V = {x : un^T x = 0}.

5. (T) Does there exist a real matrix A for which the row space and column space are the same but the null space and left null space are different?

Solution: Not possible. Use the fundamental theorem of linear algebra, which states that

N(A) = (col space(A^T))^⊥ and N(A^T) = (col space(A))^⊥.

That is, the null space and left null space are the orthogonal complements of the row space and column space, respectively. Since the row space and column space are the same here, their orthogonal complements must also be the same; hence the two null spaces coincide.

6. (T) Consider two real systems, say Ax = b and Cy = d. If the two systems have the same nonempty solution set, then is it necessary that row space(A) = row space(C)?

Solution: Yes. Observe that they have to be systems in the same number of variables, so the two matrices A and C have the same number of columns. If they have a unique solution, then N(A) = {0} = N(C).

If they have infinitely many solutions, then each solution set is a particular solution plus the solution set of the corresponding homogeneous system (Ax = 0, respectively Cy = 0). Since the solution sets coincide, N(A) = N(C). So, by the fundamental theorem of linear algebra, col space(A^T) = col space(C^T); that is, row space(A) = row space(C).

7. Show that the system of equations Ax = b given below

x1 + 2x2 + 2x3 = 5

2x1 + 2x2 + 3x3 = 5

3x1 + 4x2 + 5x3 = 9

has no solution by finding y ∈ N(A^T) such that y^T b ≠ 0.

Solution: Note that if the system has a solution x0 then, we get Ax0 = b. Thus, for any y ∈ N (AT ), we have

y^T b = y^T (Ax0) = (y^T A) x0 = (A^T y)^T x0 = 0^T x0 = 0.   (1)

But it is easy to check that y = (−1, −1, 1)^T is in N(A^T) and y^T b = (−1)(5) + (−1)(5) + (1)(9) = −1 ≠ 0, a contradiction to Equation (1). Thus, the given system has no solution.

8. (T) Suppose A is an n by n real invertible matrix. Describe the subspace of the row space of A which is orthogonal to the first column of A^{−1}.

Solution: Let A[:, j] (respectively, A[i, :]) denote the j-th column (respectively, the i-th row) of A. Then AA^{−1} = I_n implies ⟨A[i, :], A^{−1}[:, 1]⟩ = 0 for 2 ≤ i ≤ n. So, the subspace of the row space of A which is orthogonal to the first column of A^{−1} equals LS(A[2, :], A[3, :], . . . , A[n, :]).
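The no-solution certificate y from Problem 7 can be checked numerically, a numpy sketch:

```python
import numpy as np

A = np.array([[1.0, 2.0, 2.0],
              [2.0, 2.0, 3.0],
              [3.0, 4.0, 5.0]])
b = np.array([5.0, 5.0, 9.0])

y = np.array([-1.0, -1.0, 1.0])

print(np.allclose(A.T @ y, 0))  # True: y is in N(A^T)
print(y @ b)                    # -1.0: y^T b != 0, so Ax = b is inconsistent
```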

9. (T) Let A be an n × n matrix. Then the following statements are equivalent.

(i) A is unitary.

(ii) For any orthonormal basis {u1, . . . , un} of C^n, the set {Au1, . . . , Aun} is also an orthonormal basis.

Solution: (i) ⇒ (ii): Suppose A is unitary. Then ⟨Aui, Auj⟩ = ⟨ui, A^*Auj⟩ = ⟨ui, uj⟩. It follows that {Au1, . . . , Aun} is orthonormal, hence a basis of C^n.

(ii) ⇒ (i): Suppose (ii) is satisfied by A. Consider the standard basis {e1, . . . , en}. By hypothesis, {Ae1, . . . , Aen} is an orthonormal basis; that is, the columns of A form an orthonormal basis, so A^*A = I.

10. Let V be an inner product space and S a nonempty subset of V. Show that

(i) S ⊂ (S^⊥)^⊥.
(ii) If V is finite dimensional and S is a subspace, then (S^⊥)^⊥ = S.
(iii) If S ⊂ T ⊂ V, then S^⊥ ⊃ T^⊥.
(iv) If S is a subspace, then S ∩ S^⊥ = {0}.

Solution: (i) x ∈ S ⇒ ⟨w, x⟩ = 0 for all w ∈ S^⊥ ⇒ x ⊥ S^⊥ ⇒ x ∈ (S^⊥)^⊥.

(ii) If S = {0} or V, we have nothing to show. So let S ≠ {0}, V. Take a basis of S and apply GS to get an orthonormal basis {u1, . . . , uk} of S. Extend it to an orthonormal basis {u1, . . . , uk, w1, . . . , wm} of V. It is easy to show that wi ∈ S^⊥.

Now let x ∈ (S^⊥)^⊥ ⊂ V. Thus x = Σ αi ui + Σ βj wj, for some αi, βj ∈ C. As x ∈ (S^⊥)^⊥, we have ⟨x, y⟩ = 0 for all y ∈ S^⊥. In particular, ⟨x, wj⟩ = 0 for all j. Thus βj = 0 for all j, and x = Σ αi ui ∈ S.

(iii) Obvious.

(iv) Let x ∈ S ∩ S^⊥. Then x ⊥ S. In particular, ⟨x, x⟩ = 0. Thus x = 0.

11. Let A1, . . . , Ak be k real symmetric matrices of order n such that Σ Ai^2 = 0. Show that each Ai = 0.

Solution: For each x ∈ R^n we have

0 = x^T (Σ Ai^2) x = Σ x^T Ai^2 x = Σ x^T Ai^T Ai x = Σ ‖Ai x‖^2.

Hence, Ai x = 0 for each i and for each x. In particular, Ai e1 = 0, Ai e2 = 0, . . . , Ai en = 0 ⇒ Ai = 0.
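The key step above, that x^T Ai^2 x = ‖Ai x‖^2 for symmetric Ai, can be spot-checked numerically on a random symmetric matrix (numpy sketch):

```python
import numpy as np

rng = np.random.default_rng(1)
B = rng.standard_normal((4, 4))
A = (B + B.T) / 2           # a real symmetric matrix
x = rng.standard_normal(4)

# For symmetric A: x^T A^2 x = x^T A^T A x = ||Ax||^2.
lhs = x @ (A @ A) @ x
rhs = np.linalg.norm(A @ x) ** 2
print(np.isclose(lhs, rhs))  # True
```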

12. Let V be a normed linear space and x, y ∈ V. Is it true that ‖x‖ − ‖y‖ ≤ ‖x − y‖?

13. (T) Polar Identity: The following identity holds in an inner product space.

• Complex IPS: 4⟨x, y⟩ = ‖x + y‖^2 − ‖x − y‖^2 + i‖x + iy‖^2 − i‖x − iy‖^2.

• Real IPS: 4⟨x, y⟩ = ‖x + y‖^2 − ‖x − y‖^2.

Solution: We see that

‖x + y‖^2 = ⟨x, x⟩ + ⟨x, y⟩ + ⟨y, x⟩ + ⟨y, y⟩,
‖x − y‖^2 = ⟨x, x⟩ − ⟨x, y⟩ − ⟨y, x⟩ + ⟨y, y⟩,
i‖x + iy‖^2 = i⟨x, x⟩ + i⟨x, iy⟩ + i⟨iy, x⟩ + i⟨iy, iy⟩, and
i‖x − iy‖^2 = i⟨x, x⟩ − i⟨x, iy⟩ − i⟨iy, x⟩ + i⟨iy, iy⟩.

Hence

‖x + y‖^2 − ‖x − y‖^2 + i‖x + iy‖^2 − i‖x − iy‖^2 = 2⟨x, y⟩ + 2⟨y, x⟩ + 2i⟨x, iy⟩ + 2i⟨iy, x⟩ = 2⟨x, y⟩ + 2⟨y, x⟩ − 2i^2⟨x, y⟩ + 2i^2⟨y, x⟩ = 4⟨x, y⟩.
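The complex polar identity can be verified numerically on random complex vectors, assuming (as the derivation above does) an inner product linear in the first slot and conjugate-linear in the second. A numpy sketch with my own helper names:

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.standard_normal(3) + 1j * rng.standard_normal(3)
y = rng.standard_normal(3) + 1j * rng.standard_normal(3)

def inner(u, v):
    # Linear in the first argument, conjugate-linear in the second.
    return np.sum(u * np.conj(v))

def nsq(u):
    # ||u||^2 = <u, u> (real).
    return inner(u, u).real

rhs = nsq(x + y) - nsq(x - y) + 1j * nsq(x + 1j * y) - 1j * nsq(x - 1j * y)
print(np.isclose(4 * inner(x, y), rhs))  # True
```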

14. (Just for knowledge; will NOT be asked.) Let ‖·‖ be a norm on V. Then ‖·‖ is induced by some inner product if and only if ‖·‖ satisfies the parallelogram law:

‖x + y‖^2 + ‖x − y‖^2 = 2‖x‖^2 + 2‖y‖^2.

Solution: See the appendix in my notes.

15. Show that an orthonormal set in an inner product space is linearly independent.

Solution: Let S be an orthonormal set and suppose that Σ_{i=1}^{n} αi xi = 0 for some xi ∈ S. Then αi = ⟨Σ_{j=1}^{n} αj xj, xi⟩ = ⟨0, xi⟩ = 0 for each i. Thus, S is linearly independent.

16. Let A be unitarily equivalent to B (that is, A = U^*BU for some unitary matrix U). Then Σ_{ij} |aij|^2 = Σ_{ij} |bij|^2.

Solution: We have

Σ_{ij} |aij|^2 = tr(A^*A) = tr(U^*B^*UU^*BU) = tr(U^*B^*BU) = tr(B^*BUU^*) = tr(B^*B) = Σ_{ij} |bij|^2.
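This invariance of the Frobenius norm under unitary equivalence is easy to check numerically; a numpy sketch that builds a random unitary U via QR factorization:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 4
B = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))

# QR of a random complex matrix gives a unitary factor U.
U, _ = np.linalg.qr(rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n)))
A = U.conj().T @ B @ U  # A = U* B U

# Unitarily equivalent matrices have equal Frobenius norms.
print(np.isclose(np.linalg.norm(A, 'fro'), np.linalg.norm(B, 'fro')))  # True
```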

17. For the following questions, find a matrix P that projects b onto the column space of A, that is, Pb ∈ col(A) and b − Pb is orthogonal to col(A).

(i) A =
[ 1 0 0 ]
[ 0 1 0 ]
[ 0 0 1 ]
[ 0 0 0 ],  b = (1, 2, 3, 4)^T.

(ii) A =
[ 1 −2 4 ]
[ 1 −1 1 ]
[ 1  1 1 ]
[ 1  2 4 ],  b = (1, 1, 1, 0)^T.

Solution: Note that an orthonormal basis of col(A) is given by {e1, e2, e3} ⊂ R^4. Hence, the projection matrix equals

P = e1 e1^T + e2 e2^T + e3 e3^T =
[ 1 0 0 0 ]
[ 0 1 0 0 ]
[ 0 0 1 0 ]
[ 0 0 0 0 ].

For the second question, we know that w1 = (1/2)(1, 1, 1, 1)^T, w2 = (1/√10)(−2, −1, 1, 2)^T and w3 = (1/2)(1, −1, −1, 1)^T form an orthonormal basis of col(A). Thus, the projection matrix equals

 9 2 −2 1  T T T 1  2 6 4 −2 P = w1w + w2w + w3w =   . 1 2 3 10 −2 4 6 2  1 −2 2 9

Alternate:

1 0 0  1 0 0−1 1 0 0 0 1 0 0 0 0 1 0 0 1 0 P = A(AT A)−1AT =    0 1 0 0   0 1 0 0 0 0 1   0 0 1      0 0 1 0   0 0 1 0 0 0 0 0 0 0

1 0 0 −1 1 0 0 0 1 0 0 1 0 0 0 0 1 0 0 1 0 0 =   0 1 0 0 1 0 0 =   . 0 0 1     0 0 1 0   0 0 1 0 0 1 0   0 0 0 0 0 0 0

Similarly, it can be verified that

 9 2 −2 1  T −1 T 1  2 6 4 −2 P = A(A A) A =   . 10 −2 4 6 2  1 −2 2 9

18. We are looking for the parabola y = c + dt + et2 that gives the least squares fit to these four measurements: y = 1 at t = −2, y = 1 at t = −1, y = 1 at t = 1 and y = 0 at t = 2.

(a) Write down the four equations (Ax = b) for the parabola c + dt + et^2 to go through the given four points. Prove that Ax = b has no solution.

Solution: Verify:

A =
[ 1 −2 4 ]
[ 1 −1 1 ]
[ 1  1 1 ]
[ 1  2 4 ],  b = (1, 1, 1, 0)^T, and RREF([A b]) =
[ 1 0 0 0 ]
[ 0 1 0 0 ]
[ 0 0 1 0 ]
[ 0 0 0 1 ].

The pivot in the last column of the RREF shows that Ax = b is inconsistent.

(b) For finding a least squares fit of Ax = b, i.e., of A(c, d, e)^T = b, what equations would you solve?

Solution: Let y = Ax − b be the error vector. Then the sum of squared errors equals

f(x1, x2, x3) = y^T y = (Ax − b)^T (Ax − b) = x^T A^T A x − 2x^T A^T b + b^T b.

Thus, differentiating with respect to x1, x2 and x3, we get

(∂f/∂x1, ∂f/∂x2, ∂f/∂x3)^T = 2A^T A x − 2A^T b.

Now, equating it to zero gives A^T A x = A^T b. Thus, we want to solve A^T A (c, d, e)^T = A^T b.

(c) Compute A^T A. Compute its determinant. Compute its inverse.

Solution:

A^T A =
[  4  0 10 ]
[  0 10  0 ]
[ 10  0 34 ],

det(A^T A) = 4 × 10 × 34 − 10 × 10 × 10 = 360, and (A^T A)^{−1} = (1/det(A^T A)) C^T, where C (the cofactor matrix, symmetric in this case) is given by:

C =
[  340  0 −100 ]
[    0 36    0 ]
[ −100  0   40 ].

(d) Now, determine the parabola y = c + dt + et^2 that gives the least squares fit.

Solution: Using the previous two parts, we see that

(c, d, e)^T = (A^T A)^{−1} A^T b = (1/30)(35, −6, −5)^T.
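Parts (b)-(d) can be checked with a short numerical sketch that solves the normal equations directly (numpy; variable names are mine):

```python
import numpy as np

A = np.array([[1.0, -2.0, 4.0],
              [1.0, -1.0, 1.0],
              [1.0,  1.0, 1.0],
              [1.0,  2.0, 4.0]])
b = np.array([1.0, 1.0, 1.0, 0.0])

# Least squares fit: solve the normal equations A^T A x = A^T b.
coef = np.linalg.solve(A.T @ A, A.T @ b)

print(np.allclose(30 * coef, [35, -6, -5]))  # True: (c, d, e) = (1/30)(35, -6, -5)
```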

(e) The first two columns of A are already orthogonal. From column 3, subtract its projection onto the span of the first two columns to get the third orthogonal vector v. Normalize v to find the third orthonormal vector w3 from Gram-Schmidt.

Solution: Since the third and second columns are already orthogonal, it suffices to subtract from the third column its projection onto the first column:

v^T = (4, 1, 1, 4) − (5/2)(1, 1, 1, 1) = (3/2, −3/2, −3/2, 3/2).

To find w3, just divide v by its length, 3. So,

w3 = (1/2, −1/2, −1/2, 1/2).

c (f) Now compute x = A d to verify that x is indeed the projection vector onto the column e space of the matrix A. Solution: Verify that for the value of P computed in the previous problem which corresponds to w3 in the previous part, we have 1 xT = 9 12 8 1 = (P b)T . 10