
(III.B) Linear transformations in terms of matrices

We have presented linear transformations in §III.A independently of matrices, to emphasize the fact that, like vectors, they are intrinsic objects that exist independently of any basis. However, just as it is useful to write vectors $\vec{v} \in V$ in terms of their coordinates with respect to a given basis $\mathcal{B} = \{\vec{v}_1, \ldots, \vec{v}_n\}$ for $V$, viz.
$$[\vec{v}]_{\mathcal{B}} = \begin{pmatrix} b_1 \\ \vdots \\ b_n \end{pmatrix} \quad \text{if} \quad \vec{v} = b_1 \vec{v}_1 + \cdots + b_n \vec{v}_n ,$$
it is also computationally convenient to summarize transformations in terms of their matrix of coefficients with respect to two given bases. Namely, given $T : V \to W$ and bases $\mathcal{B} = \{\vec{v}_1, \ldots, \vec{v}_n\}$, $\mathcal{C} = \{\vec{w}_1, \ldots, \vec{w}_m\}$ for $V$ and $W$ respectively, we write
$${}_{\mathcal{C}}[T]_{\mathcal{B}} = \begin{pmatrix} b_{11} & \cdots & b_{1n} \\ \vdots & \ddots & \vdots \\ b_{m1} & \cdots & b_{mn} \end{pmatrix} \quad \text{if} \quad T\vec{v}_j = \sum_{i=1}^{m} b_{ij}\,\vec{w}_i \quad (j = 1, \ldots, n).$$
Since
$$[T\vec{v}_j]_{\mathcal{C}} = \begin{pmatrix} b_{1j} \\ \vdots \\ b_{mj} \end{pmatrix},$$
we can rewrite this matrix as
$${}_{\mathcal{C}}[T]_{\mathcal{B}} = \begin{pmatrix} \uparrow & & \uparrow \\ [T\vec{v}_1]_{\mathcal{C}} & \cdots & [T\vec{v}_n]_{\mathcal{C}} \\ \downarrow & & \downarrow \end{pmatrix}.$$
In this form, clearly
$${}_{\mathcal{C}}[T]_{\mathcal{B}}\,[\vec{v}_j]_{\mathcal{B}} = \begin{pmatrix} \uparrow & & \uparrow \\ [T\vec{v}_1]_{\mathcal{C}} & \cdots & [T\vec{v}_n]_{\mathcal{C}} \\ \downarrow & & \downarrow \end{pmatrix} \begin{pmatrix} 0 \\ \vdots \\ 1 \\ \vdots \\ 0 \end{pmatrix} \!\!\leftarrow j^{\text{th}}\text{ place} \quad = \; [T\vec{v}_j]_{\mathcal{C}},$$
and so, by linear extension to any $\vec{v} \in V$,
$${}_{\mathcal{C}}[T]_{\mathcal{B}}\,[\vec{v}]_{\mathcal{B}} = [T\vec{v}]_{\mathcal{C}}.$$
An important special case is that of an endomorphism $T : V \to V$ with $\mathcal{B} = \mathcal{C}$. We will frequently write $[T]_{\mathcal{B}}$ instead of ${}_{\mathcal{B}}[T]_{\mathcal{B}}$ in this case.

EXAMPLE 1. $T = \frac{d}{dx}$ on $V = P_3 = \{\text{polynomials of degree} \le 3\}$, that is, $\frac{d}{dx} : P_3 \to P_3$. It sends $1 \mapsto 0$, $x \mapsto 1$, $x^2 \mapsto 2x$, $x^3 \mapsto 3x^2$, so in terms of $\mathcal{B} = \{1, x, x^2, x^3\}$ we have
$$\left[\tfrac{d}{dx}\right]_{\mathcal{B}} = \begin{pmatrix} \left[\tfrac{d}{dx}1\right]_{\mathcal{B}} & \left[\tfrac{d}{dx}x\right]_{\mathcal{B}} & \left[\tfrac{d}{dx}x^2\right]_{\mathcal{B}} & \left[\tfrac{d}{dx}x^3\right]_{\mathcal{B}} \end{pmatrix} = \begin{pmatrix} 0 & 1 & 0 & 0 \\ 0 & 0 & 2 & 0 \\ 0 & 0 & 0 & 3 \\ 0 & 0 & 0 & 0 \end{pmatrix}.$$

\textbf{Change of basis for transformations.} Recall from §II.E that given two bases $\mathcal{A} = \{\vec{u}_1, \ldots, \vec{u}_\ell\}$, ${}'\mathcal{A} = \{{}'\vec{u}_1, \ldots, {}'\vec{u}_\ell\}$ for a vector space $U$, we have
$$P_{\mathcal{A} \to {}'\mathcal{A}}\,[\vec{u}]_{\mathcal{A}} = [\vec{u}]_{{}'\mathcal{A}}$$
for all $\vec{u} \in U$. To apply this to transformations, first consider a change $\mathcal{C} \to {}'\mathcal{C}$: what is the new ${}_{{}'\mathcal{C}}[T]_{\mathcal{B}}$, in terms of ${}_{\mathcal{C}}[T]_{\mathcal{B}}$? Since
$${}_{\mathcal{C}}[T]_{\mathcal{B}}\,[\vec{v}]_{\mathcal{B}} = [T\vec{v}]_{\mathcal{C}} = P_{{}'\mathcal{C} \to \mathcal{C}} \cdot [T\vec{v}]_{{}'\mathcal{C}},$$
and $(P_{{}'\mathcal{C} \to \mathcal{C}})^{-1} = P_{\mathcal{C} \to {}'\mathcal{C}}$, we get
$$\left(P_{\mathcal{C} \to {}'\mathcal{C}} \cdot {}_{\mathcal{C}}[T]_{\mathcal{B}}\right)[\vec{v}]_{\mathcal{B}} = \left(P_{{}'\mathcal{C} \to \mathcal{C}}\right)^{-1}\,{}_{\mathcal{C}}[T]_{\mathcal{B}}\,[\vec{v}]_{\mathcal{B}} = [T\vec{v}]_{{}'\mathcal{C}}.$$
So taking ${}_{{}'\mathcal{C}}[T]_{\mathcal{B}}$ to be $P_{\mathcal{C} \to {}'\mathcal{C}} \cdot {}_{\mathcal{C}}[T]_{\mathcal{B}}$ gives ${}_{{}'\mathcal{C}}[T]_{\mathcal{B}}\,[\vec{v}]_{\mathcal{B}} = [T\vec{v}]_{{}'\mathcal{C}}$, which is what we want. On the other hand, for $\mathcal{B} \to {}'\mathcal{B}$, writing
$$[T\vec{v}]_{\mathcal{C}} = {}_{\mathcal{C}}[T]_{\mathcal{B}}\,[\vec{v}]_{\mathcal{B}} = {}_{\mathcal{C}}[T]_{\mathcal{B}}\left(P_{{}'\mathcal{B} \to \mathcal{B}} \cdot [\vec{v}]_{{}'\mathcal{B}}\right),$$
we see that setting ${}_{\mathcal{C}}[T]_{{}'\mathcal{B}} = {}_{\mathcal{C}}[T]_{\mathcal{B}}\,P_{{}'\mathcal{B} \to \mathcal{B}}$ gives ${}_{\mathcal{C}}[T]_{{}'\mathcal{B}}\,[\vec{v}]_{{}'\mathcal{B}} = [T\vec{v}]_{\mathcal{C}}$. So for $\mathcal{B} \to {}'\mathcal{B}$ and $\mathcal{C} \to {}'\mathcal{C}$,
$${}_{{}'\mathcal{C}}[T]_{{}'\mathcal{B}} = \left(P_{\mathcal{C} \to {}'\mathcal{C}}\right) \cdot {}_{\mathcal{C}}[T]_{\mathcal{B}} \cdot \left(P_{{}'\mathcal{B} \to \mathcal{B}}\right),$$
where e.g. $P_{\mathcal{C} \to {}'\mathcal{C}}$ is the matrix whose columns are the vectors of $\mathcal{C}$ written in terms of the ${}'\mathcal{C}$-basis.

For $T : V \to V$ with $\mathcal{C} = \mathcal{B}$ (coefficients with respect to one basis), changing $\mathcal{B} \to {}'\mathcal{B}$ means changing "on both sides":
$${}_{{}'\mathcal{B}}[T]_{{}'\mathcal{B}} = \left(P_{\mathcal{B} \to {}'\mathcal{B}}\right) \cdot {}_{\mathcal{B}}[T]_{\mathcal{B}} \cdot \left(P_{{}'\mathcal{B} \to \mathcal{B}}\right).$$
In words: "if we know $[T]_{\mathcal{B}}$, then we can transform $[\vec{v}]_{{}'\mathcal{B}}$ to $[\vec{v}]_{\mathcal{B}}$, apply $[T]_{\mathcal{B}}$ to get $[T\vec{v}]_{\mathcal{B}}$, and transform the result back to the ${}'\mathcal{B}$-basis to get $[T\vec{v}]_{{}'\mathcal{B}}$." What we'll usually know is $P = P_{{}'\mathcal{B} \to \mathcal{B}}$, i.e. the coordinates of the vectors of a new basis ${}'\mathcal{B}$ in terms of the old basis $\mathcal{B}$ (in terms of which we also have $[T]$). (For instance, if $\mathcal{B}$ is the standard basis of $V = \mathbb{R}^n$, then $P = P_{{}'\mathcal{B}}$ in the notation of §II.E.) In brief, then,
$$[T]_{{}'\mathcal{B}} = P^{-1}\,[T]_{\mathcal{B}}\,P.$$
The difficulty is always in getting the $P$'s the right way around!

DEFINITION 2. Two $m \times m$ matrices $M_1$ and $M_2$ are said to be similar iff they are related by an invertible $S$ via
$$M_1 = S^{-1} M_2 S.$$
It's natural to think of similar matrices as "representing the same endomorphism with respect to different bases".
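If you like to check this kind of bookkeeping on a computer, here is a minimal Python/NumPy sketch (mine, not part of the notes; the matrix $T$, the bases, and the sample vector below are made up purely for illustration). It forms ${}_{{}'\mathcal{C}}[T]_{{}'\mathcal{B}} = P_{\mathcal{C}\to{}'\mathcal{C}} \cdot {}_{\mathcal{C}}[T]_{\mathcal{B}} \cdot P_{{}'\mathcal{B}\to\mathcal{B}}$ for a $T : \mathbb{R}^3 \to \mathbb{R}^2$ and verifies ${}_{{}'\mathcal{C}}[T]_{{}'\mathcal{B}}\,[\vec{v}]_{{}'\mathcal{B}} = [T\vec{v}]_{{}'\mathcal{C}}$ on a sample vector.

\begin{verbatim}
import numpy as np

# Matrix of a sample T : R^3 -> R^2 with respect to the standard bases B, C.
T_CB = np.array([[1., 2.,  0.],
                 [0., 1., -1.]])

# New bases: columns of Bp_in_B are the vectors of 'B written in B-coordinates,
# columns of Cp_in_C are the vectors of 'C written in C-coordinates.
Bp_in_B = np.array([[1., 1., 0.],
                    [0., 1., 1.],
                    [0., 0., 1.]])
Cp_in_C = np.array([[2., 1.],
                    [1., 1.]])

P_BpB = Bp_in_B                  # P_{'B -> B}
P_CCp = np.linalg.inv(Cp_in_C)   # P_{C -> 'C} = (P_{'C -> C})^{-1}

T_CpBp = P_CCp @ T_CB @ P_BpB    # 'C[T]'B

# Check 'C[T]'B [v]'B = [Tv]'C on a sample vector.
v_Bp = np.array([3., -1., 2.])   # 'B-coordinates of v
v_B  = P_BpB @ v_Bp              # its B-coordinates
Tv_C = T_CB @ v_B                # [Tv]_C
assert np.allclose(T_CpBp @ v_Bp, P_CCp @ Tv_C)
\end{verbatim}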
EXAMPLE 3. Let's change $\left[\tfrac{d}{dx}\right]_{\mathcal{B}}$ from the previous example to $\left[\tfrac{d}{dx}\right]_{{}'\mathcal{B}}$, where ${}'\mathcal{B} = \{1,\, x+t,\, (x+t)^2,\, (x+t)^3\}$. First of all, we can write down $\left[\tfrac{d}{dx}\right]_{{}'\mathcal{B}}$ directly:
$$\left[\tfrac{d}{dx}\right]_{{}'\mathcal{B}} = \begin{pmatrix} \left[\tfrac{d}{dx}1\right]_{{}'\mathcal{B}} & \left[\tfrac{d}{dx}(x+t)\right]_{{}'\mathcal{B}} & \left[\tfrac{d}{dx}(x+t)^2\right]_{{}'\mathcal{B}} & \left[\tfrac{d}{dx}(x+t)^3\right]_{{}'\mathcal{B}} \end{pmatrix} = \begin{pmatrix} 0 & 1 & 0 & 0 \\ 0 & 0 & 2 & 0 \\ 0 & 0 & 0 & 3 \\ 0 & 0 & 0 & 0 \end{pmatrix},$$
which (surprise!) is the same matrix as $\left[\tfrac{d}{dx}\right]_{\mathcal{B}}$. Let's see if the $P^{-1}[T]_{\mathcal{B}}P = [T]_{{}'\mathcal{B}}$ formula gives the same result. First we compute
$$P_{{}'\mathcal{B}\to\mathcal{B}} = \begin{pmatrix} [1]_{\mathcal{B}} & [x+t]_{\mathcal{B}} & [(x+t)^2]_{\mathcal{B}} & [(x+t)^3]_{\mathcal{B}} \end{pmatrix} = \begin{pmatrix} 1 & t & t^2 & t^3 \\ 0 & 1 & 2t & 3t^2 \\ 0 & 0 & 1 & 3t \\ 0 & 0 & 0 & 1 \end{pmatrix},$$
and taking $\mathrm{rref}\!\left(P_{{}'\mathcal{B}\to\mathcal{B}} \,\middle|\, I_4\right)$ to get $\left(I_4 \,\middle|\, P_{\mathcal{B}\to{}'\mathcal{B}}\right)$ gives
$$P_{\mathcal{B}\to{}'\mathcal{B}} = \begin{pmatrix} 1 & -t & t^2 & -t^3 \\ 0 & 1 & -2t & 3t^2 \\ 0 & 0 & 1 & -3t \\ 0 & 0 & 0 & 1 \end{pmatrix}.$$
As you'll readily verify, multiplying out the matrices
$$P_{\mathcal{B}\to{}'\mathcal{B}} \cdot \left[\tfrac{d}{dx}\right]_{\mathcal{B}} \cdot P_{{}'\mathcal{B}\to\mathcal{B}} \quad \text{to get} \quad \left[\tfrac{d}{dx}\right]_{{}'\mathcal{B}}$$
does give the right answer (that $\left[\tfrac{d}{dx}\right]_{\mathcal{B}}$ and $\left[\tfrac{d}{dx}\right]_{{}'\mathcal{B}}$ are the same matrix).

Since the last example (wow, a matrix similar to itself) could be considered somewhat disappointing, here is one more.

EXAMPLE 4. We write down the matrix (with respect to the standard basis) for the transformation $T : \mathbb{R}^3 \to \mathbb{R}^3$ rotating through an angle $\theta$ about the axis spanned by $\vec{v}_1 = {}^t(1,2,2)$. To do this, first expand this to an orthogonal basis $\mathcal{B} = \{\vec{v}_1, \vec{v}_2, \vec{v}_3\}$ by choosing $\vec{v}_2 = {}^t(-2,2,-1)$ and $\vec{v}_3 = {}^t(-2,-1,2)$. (Note that these also all have the same length.) Then, with $S = P_{\mathcal{B}\to\hat{e}}$ the matrix whose columns are $\vec{v}_1, \vec{v}_2, \vec{v}_3$ (so that $S^{-1} = \tfrac{1}{9}\,{}^tS$, the columns being orthogonal of length $3$),
$$[T]_{\hat{e}} = S\,[T]_{\mathcal{B}}\,S^{-1} = \begin{pmatrix} 1 & -2 & -2 \\ 2 & 2 & -1 \\ 2 & -1 & 2 \end{pmatrix} \begin{pmatrix} 1 & 0 & 0 \\ 0 & \cos\theta & -\sin\theta \\ 0 & \sin\theta & \cos\theta \end{pmatrix} \begin{pmatrix} \tfrac{1}{9} & \tfrac{2}{9} & \tfrac{2}{9} \\ -\tfrac{2}{9} & \tfrac{2}{9} & -\tfrac{1}{9} \\ -\tfrac{2}{9} & -\tfrac{1}{9} & \tfrac{2}{9} \end{pmatrix}$$
$$= \frac{1}{9}\begin{pmatrix} 1 + 8\cos\theta & 2 - 2\cos\theta - 6\sin\theta & 2 - 2\cos\theta + 6\sin\theta \\ 2 - 2\cos\theta + 6\sin\theta & 4 + 5\cos\theta & 4 - 4\cos\theta - 3\sin\theta \\ 2 - 2\cos\theta - 6\sin\theta & 4 - 4\cos\theta + 3\sin\theta & 4 + 5\cos\theta \end{pmatrix}$$
is a big ugly mess. But our technology made it easy to compute. In particular, rotations by $90^{\circ}$ about $\hat{e}_1$ resp. $\vec{v}_1$ are given by
$$M = \begin{pmatrix} 1 & 0 & 0 \\ 0 & 0 & -1 \\ 0 & 1 & 0 \end{pmatrix} \quad\text{resp.}\quad M' = \frac{1}{9}\begin{pmatrix} 1 & -4 & 8 \\ 8 & 4 & 1 \\ -4 & 7 & 4 \end{pmatrix},$$
and we note that
$$MM' = \frac{1}{9}\begin{pmatrix} 1 & -4 & 8 \\ 4 & -7 & -4 \\ 8 & 4 & 1 \end{pmatrix} \neq \frac{1}{9}\begin{pmatrix} 1 & 8 & 4 \\ 8 & 1 & -4 \\ -4 & 4 & -7 \end{pmatrix} = M'M.$$
This illustrates the fact that while rotations in $\mathbb{R}^2$ commute, rotations in $\mathbb{R}^3$ do not unless the axes are the same.\footnote{You can even see this with $90^{\circ}$ rotations about the $x$ and $y$ axes (try it with a sheet of paper!).}

\textbf{Rank + Nullity revisited.} We have shown how to associate to any linear transformation $T : V \to W$ (plus two given bases) a matrix; conversely, any matrix gives a linear transformation (with respect to the given bases). Due to this equivalence, we should be able to write theorems about transformations as theorems about matrices. According to Exercise III.A.1 you can essentially think of $V$ and $W$ as $\mathbb{R}^n$ and $\mathbb{R}^m$ with the standard bases without any loss of generality; for simplicity we will now assume this setting: the "matrix $B$ of $T$" will just mean ${}_{\hat{e}}[T]_{\hat{e}}$. From this point of view I want to give a concrete proof (using $\mathrm{rref}$) of rank + nullity.

First note that all vectors "in the image of" $T : \mathbb{R}^n \to \mathbb{R}^m$ are linear combinations of the columns of $B$: so $\dim(\mathrm{im}(T)) = \dim(V_{\mathrm{col}}(B))$.
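As a computational aside (again mine, not the notes'), both counts can be read off a row reduction for any concrete matrix: the rank is the number of pivots in $\mathrm{rref}(B)$ and the nullity is the number of free columns. Here is a small SymPy sketch on a made-up $4 \times 5$ matrix $B$, i.e. a $T : \mathbb{R}^5 \to \mathbb{R}^4$.

\begin{verbatim}
from sympy import Matrix

# A made-up 4x5 example matrix B.
B = Matrix([[1, 2, 0, 1, 3],
            [0, 1, 1, 0, 2],
            [1, 3, 1, 1, 5],
            [2, 4, 0, 2, 6]])

R, pivot_cols = B.rref()        # row-reduced echelon form and pivot columns
rank    = len(pivot_cols)       # = dim im(T) = dim of the column space
nullity = B.cols - rank         # = dim ker(T), one dimension per free column

assert rank    == len(B.columnspace())
assert nullity == len(B.nullspace())
assert rank + nullity == B.cols  # rank + nullity = n
print(R, rank, nullity)
\end{verbatim}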