Algebra of Linear Transformations and Matrices
Math 130 Linear Algebra
D Joyce, Fall 2013

We've looked at the operations of addition and scalar multiplication on linear transformations and used them to define addition and scalar multiplication on matrices. For a given basis $\beta$ on $V$ and another basis $\gamma$ on $W$, we have an isomorphism $\phi_\beta^\gamma : \mathrm{Hom}(V,W) \to M_{m\times n}$ of vector spaces which assigns to a linear transformation $T : V \to W$ its standard matrix $[T]_\beta^\gamma$.

We also have matrix multiplication, which corresponds to composition of linear transformations. If $A$ is the standard matrix for a transformation $S$, and $B$ is the standard matrix for a transformation $T$, then we defined multiplication of matrices so that the product $AB$ is the standard matrix for $S \circ T$.

There are a few more things we should look at for matrix multiplication. It's not commutative. It is associative. It distributes over matrix addition. There are identity matrices $I$ for multiplication. Cancellation doesn't work. You can compute powers of square matrices. And there are scalar matrices.

Matrix multiplication is not commutative. It shouldn't be. It corresponds to composition of linear transformations, and composition of functions is not commutative.

Example 1. Let's take a 2-dimensional geometric example. Let $T$ be rotation $90^\circ$ clockwise, and $S$ be reflection across the $x$-axis. We've looked at those before. The standard matrices $A$ for $S$ and $B$ for $T$ are
$$A = \begin{bmatrix} 1 & 0 \\ 0 & -1 \end{bmatrix}, \qquad B = \begin{bmatrix} 0 & -1 \\ 1 & 0 \end{bmatrix}.$$
Then the two compositions are
$$BA = \begin{bmatrix} 0 & -1 \\ 1 & 0 \end{bmatrix} \begin{bmatrix} 1 & 0 \\ 0 & -1 \end{bmatrix} = \begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix}, \qquad AB = \begin{bmatrix} 1 & 0 \\ 0 & -1 \end{bmatrix} \begin{bmatrix} 0 & -1 \\ 1 & 0 \end{bmatrix} = \begin{bmatrix} 0 & -1 \\ -1 & 0 \end{bmatrix}.$$
The products aren't the same. You can perform these operations on physical objects. Take a book. First rotate it $90^\circ$, then flip it over. Start again, but flip first, then rotate $90^\circ$. The book ends up in different orientations.

Matrix multiplication is associative. Although it's not commutative, it is associative. That's because it corresponds to composition of functions, and composition of functions is associative. Given any three functions $f$, $g$, and $h$, we'll show $(f \circ g) \circ h = f \circ (g \circ h)$ by showing the two sides have the same values for all $x$:
$$((f \circ g) \circ h)(x) = (f \circ g)(h(x)) = f(g(h(x)))$$
while
$$(f \circ (g \circ h))(x) = f((g \circ h)(x)) = f(g(h(x))).$$
They're the same.

Since composition of functions is associative, and linear transformations are special kinds of functions, composition of linear transformations is associative. Since matrix multiplication corresponds to composition of linear transformations, matrix multiplication is associative.

An alternative proof would actually involve computations, probably with summation notation, something like
$$\sum_j a_{ij} \Bigl( \sum_k b_{jk} c_{kl} \Bigr) = \sum_{j,k} a_{ij} b_{jk} c_{kl} = \sum_k \Bigl( \sum_j a_{ij} b_{jk} \Bigr) c_{kl}.$$

Matrix multiplication distributes over matrix addition. When $A$, $B$, and $C$ are matrices of the right shapes so that the operations can be performed, the following are always identities:
$$A(B + C) = AB + AC, \qquad (A + B)C = AC + BC.$$
Why does it work? It suffices to show that it works for linear transformations. Suppose that $R$, $S$, and $T$ are the corresponding linear transformations. The corresponding identities are
$$R \circ (S + T) = (R \circ S) + (R \circ T), \qquad (R + S) \circ T = (R \circ T) + (S \circ T).$$
Simply evaluate them at a vector $v$ and see that you get the same thing. Here's the first identity. You'll need to use the linearity of $R$ at one point.
$$(R \circ (S + T))(v) = R((S + T)(v)) = R(S(v) + T(v)) = R(S(v)) + R(T(v)),$$
$$((R \circ S) + (R \circ T))(v) = (R \circ S)(v) + (R \circ T)(v) = R(S(v)) + R(T(v)).$$
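If you'd like to experiment with these properties numerically, here is a minimal Python sketch (assuming NumPy is available; it is not part of the original notes, and the matrix $C$ below is an arbitrary extra matrix chosen only for illustration). It recomputes the products from Example 1 and checks associativity and distributivity on those matrices.

import numpy as np

# Matrices from Example 1: A is the reflection (S), B is the rotation (T).
A = np.array([[1, 0], [0, -1]])
B = np.array([[0, -1], [1, 0]])
C = np.array([[2, 3], [5, 7]])    # arbitrary extra matrix for the three-factor identities

# Non-commutativity: AB and BA differ.
print(A @ B)                          # [[ 0 -1] [-1  0]]
print(B @ A)                          # [[0 1] [1 0]]
print(np.array_equal(A @ B, B @ A))   # False

# Associativity: (AB)C = A(BC).
print(np.array_equal((A @ B) @ C, A @ (B @ C)))     # True

# Distributivity: A(B + C) = AB + AC and (A + B)C = AC + BC.
print(np.array_equal(A @ (B + C), A @ B + A @ C))   # True
print(np.array_equal((A + B) @ C, A @ C + B @ C))   # True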
The identity matrices. Just like there are matrices that work as additive identities (we denoted them all $0$ as described above), there are matrices that work as multiplicative identities, and we'll denote them all $I$ and call them identity matrices. An identity matrix is a square $n$ by $n$ matrix with 1's down the diagonal and 0's elsewhere. You could denote them $I_n$ to emphasize their sizes, but you can always tell from the context what the size is, so we'll leave out the index $n$. By the way, whenever you've got a square $n$ by $n$ matrix, you can say the order of the matrix is $n$. Anyway, $I$ acts like an identity matrix:
$$AI = A = IA.$$
Note that if $A$ is not a square matrix, then the orders of the two identity matrices $I$ in the identity $AI = A = IA$ are different. For example,
$$\begin{bmatrix} 4 & 5 & 6 \\ 3 & -1 & 0 \end{bmatrix} \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix} = \begin{bmatrix} 4 & 5 & 6 \\ 3 & -1 & 0 \end{bmatrix} = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix} \begin{bmatrix} 4 & 5 & 6 \\ 3 & -1 & 0 \end{bmatrix}.$$

Cancellation doesn't work for matrix multiplication! Not only is matrix multiplication non-commutative, but the cancellation law doesn't hold for it. You're familiar with cancellation for numbers: if $xy = xz$ but $x \ne 0$, then $y = z$. But we can come up with matrices so that $AB = AC$ and $A \ne 0$, but $B \ne C$. For example, take
$$A = \begin{bmatrix} 1 & 0 \\ 0 & 0 \end{bmatrix}, \qquad B = \begin{bmatrix} 1 & 0 \\ 0 & 3 \end{bmatrix}, \qquad C = \begin{bmatrix} 1 & 0 \\ 0 & 4 \end{bmatrix}.$$
Then $AB = AC = A$, even though $B \ne C$.

Powers of matrices. Frequently we'll multiply square matrices by themselves (you can only multiply square matrices by themselves), and we'll use the standard notation for powers. The expression $A^p$ stands for the product of $p$ copies of $A$. Since matrix multiplication is associative, this definition works, so long as $p$ is a positive integer. But we can extend the definition to $p = 0$ by making $A^0 = I$, and the usual properties will still hold. That is, $A^p A^q = A^{p+q}$ and $(A^p)^q = A^{pq}$. Later, we'll extend powers to the case when $A$ is an invertible matrix and the power $p$ is a negative integer.

Warning: because matrix multiplication is not commutative in general, it is usually the case that $(AB)^p \ne A^p B^p$.

Scalar matrices. A scalar matrix is a square matrix with the scalar $r$ down the diagonal and 0's elsewhere. That's the same thing as the scalar $r$ times the identity matrix. For instance,
$$\begin{bmatrix} 4 & 0 & 0 \\ 0 & 4 & 0 \\ 0 & 0 & 4 \end{bmatrix} = 4 \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix} = 4I.$$
Among other things, that means that we can identify a scalar matrix with the scalar.

Math 130 Home Page at http://math.clarku.edu/~djoyce/ma130/
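As a supplementary check (again a sketch assuming NumPy, not part of the original notes), the following Python snippet verifies the cancellation counterexample, the warning about powers, and the identification of a scalar matrix with a scalar multiple of $I$.

import numpy as np

# Cancellation fails: AB = AC with A nonzero, yet B != C.
A = np.array([[1, 0], [0, 0]])
B = np.array([[1, 0], [0, 3]])
C = np.array([[1, 0], [0, 4]])
print(np.array_equal(A @ B, A @ C))   # True  (both products equal A)
print(np.array_equal(B, C))           # False

# Powers: A^p A^q = A^(p+q), but (AB)^p usually differs from A^p B^p.
S = np.array([[1, 0], [0, -1]])       # reflection matrix from Example 1
T = np.array([[0, -1], [1, 0]])       # rotation matrix from Example 1
mp = np.linalg.matrix_power
print(np.array_equal(mp(T, 2) @ mp(T, 3), mp(T, 5)))        # True
print(np.array_equal(mp(S @ T, 2), mp(S, 2) @ mp(T, 2)))    # False

# A scalar matrix is the scalar times an identity matrix.
print(np.array_equal(np.diag([4, 4, 4]), 4 * np.eye(3)))    # True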