MATH 2030: MATRICES
Introduction to Linear Transformations
We have seen that we may describe matrices as symbols with simple algebraic properties such as matrix multiplication, matrix addition and scalar multiplication. In the particular case of matrix-vector multiplication, i.e., $A\mathbf{x} = \mathbf{b}$ where $A$ is an $m \times n$ matrix, $\mathbf{x}$ is an $n \times 1$ column vector and $\mathbf{b}$ is an $m \times 1$ column vector, we may represent this as a transformation on the space of column vectors, that is, a function $F(\mathbf{x}) = \mathbf{b}$, where $\mathbf{x}$ is the independent variable and $\mathbf{b}$ the dependent variable. In this section we give a more rigorous description of this idea and provide examples of such matrix transformations, which will lead to the idea of a linear transformation.

To begin, we look at a matrix-vector multiplication to give an idea of what sort of functions we are working with. Let
\[
A = \begin{bmatrix} 1 & 0 \\ 2 & -1 \\ 3 & 4 \end{bmatrix}, \qquad \mathbf{v} = \begin{bmatrix} 1 \\ -1 \end{bmatrix};
\]
then matrix-vector multiplication yields
\[
A\mathbf{v} = \begin{bmatrix} 1 \\ 3 \\ -1 \end{bmatrix}.
\]
We have taken a $2 \times 1$ matrix and produced a $3 \times 1$ matrix. More generally, for any $\begin{bmatrix} x \\ y \end{bmatrix}$ we may describe this transformation as a matrix equation
\[
\begin{bmatrix} 1 & 0 \\ 2 & -1 \\ 3 & 4 \end{bmatrix}\begin{bmatrix} x \\ y \end{bmatrix} = \begin{bmatrix} x \\ 2x - y \\ 3x + 4y \end{bmatrix}.
\]
From this product we have found a formula describing how $A$ transforms an arbitrary vector in $\mathbb{R}^2$ into a new vector in $\mathbb{R}^3$. Expressing this as a transformation $T_A$, we have
\[
T_A\begin{bmatrix} x \\ y \end{bmatrix} = \begin{bmatrix} x \\ 2x - y \\ 3x + 4y \end{bmatrix}.
\]

From this example we can define some helpful terminology. A transformation (or mapping, or function) $T$ from $\mathbb{R}^n$ to $\mathbb{R}^m$ is a rule that assigns to each vector $\mathbf{v} \in \mathbb{R}^n$ a unique vector $T(\mathbf{v}) \in \mathbb{R}^m$. The domain of $T$ is $\mathbb{R}^n$ and the codomain is $\mathbb{R}^m$, and we write this as $T : \mathbb{R}^n \to \mathbb{R}^m$. For a vector $\mathbf{v}$ in the domain of $T$, the vector $T(\mathbf{v})$ in the codomain is called the image of $\mathbf{v}$ under $T$. The set of all possible images $T(\mathbf{v})$ for $\mathbf{v} \in \mathbb{R}^n$ is called the range of $T$.

In the previous example the domain of $T_A$ is $\mathbb{R}^2$ and the codomain is $\mathbb{R}^3$, so $T_A : \mathbb{R}^2 \to \mathbb{R}^3$. The image of $\mathbf{v} = \begin{bmatrix} 1 \\ -1 \end{bmatrix}$ is $\mathbf{w} = T_A(\mathbf{v}) = \begin{bmatrix} 1 \\ 3 \\ -1 \end{bmatrix}$. The range of $T_A$ consists of all vectors in the codomain of the form
\[
T_A\begin{bmatrix} x \\ y \end{bmatrix} = x\begin{bmatrix} 1 \\ 2 \\ 3 \end{bmatrix} + y\begin{bmatrix} 0 \\ -1 \\ 4 \end{bmatrix},
\]
which describes an arbitrary linear combination of the column vectors of $A$. We conclude that the range of $T_A$ is the column space of $A$. Geometrically we may see this as a plane in $\mathbb{R}^3$ through the origin with the column vectors of $A$ as direction vectors. Notice that $T_A(\mathbf{x}) \in \mathbb{R}^3$, where $\mathbf{x}$ is any vector in $\mathbb{R}^2$.
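Before turning to the general definition, here is a minimal numerical sketch of the matrix transformation $T_A$ above, written in Python with NumPy; the names `A`, `v` and `T_A` are chosen for illustration only. It reproduces the computation $A\mathbf{v} = (1, 3, -1)$ and checks the formula $(x, 2x - y, 3x + 4y)$ at a sample point.

```python
import numpy as np

# The matrix A and the vector v = (1, -1) from the example above.
A = np.array([[1, 0],
              [2, -1],
              [3, 4]])
v = np.array([1, -1])

def T_A(x):
    """Matrix transformation T_A(x) = A x, mapping R^2 into R^3."""
    return A @ x

print(T_A(v))   # [ 1  3 -1], the image of v under T_A

# For a general (x, y) the components agree with (x, 2x - y, 3x + 4y):
x, y = 2.0, 5.0
print(T_A(np.array([x, y])), (x, 2*x - y, 3*x + 4*y))
```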
Linear Transformations. The previous example $T_A$ is a special case of a more general type of transformation called a linear transformation. We provide a definition that summarizes the key idea: the transformation "respects" the vector operations of addition and scalar multiplication.

Definition 0.1. A transformation $T : \mathbb{R}^n \to \mathbb{R}^m$ is called a linear transformation if
(1) $T(\mathbf{u} + \mathbf{v}) = T(\mathbf{u}) + T(\mathbf{v})$ for all $\mathbf{u}$ and $\mathbf{v}$ in $\mathbb{R}^n$, and
(2) $T(c\mathbf{v}) = cT(\mathbf{v})$ for all $\mathbf{v}$ in $\mathbb{R}^n$ and all scalars $c$.

Example 0.2. Consider once again the transformation $T : \mathbb{R}^2 \to \mathbb{R}^3$ defined by
\[
T\begin{bmatrix} x \\ y \end{bmatrix} = \begin{bmatrix} x \\ 2x - y \\ 3x + 4y \end{bmatrix};
\]
we will show this is indeed a linear transformation. Define $\mathbf{u} = \begin{bmatrix} x \\ y \end{bmatrix}$ and $\mathbf{v} = \begin{bmatrix} w \\ z \end{bmatrix}$, then compute $T(\mathbf{u} + \mathbf{v})$:
\[
T\left(\begin{bmatrix} x \\ y \end{bmatrix} + \begin{bmatrix} w \\ z \end{bmatrix}\right) = T\begin{bmatrix} x + w \\ y + z \end{bmatrix} = \begin{bmatrix} x + w \\ 2(x + w) - (y + z) \\ 3(x + w) + 4(y + z) \end{bmatrix} = \begin{bmatrix} x + w \\ (2x - y) + (2w - z) \\ (3x + 4y) + (3w + 4z) \end{bmatrix}.
\]
Looking at the far right-hand side, we may write this as
\[
\begin{bmatrix} x \\ 2x - y \\ 3x + 4y \end{bmatrix} + \begin{bmatrix} w \\ 2w - z \\ 3w + 4z \end{bmatrix} = T\begin{bmatrix} x \\ y \end{bmatrix} + T\begin{bmatrix} w \\ z \end{bmatrix} = T(\mathbf{u}) + T(\mathbf{v}).
\]
To show the second property, consider $T(c\mathbf{v})$ for some scalar $c$:
\[
T\left(c\begin{bmatrix} x \\ y \end{bmatrix}\right) = T\begin{bmatrix} cx \\ cy \end{bmatrix} = \begin{bmatrix} cx \\ 2cx - cy \\ 3cx + 4cy \end{bmatrix} = \begin{bmatrix} cx \\ c(2x - y) \\ c(3x + 4y) \end{bmatrix} = c\begin{bmatrix} x \\ 2x - y \\ 3x + 4y \end{bmatrix} = cT\begin{bmatrix} x \\ y \end{bmatrix}.
\]
Since both properties hold, this is indeed a linear transformation.

Although the linear transformation $T$ in the previous example arose as a matrix transformation $T_A$, one may go backwards and recover the matrix $A$ from the definition of $T$ given in the example. Notice that
\[
T\begin{bmatrix} x \\ y \end{bmatrix} = \begin{bmatrix} x \\ 2x - y \\ 3x + 4y \end{bmatrix} = x\begin{bmatrix} 1 \\ 2 \\ 3 \end{bmatrix} + y\begin{bmatrix} 0 \\ -1 \\ 4 \end{bmatrix} = \begin{bmatrix} 1 & 0 \\ 2 & -1 \\ 3 & 4 \end{bmatrix}\begin{bmatrix} x \\ y \end{bmatrix},
\]
which is just the matrix-vector multiplication of $A$ with an arbitrary vector in the domain. In general a matrix transformation is equivalent to a linear transformation, according to the next theorem.

Theorem 0.3. Let $A$ be an $m \times n$ matrix. Then the matrix transformation $T_A : \mathbb{R}^n \to \mathbb{R}^m$ defined by
\[
T_A(\mathbf{x}) = A\mathbf{x}, \qquad \mathbf{x} \in \mathbb{R}^n,
\]
is a linear transformation.

Proof. Let $\mathbf{u}$ and $\mathbf{v}$ be vectors in the domain, and $c$ a scalar. Then $T_A(\mathbf{u} + \mathbf{v}) = A(\mathbf{u} + \mathbf{v}) = A\mathbf{u} + A\mathbf{v} = T_A(\mathbf{u}) + T_A(\mathbf{v})$ and $T_A(c\mathbf{v}) = A(c\mathbf{v}) = cA\mathbf{v} = cT_A(\mathbf{v})$. Thus $T_A$ is a linear transformation.

Example 0.4. Q: Let $F : \mathbb{R}^2 \to \mathbb{R}^2$ be the transformation that sends each point to its reflection in the $x$-axis. Show that $F$ is a linear transformation.
A: This transformation sends each point $(x, y)$ to the new coordinates $(x, -y)$, and so we may write
\[
F\begin{bmatrix} x \\ y \end{bmatrix} = \begin{bmatrix} x \\ -y \end{bmatrix}.
\]
To show this is linear, notice that
\[
\begin{bmatrix} x \\ -y \end{bmatrix} = x\begin{bmatrix} 1 \\ 0 \end{bmatrix} + y\begin{bmatrix} 0 \\ -1 \end{bmatrix} = \begin{bmatrix} 1 & 0 \\ 0 & -1 \end{bmatrix}\begin{bmatrix} x \\ y \end{bmatrix}.
\]
Thus $F(\mathbf{x}) = A\mathbf{x}$, showing that this is a matrix transformation and hence a linear transformation by the previous theorem.

Example 0.5. Q: Let $R : \mathbb{R}^2 \to \mathbb{R}^2$ be the transformation that rotates each point by an angle of $\pi/2$ (90 degrees) counterclockwise about the origin. Show that $R$ is a linear transformation.
A: Plotting this on the plane, we see that $R$ takes any point $(x, y)$ in the plane and sends it to $(-y, x)$, and so as a transformation
\[
R\begin{bmatrix} x \\ y \end{bmatrix} = \begin{bmatrix} -y \\ x \end{bmatrix} = x\begin{bmatrix} 0 \\ 1 \end{bmatrix} + y\begin{bmatrix} -1 \\ 0 \end{bmatrix} = \begin{bmatrix} 0 & -1 \\ 1 & 0 \end{bmatrix}\begin{bmatrix} x \\ y \end{bmatrix}.
\]
So $R$ is described by a matrix transformation and therefore is a linear transformation.

Recalling that multiplying a matrix by the standard basis vectors recovers the columns of the original matrix, we can use this fact to show that every linear transformation from $\mathbb{R}^n$ to $\mathbb{R}^m$ arises as a matrix transformation.

Theorem 0.6. Let $T : \mathbb{R}^n \to \mathbb{R}^m$ be a linear transformation. Then $T$ is a matrix transformation; more specifically, $T = T_A$ where $A$ is the $m \times n$ matrix
\[
A = [\,T(\mathbf{e}_1) \mid T(\mathbf{e}_2) \mid \cdots \mid T(\mathbf{e}_n)\,].
\]

Proof. Let $\mathbf{e}_1, \mathbf{e}_2, \ldots, \mathbf{e}_n$ be the standard basis vectors in $\mathbb{R}^n$ and let $\mathbf{x}$ be a vector in $\mathbb{R}^n$, so that $\mathbf{x} = x_1\mathbf{e}_1 + \cdots + x_n\mathbf{e}_n$. Noting that the $T(\mathbf{e}_i)$ for $i = 1, \ldots, n$ are column vectors in $\mathbb{R}^m$, let $A = [\,T(\mathbf{e}_1) \mid T(\mathbf{e}_2) \mid \cdots \mid T(\mathbf{e}_n)\,]$ be the $m \times n$ matrix with these vectors as its columns. Then, using linearity,
\[
T(\mathbf{x}) = T(x_1\mathbf{e}_1 + \cdots + x_n\mathbf{e}_n) = x_1 T(\mathbf{e}_1) + \cdots + x_n T(\mathbf{e}_n) = [\,T(\mathbf{e}_1) \mid T(\mathbf{e}_2) \mid \cdots \mid T(\mathbf{e}_n)\,]\begin{bmatrix} x_1 \\ \vdots \\ x_n \end{bmatrix} = A\mathbf{x}.
\]

The matrix in the proof of the last theorem is called the standard matrix of the linear transformation $T$.
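The two defining properties of Definition 0.1 and the column construction of Theorem 0.6 are easy to check numerically. The following Python/NumPy sketch (names are illustrative only) spot-checks properties (1) and (2) for the transformation $T$ of Example 0.2 at sample vectors; this is a numerical check, not a proof. It then rebuilds the standard matrix from the images of the standard basis vectors.

```python
import numpy as np

def T(p):
    """The transformation of Example 0.2: (x, y) -> (x, 2x - y, 3x + 4y)."""
    x, y = p
    return np.array([x, 2*x - y, 3*x + 4*y])

u = np.array([1.0, -1.0])
v = np.array([2.0, 5.0])
c = 3.0

# Definition 0.1: additivity and homogeneity at these sample vectors.
print(np.allclose(T(u + v), T(u) + T(v)))   # True
print(np.allclose(T(c * v), c * T(v)))      # True

# Theorem 0.6: the standard matrix has columns T(e_1), T(e_2).
e1, e2 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
A = np.column_stack([T(e1), T(e2)])
print(A)                          # [[1  0], [2 -1], [3  4]]
print(np.allclose(A @ v, T(v)))   # True: T agrees with T_A
```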
Example 0.7. Q: Show that a rotation about the origin through an angle $\theta$ defines a linear transformation from $\mathbb{R}^2$ to $\mathbb{R}^2$ and find its standard matrix.
A: Let $R_\theta$ be the rotation; we will prove this geometrically. Let $\mathbf{u}$ and $\mathbf{v}$ be vectors in the plane; then the parallelogram rule determines the new vector $\mathbf{u} + \mathbf{v}$. If we now apply $R_\theta$, the whole parallelogram is rotated by an angle of $\theta$, and so its diagonal is the vector $R_\theta(\mathbf{u}) + R_\theta(\mathbf{v})$. Hence $R_\theta(\mathbf{u} + \mathbf{v}) = R_\theta(\mathbf{u}) + R_\theta(\mathbf{v})$. Similarly, if we rotate both $\mathbf{v}$ and $c\mathbf{v}$ by the fixed angle $\theta$ we obtain $R_\theta(\mathbf{v})$ and $R_\theta(c\mathbf{v})$; since rotations do not affect lengths, we must have $R_\theta(c\mathbf{v}) = cR_\theta(\mathbf{v})$. We conclude that $R_\theta$ is a linear transformation, and we may apply $R_\theta$ to the standard basis vectors of $\mathbb{R}^2$ to determine its standard matrix. Using trigonometry we find that
\[
R_\theta\begin{bmatrix} 1 \\ 0 \end{bmatrix} = \begin{bmatrix} \cos\theta \\ \sin\theta \end{bmatrix},
\]
and similarly the second standard basis vector is mapped to
\[
R_\theta\begin{bmatrix} 0 \\ 1 \end{bmatrix} = \begin{bmatrix} -\sin\theta \\ \cos\theta \end{bmatrix}.
\]
Thus the standard matrix for $R_\theta$ is
\[
\begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix}.
\]

Example 0.8. Q:
• Show that the transformation $P : \mathbb{R}^2 \to \mathbb{R}^2$ that projects a point onto the $x$-axis is a linear transformation and find its standard matrix.
• More generally, if $\ell$ is a line through the origin in $\mathbb{R}^2$, show that the transformation $P_\ell : \mathbb{R}^2 \to \mathbb{R}^2$ that projects a point onto $\ell$ is a linear transformation and find its standard matrix.
A:
• $P$ sends the point $(x, y)$ to the point $(x, 0)$, and so
\[
P\begin{bmatrix} x \\ y \end{bmatrix} = \begin{bmatrix} x \\ 0 \end{bmatrix} = x\begin{bmatrix} 1 \\ 0 \end{bmatrix} + y\begin{bmatrix} 0 \\ 0 \end{bmatrix} = \begin{bmatrix} 1 & 0 \\ 0 & 0 \end{bmatrix}\begin{bmatrix} x \\ y \end{bmatrix}.
\]
Thus $P$ is a matrix transformation, and its standard matrix is
\[
\begin{bmatrix} 1 & 0 \\ 0 & 0 \end{bmatrix}.
\]
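As a closing illustration, here is a short Python/NumPy sketch (names and the sample angle are illustrative only) that assembles the standard matrices of $R_\theta$ and of the projection $P$ from the images of the standard basis vectors and applies them to a sample point.

```python
import numpy as np

theta = np.pi / 6   # sample angle; any value works

# Standard matrix of R_theta: columns are R_theta(e1) and R_theta(e2).
R = np.column_stack([
    [np.cos(theta), np.sin(theta)],    # image of e1
    [-np.sin(theta), np.cos(theta)],   # image of e2
])

# Standard matrix of the projection P onto the x-axis.
P = np.array([[1, 0],
              [0, 0]])

p = np.array([2.0, 1.0])   # a sample point
print(R @ p)               # p rotated counterclockwise by theta
print(P @ p)               # [2. 0.]: p projected onto the x-axis

# Rotations preserve length; projections generally do not.
print(np.isclose(np.linalg.norm(R @ p), np.linalg.norm(p)))   # True
```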