
Composition of linear transformations and matrix multiplication
Math 130 Linear Algebra
D Joyce, Fall 2013

Throughout this discussion, F refers to a fixed field. In applications, F will usually be R. V, W, and X will be vector spaces over F.

Consider two linear transformations T : V → W and S : W → X, where the codomain of one is the domain of the other. Their composition S∘T : V → X is illustrated by the commutative diagram

    [Commutative triangle: T from V to W, S from W to X, and S∘T along the diagonal from V to X.]

Since each of T and S preserves linear combinations, so does their composition, so S∘T is also a linear transformation.

Coordinates again. When the vector spaces are coordinatized, that is, when we have chosen a basis β for V, γ for W, and δ for X, we have isomorphisms φ_β : V → F^p, φ_γ : W → F^n, and φ_δ : X → F^m. Although we could do everything explicitly with these isomorphisms, they really get in the way of understanding. So instead, let's just assume that the vector spaces actually are F^p, F^n, and F^m, and that we have two linear transformations T : F^p → F^n and S : F^n → F^m.

Then T : F^p → F^n is represented by an n×p matrix B, S : F^n → F^m is represented by an m×n matrix A, and their composition S∘T : F^p → F^m is represented by some m×p matrix. We'll define matrix multiplication so that the product AB of the two matrices represents the composition S∘T.

    [Commutative triangle: B from F^p to F^n, A from F^n to F^m, and AB along the diagonal from F^p to F^m.]

Let's see what the entries in the matrix product AB have to be. Let v be a vector in F^p; then w = T(v) is a vector in F^n, and x = S(w) = (S∘T)(v) is a vector in F^m.

The n×p matrix B represents T. Its jk-th entry is B_jk, and it was defined so that for each j,

    w_j = Σ_k B_jk v_k.

Likewise, the m×n matrix A represents S. Its ij-th entry is A_ij, and it was defined so that for each i,

    x_i = Σ_j A_ij w_j.

Therefore

    x_i = Σ_j A_ij (Σ_k B_jk v_k) = Σ_k (Σ_j A_ij B_jk) v_k.

Definition 1. Given an m×n matrix A and an n×p matrix B, we define AB to be the m×p matrix whose ik-th entry is

    (AB)_ik = Σ_j A_ij B_jk.

With this definition, matrix multiplication corresponds to composition of linear transformations.

A mnemonic for multiplying matrices. Although the equation (AB)_ik = Σ_j A_ij B_jk is fine for theoretical work, in practice you need a better way to remember how to multiply matrices. The entry A_ij in a row of the first matrix needs to be multiplied by the corresponding entry B_jk in a column of the second matrix. If you place the matrix A to the left of the product and the matrix B above the product, it's easier to see what to multiply by what.

Take, for instance, the following two 3 by 3 matrices.

    A = [ 4  5  6 ]      B = [  2  1  1 ]
        [ 3 -1  0 ]          [  0  4  5 ]
        [ 2  0 -2 ]          [ -2 -3  0 ]

Think of A as being made of three row vectors and B as being made of three column vectors. Place A to the left and B above; the product appears between them:

                     [  2  1  1 ]
                     [  0  4  5 ]
                     [ -2 -3  0 ]

    [ 4  5  6 ]      [ -4  6 29 ]
    [ 3 -1  0 ]      [  6 -1 -2 ]
    [ 2  0 -2 ]      [  8  8  2 ]

To get an entry of the product, work with the row of A to its left and the column of B above it. For example, for the upper left entry of the product, work with the first row of A and the first column of B; you'll get 4·2 + 5·0 + 6·(−2) = −4.
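To make Definition 1 and the example above concrete, here is a short NumPy sketch; it is an illustration added here, not part of the original notes. It computes AB for the two 3 by 3 matrices both by the triple-sum formula and with NumPy's @ operator, and it checks the property the definition was built for: applying B and then A to a vector gives the same result as applying AB.

```python
import numpy as np

# The two 3x3 matrices from the example above.
A = np.array([[4, 5, 6],
              [3, -1, 0],
              [2, 0, -2]])
B = np.array([[2, 1, 1],
              [0, 4, 5],
              [-2, -3, 0]])

# Compute AB entry by entry using Definition 1: (AB)_ik = sum_j A_ij B_jk.
m, n = A.shape
n2, p = B.shape
assert n == n2, "A must have as many columns as B has rows"
AB = np.zeros((m, p), dtype=A.dtype)
for i in range(m):
    for k in range(p):
        AB[i, k] = sum(A[i, j] * B[j, k] for j in range(n))

print(AB)                          # [[-4  6 29] [ 6 -1 -2] [ 8  8  2]]
assert np.array_equal(AB, A @ B)   # agrees with NumPy's built-in product

# The composition property: applying B and then A to a vector v
# is the same as applying AB to v, i.e. A(Bv) = (AB)v.
v = np.array([1, 2, 3])
assert np.array_equal(A @ (B @ v), AB @ v)
```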
Systems of linear equations are matrix equations. We'll have a lot of uses for matrix multiplication as the course progresses, and one of the most important is the interpretation of a system of linear equations as a single matrix equation. Take, for example, the system of equations

    5x + 2y = 12
    3x −  y =  5
     x + 3y =  5

Let A be the coefficient matrix for this system, so that

    A = [ 5  2 ]
        [ 3 -1 ]
        [ 1  3 ]

and let b be the constant matrix (a column vector) for this system, so that

    b = [ 12 ]
        [  5 ]
        [  5 ]

Finally, let x be the variable matrix for this system, that is, a matrix (another column vector) with the variables as its entries, so that

    x = [ x ]
        [ y ]

Then the original system of equations is described by the matrix multiplication Ax = b:

    [ 5  2 ] [ x ]   [ 12 ]
    [ 3 -1 ] [ y ] = [  5 ]
    [ 1  3 ]         [  5 ]

In general, each system of linear equations corresponds to a single matrix equation

    Ax = b

where A is the matrix of coefficients in the system of equations, x is a vector of the variables in the equations, and b is a vector of the constants in the equations. This interpretation allows us to treat something rather complicated, namely a whole system of equations, as a single equation.

Matrix products in Matlab. If A and B are two matrices of the right size, that is, A has the same number of columns as B has rows, then the expression A*B gives their product. You can compute powers of square matrices as well. If A is a square matrix, then A^3 computes the same thing as A*A*A.
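As a quick check of the matrix form of the example system, here is a small NumPy sketch; it is an addition here, not part of the original notes. One can verify by substitution that (x, y) = (2, 1) satisfies all three equations, and the sketch confirms Ax = b for that vector. It also mirrors the Matlab remarks: A*B corresponds to A @ B, and A^3 for a square matrix corresponds to np.linalg.matrix_power(A, 3).

```python
import numpy as np

# Coefficient matrix and constant vector for the system
#   5x + 2y = 12,  3x - y = 5,  x + 3y = 5.
A = np.array([[5, 2],
              [3, -1],
              [1, 3]])
b = np.array([12, 5, 5])

# Candidate solution (x, y) = (2, 1): 5*2+2*1 = 12, 3*2-1 = 5, 2+3*1 = 5.
x = np.array([2, 1])
assert np.array_equal(A @ x, b)    # the single matrix equation Ax = b holds

# Analogue of the Matlab note: A*B becomes A @ B here, and A^3 for a
# square matrix becomes np.linalg.matrix_power(A, 3).
M = np.array([[1, 1],
              [0, 1]])
assert np.array_equal(np.linalg.matrix_power(M, 3), M @ M @ M)
```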
Categories. Categories are higher order algebraic structures. We'll look at a couple of categories. One will be the category of vector spaces and linear transformations over a field, the other the category of matrices over a field F. We'll also consider the category of sets, but primarily just as another example of categories.

Mathematics abounds with categories. There are categories of topological spaces, of differentiable spaces, of groups, of rings, etc.

The purpose of a category is to study the interrelations of its objects, and to do that the category includes 'morphisms' (also called maps or arrows) between the objects. In the case of the category of vector spaces, the morphisms are the linear transformations.

We'll start with the formal definition of categories. Category theory was developed by Eilenberg and Mac Lane in the 1940s.

Definition 2. A category C consists of

1. objects, often denoted with uppercase letters, and

2. morphisms (also called maps or arrows), often denoted with lowercase letters.

3. Each morphism f has a domain, which is an object, and a codomain, which is also an object. If the domain of f is A and the codomain is B, then we write f : A → B. The set of all morphisms from A to B is denoted Hom(A, B).

4. For each object A there is a morphism 1_A : A → A called the identity morphism on A.

5. Given two morphisms f : A → B and g : B → C, where the codomain of one is the domain of the other, there is another morphism g∘f : A → C called the composition of the two morphisms. This composition is illustrated by the commutative diagram

    [Commutative triangle: f from A to B, g from B to C, and g∘f along the diagonal from A to C.]

6. For all f : A → B, f ∘ 1_A = f and 1_B ∘ f = f. These compositions are illustrated by two commutative diagrams.

    [Two commutative triangles: one with 1_A on A, f from A to B, and f ∘ 1_A along the diagonal; one with f from A to B, 1_B on B, and 1_B ∘ f along the diagonal.]

7. For all f : A → B, g : B → C, and h : C → D, (h∘g)∘f = h∘(g∘f). In the diagram below, if the two triangles each commute, then the parallelogram commutes.

    [Diagram: f from A to B, g from B to C, h from C to D, with diagonals g∘f from A to C and h∘g from B to D.]

A diagram of objects and morphisms in a category is said to commute, or be a commutative diagram, if any two paths of morphisms (in the direction of the arrows) between any two objects yield equal compositions.

Isomorphisms in a category C. Although only morphisms are defined in a category, it's easy to determine which ones are isomorphisms. A morphism f : A → B is an isomorphism if there exists another morphism g : B → A, called its inverse, such that f ∘ g = 1_B and g ∘ f = 1_A.

Example 3 (The category of sets S). Although we're more interested in the category of vector spaces right now, the category S of sets is also relevant. An object in S is a set, and a morphism in S is a function. The domain and codomain of a morphism are just the domain and codomain of the function, and composition is composition of functions. If S and T are two sets, then Hom(S, T) is the set of all functions S → T. Isomorphisms in the category of sets are bijections.

Example 4 (The category of vector spaces V_F). Fix a field F. The objects in the category V_F are vector spaces over F and the morphisms are linear transformations. Different fields have different categories of vector spaces.
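To see the axioms and the isomorphism condition concretely in the category of sets, here is a small Python sketch; it is an illustration added here, not part of the original notes, and it uses small finite sets so the laws can be checked by brute force. Morphisms are ordinary Python functions, composition is function composition, the identity morphism is the identity function, and the particular bijection f and its inverse g are hypothetical examples chosen for the check.

```python
# Morphisms in the category of sets are functions; composition is
# function composition and 1_A is the identity function on A.
def compose(g, f):
    """Return the composition g ∘ f, i.e. the map x ↦ g(f(x))."""
    return lambda x: g(f(x))

def identity(x):
    return x

# Finite objects (sets) to test the axioms on.
A = {0, 1, 2}
B = {'a', 'b', 'c'}

# A bijection f : A -> B and its inverse g : B -> A, given by lookup tables.
f_table = {0: 'a', 1: 'b', 2: 'c'}
g_table = {'a': 0, 'b': 1, 'c': 2}
f = lambda x: f_table[x]
g = lambda y: g_table[y]

# Identity law (axiom 6): f ∘ 1_A = f and 1_B ∘ f = f.
assert all(compose(f, identity)(x) == f(x) == compose(identity, f)(x) for x in A)

# Associativity (axiom 7) on this example: (f ∘ g) ∘ f = f ∘ (g ∘ f).
assert all(compose(compose(f, g), f)(x) == compose(f, compose(g, f))(x) for x in A)

# Isomorphism condition: f ∘ g = 1_B and g ∘ f = 1_A, so f is an
# isomorphism in the category of sets, i.e. a bijection.
assert all(compose(f, g)(y) == y for y in B)
assert all(compose(g, f)(x) == x for x in A)
```

In the category of vector spaces the same checks would be phrased for linear transformations rather than arbitrary functions.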