Linear Algebra (VII)


Yijia Chen

1. Review

Basis and dimension. We fix a vector space V.

Lemma 1.1. Let A, B ⊆ V be two finite sets of vectors in V, possibly empty. If A is linearly independent and can be represented by B, then |A| ≤ |B|.

Theorem 1.2. Let S ⊆ V and let A, B ⊆ S both be maximally linearly independent in S. Then |A| = |B|.

Definition 1.3. Let e_1, …, e_n ∈ V. Assume that
– e_1, …, e_n are linearly independent, and
– every v ∈ V can be represented by e_1, …, e_n.
Equivalently, {e_1, …, e_n} is maximally linearly independent in V. Then {e_1, …, e_n} is a basis of V.

Note that n = 0 is allowed, and in that case it is easy to see that V = {0}.

By Theorem 1.2:

Lemma 1.4. If {e_1, …, e_n} and {e'_1, …, e'_m} are both bases of V with pairwise distinct e_i's and with pairwise distinct e'_i's, then n = m.

Definition 1.5. Let {e_1, …, e_n} be a basis of V with pairwise distinct e_i's. Then the dimension of V, denoted by dim(V), is n. Equivalently, if rank(V) is defined, then dim(V) := rank(V).

Theorem 1.6. Assume dim(V) = n and let u_1, …, u_n ∈ V. (We do not assume beforehand that u_1, …, u_n are pairwise distinct, although under the conditions in the theorem they have to be, i.e., u_i ≠ u_j for every 1 ≤ i < j ≤ n.)
(1) If u_1, …, u_n are linearly independent, then {u_1, …, u_n} is a basis.
(2) If every v ∈ V can be represented by u_1, …, u_n, then {u_1, …, u_n} is a basis.

Steinitz exchange lemma.

Theorem 1.7. Assume that dim(V) = n and that v_1, …, v_m ∈ V with 1 ≤ m ≤ n are linearly independent. Furthermore, let {e_1, …, e_n} be a basis of V. Then for some 1 ≤ i_1 < i_2 < ··· < i_{n−m} ≤ n,
\[
v_1, \dots, v_m, e_{i_1}, \dots, e_{i_{n-m}}
\]
is a basis of V.

Proof: We prove by induction on m and start with m = 1. Since {e_1, …, e_n} is a basis, v_1 can be represented by e_1, …, e_n. Thus, there exist a_1, …, a_n ∈ R such that
\[
v_1 = \sum_{i \in [n]} a_i e_i.
\]
As v_1 ≠ 0 (otherwise v_1 would be linearly dependent), there is an i ∈ [n] with a_i ≠ 0. It follows that
\[
e_i = \frac{1}{a_i} \cdot v_1 + \sum_{j \in [i-1]} \Big(-\frac{a_j}{a_i}\Big) \cdot e_j + \sum_{i < j \le n} \Big(-\frac{a_j}{a_i}\Big) \cdot e_j.
\]
In other words, e_i can be represented by {v_1, e_1, …, e_{i−1}, e_{i+1}, …, e_n}. Thus {e_1, …, e_n} can be represented by {v_1, e_1, …, e_{i−1}, e_{i+1}, …, e_n}, and hence so can every vector in V, since {e_1, …, e_n} is a basis. By Theorem 1.6 (2), {v_1, e_1, …, e_{i−1}, e_{i+1}, …, e_n} is a basis.

Now assume that m > 1 and v_1, …, v_m ∈ V are linearly independent. Of course v_1, …, v_{m−1} are linearly independent too. By the induction hypothesis on m − 1, there exist 1 ≤ i_1 < i_2 < ··· < i_{n−m+1} ≤ n such that
\[
v_1, \dots, v_{m-1}, e_{i_1}, \dots, e_{i_{n-m+1}}
\]
is a basis of V. In particular, v_m can be represented by this basis, i.e., there exist a_1, …, a_{m−1}, c_1, …, c_{n−m+1} ∈ R such that
\[
v_m = \sum_{i \in [m-1]} a_i v_i + \sum_{j \in [n-m+1]} c_j e_{i_j}.
\]
Observe that ∑_{j ∈ [n−m+1]} c_j e_{i_j} ≠ 0, otherwise v_1, …, v_m would be linearly dependent. Thus, for some j ∈ [n − m + 1] we have c_j ≠ 0, and thereby
\[
e_{i_j} = \sum_{i \in [m-1]} \Big(-\frac{a_i}{c_j}\Big) \cdot v_i + \frac{1}{c_j} \cdot v_m + \sum_{\ell \in [n-m+1] \setminus \{j\}} \Big(-\frac{c_\ell}{c_j}\Big) \cdot e_{i_\ell}.
\]
Then it is easy to see that
\[
v_1, \dots, v_m, e_{i_1}, \dots, e_{i_{j-1}}, e_{i_{j+1}}, \dots, e_{i_{n-m+1}}
\]
is a basis of V. □

Remark 1.8. (i) The above proof is in fact essentially the same as the proof of Lemma 1.1. (ii) Again, we can drop the requirement 1 ≤ m; in particular, the case m = 0 holds trivially.
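The proof of Theorem 1.7 is constructive, and for V = R^n it can be carried out numerically: repeatedly write the next v in the current basis and exchange it for a remaining e_i whose coefficient is nonzero. The following Python/numpy sketch illustrates the procedure; the function name, the tolerance, and the sample vectors are illustrative choices, not part of the notes, and indices are 0-based.

```python
import numpy as np

def steinitz_exchange(vs, es, tol=1e-10):
    """Steinitz exchange (Theorem 1.7) in V = R^n.

    vs: linearly independent vectors v_1, ..., v_m;
    es: a basis e_1, ..., e_n of R^n.
    Returns the (0-based) indices i_1 < ... < i_{n-m} such that
    v_1, ..., v_m, e_{i_1}, ..., e_{i_{n-m}} is again a basis.
    """
    n = len(es)
    basis = list(es)           # current basis, initially e_1, ..., e_n
    from_e = list(range(n))    # from_e[k] = original index if basis[k] is still an e, else None
    for v in vs:
        # express v in the current basis: column_stack(basis) @ coeffs = v
        coeffs = np.linalg.solve(np.column_stack(basis), v)
        # pick a slot that still holds some e_i and has a nonzero coefficient;
        # the proof guarantees such a slot exists
        k = next(k for k in range(n)
                 if from_e[k] is not None and abs(coeffs[k]) > tol)
        basis[k], from_e[k] = v, None   # exchange that e_i for v
    return sorted(i for i in from_e if i is not None)

# Tiny example: the standard basis of R^3 and two linearly independent vectors.
es = [np.array([1.0, 0, 0]), np.array([0, 1.0, 0]), np.array([0, 0, 1.0])]
vs = [np.array([1.0, 1, 0]), np.array([0, 1.0, 1])]
print(steinitz_exchange(vs, es))    # [2]: v_1, v_2, e_3 form a basis of R^3
```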
2. Back to the Textbook

Matrices and matrix operations. Recall that an m × n matrix has the form
\[
A = \begin{pmatrix} a_{11} & a_{12} & \cdots & a_{1n} \\ a_{21} & a_{22} & \cdots & a_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ a_{m1} & a_{m2} & \cdots & a_{mn} \end{pmatrix} = (a_{ij})_{m \times n},
\]
where each a_ij ∈ R. The transpose of A, denoted by A^T, is the n × m matrix
\[
A^T = \begin{pmatrix} a_{11} & a_{21} & \cdots & a_{m1} \\ a_{12} & a_{22} & \cdots & a_{m2} \\ \vdots & \vdots & \ddots & \vdots \\ a_{1n} & a_{2n} & \cdots & a_{mn} \end{pmatrix}.
\]

Definition 2.1 (Matrix Addition). Let A = (a_ij)_{m×n} and B = (b_ij)_{m×n} be two m × n-matrices. Then
\[
A + B := (a_{ij} + b_{ij})_{m \times n} = \begin{pmatrix} a_{11} + b_{11} & \cdots & a_{1n} + b_{1n} \\ \vdots & \ddots & \vdots \\ a_{m1} + b_{m1} & \cdots & a_{mn} + b_{mn} \end{pmatrix}.
\]

Definition 2.2. The zero m × n-matrix is
\[
0_{m \times n} = \begin{pmatrix} 0 & \cdots & 0 \\ \vdots & \ddots & \vdots \\ 0 & \cdots & 0 \end{pmatrix}.
\]
When m and n are clear from the context, we write 0 instead of 0_{m×n}.

Lemma 2.3. (i) A + B = B + A. (ii) (A + B) + C = A + (B + C). (iii) (A + B)^T = A^T + B^T.

Definition 2.4 (Scalar Multiplication). Let A = (a_ij)_{m×n} be an m × n-matrix and k ∈ R. Then
\[
k \cdot A := (k \cdot a_{ij})_{m \times n} = \begin{pmatrix} k \cdot a_{11} & \cdots & k \cdot a_{1n} \\ \vdots & \ddots & \vdots \\ k \cdot a_{m1} & \cdots & k \cdot a_{mn} \end{pmatrix}.
\]

Lemma 2.5. Let A and B be two m × n-matrices and k, ℓ ∈ R. (i) k · (ℓ · A) = (k · ℓ) · A. (ii) (k + ℓ) · A = k · A + ℓ · A. (iii) k · (A + B) = k · A + k · B. (iv) (k · A)^T = k · A^T.

Definition 2.6. For every A = (a_ij)_{m×n} we define −A := (−1) · A = (−a_ij)_{m×n}.

Lemma 2.7. A + (−A) = 0.

2.1. Matrix multiplication.

Definition 2.8. Let m, n, r ≥ 1, let A be an m × r-matrix, and let B be an r × n-matrix. Then C := AB = (c_ij)_{m×n} is the m × n-matrix where each
\[
c_{ij} := \sum_{\ell \in [r]} a_{i\ell} b_{\ell j}.
\]

Although matrix multiplication seems arbitrary at first sight, we have seen that it can be understood in the context of substituting the variables of one system of linear equations by another system of linear equations. The three matrices A, B, and C correspond to the coefficients of the three systems.

Remark 2.9. Unlike most multiplications we have encountered before, matrix multiplication is not commutative.
– AB and BA can have different sizes, or one of them may not even be defined. For instance, if A is a 1 × 3-matrix and B is a 3 × 1-matrix, then AB is a 1 × 1-matrix, while BA is 3 × 3.
– Even if they have the same size, AB and BA can be different matrices. For instance, for
\[
A := \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix} \quad\text{and}\quad B := \begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix}
\quad\text{we have}\quad
AB = \begin{pmatrix} 0 & 0 \\ 0 & 0 \end{pmatrix} \quad\text{and}\quad BA = \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}.
\]
This is hardly a surprise when viewing AB as substituting the variables x_i by the y_i, and BA the other way around, in two systems of linear equations.

Definition 2.10. For n ≥ 1 we define
\[
I_n = \begin{pmatrix} 1 & 0 & \cdots & 0 \\ 0 & 1 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & 1 \end{pmatrix}.
\]
I_n is called the (n × n) identity matrix. (Note that I_n, the identity of the matrix space M_{n×n}(R), should not be confused with the scalar 1 ∈ R.)

Lemma 2.11. (i) (AB)C = A(BC). (ii) C(A + B) = CA + CB and (A + B)C = AC + BC. (iii) k · (AB) = (k · A)B = A(k · B). (iv) Assume that A is an m × n-matrix. Then I_m A = A I_n = A.

3. Block Matrices

For mostly computational reasons, we sometimes need to partition a matrix into several submatrices, turning it into a block matrix. The following is an example:
\[
A = \begin{pmatrix} 1 & 2 & -1 & 0 \\ 2 & 5 & 0 & -2 \\ 3 & 1 & -1 & 3 \end{pmatrix}
  = \begin{pmatrix} A_{11} & A_{12} \\ A_{21} & A_{22} \end{pmatrix},
\]
where
\[
A_{11} = \begin{pmatrix} 1 & 2 \\ 2 & 5 \end{pmatrix}, \quad
A_{12} = \begin{pmatrix} -1 & 0 \\ 0 & -2 \end{pmatrix}, \quad
A_{21} = \begin{pmatrix} 3 & 1 & -1 \end{pmatrix}, \quad
A_{22} = \begin{pmatrix} 3 \end{pmatrix}.
\]

In general, for some p, q ∈ N with 1 ≤ p ≤ m and 1 ≤ q ≤ n, we break the m rows of A into p blocks of m_1, …, m_p rows and similarly break the n columns of A into q blocks of n_1, …, n_q columns:
\[
A = \begin{pmatrix} A_{11} & A_{12} & \cdots & A_{1q} \\ A_{21} & A_{22} & \cdots & A_{2q} \\ \vdots & \vdots & \ddots & \vdots \\ A_{p1} & A_{p2} & \cdots & A_{pq} \end{pmatrix}.
\]
Here, each A_ij is an m_i × n_j-matrix for i ∈ [p] and j ∈ [q].

Remark 3.1. Note that for fixed p and q we might still get different block decompositions (A_ij)_{p×q}, since the decomposition depends on the choices of the m_i's and n_j's.

Example 3.2. Let
\[
A = \begin{pmatrix}
 1 & 1 & 0 & 0 & 0 \\
-1 & 1 & 0 & 0 & 0 \\
 0 & 0 & 1 & 0 & 0 \\
 0 & 0 & 1 & 1 & 0 \\
 0 & 0 & 0 & 0 & 1
\end{pmatrix}
= \begin{pmatrix} A_1 & 0 & 0 \\ 0 & A_2 & 0 \\ 0 & 0 & A_3 \end{pmatrix},
\]
where
\[
A_1 = \begin{pmatrix} 1 & 1 \\ -1 & 1 \end{pmatrix}, \quad
A_2 = \begin{pmatrix} 1 & 0 \\ 1 & 1 \end{pmatrix} \quad\text{and}\quad
A_3 = \begin{pmatrix} 1 \end{pmatrix}.
\]
The matrix
\[
\begin{pmatrix} A_1 & 0 & 0 \\ 0 & A_2 & 0 \\ 0 & 0 & A_3 \end{pmatrix}
\]
is a block diagonal matrix.
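Block matrices are convenient to experiment with numerically. As a small sketch (not part of the notes), numpy's np.block assembles a matrix from a nested list of blocks; the following reproduces the block diagonal matrix of Example 3.2, writing out the zero blocks with their shapes.

```python
import numpy as np

# The diagonal blocks of Example 3.2.
A1 = np.array([[ 1, 1],
               [-1, 1]])
A2 = np.array([[ 1, 0],
               [ 1, 1]])
A3 = np.array([[ 1]])

# Assemble the 5x5 block diagonal matrix  [[A1, 0, 0], [0, A2, 0], [0, 0, A3]].
A = np.block([
    [A1,               np.zeros((2, 2)), np.zeros((2, 1))],
    [np.zeros((2, 2)), A2,               np.zeros((2, 1))],
    [np.zeros((1, 2)), np.zeros((1, 2)), A3              ],
])
print(A)
```

For block diagonal matrices specifically, scipy.linalg.block_diag(A1, A2, A3) builds the same matrix without spelling out the zero blocks.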
3.1. Operations on block matrices.

Transpose. Assume
\[
A = \begin{pmatrix} A_{11} & A_{12} & \cdots & A_{1q} \\ A_{21} & A_{22} & \cdots & A_{2q} \\ \vdots & \vdots & \ddots & \vdots \\ A_{p1} & A_{p2} & \cdots & A_{pq} \end{pmatrix}.
\]
Then
\[
A^T = \begin{pmatrix} A_{11}^T & A_{21}^T & \cdots & A_{p1}^T \\ A_{12}^T & A_{22}^T & \cdots & A_{p2}^T \\ \vdots & \vdots & \ddots & \vdots \\ A_{1q}^T & A_{2q}^T & \cdots & A_{pq}^T \end{pmatrix}.
\]

Addition.
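The extract breaks off at the heading for block-wise addition. As a quick numerical sketch (the matrices and the partition below are invented for illustration), one can check that the transpose rule above holds, and that adding two identically partitioned matrices likewise works block by block, which is presumably what the truncated subsection goes on to state.

```python
import numpy as np

rng = np.random.default_rng(0)

# A 3x4 matrix split into a 2x2 grid of blocks (block rows of heights 2 and 1,
# block columns of widths 2 and 2) -- sizes chosen only for illustration.
A11, A12 = rng.integers(-3, 4, (2, 2)), rng.integers(-3, 4, (2, 2))
A21, A22 = rng.integers(-3, 4, (1, 2)), rng.integers(-3, 4, (1, 2))
A = np.block([[A11, A12], [A21, A22]])

# Transpose acts blockwise: A^T = [[A11^T, A21^T], [A12^T, A22^T]].
assert np.array_equal(A.T, np.block([[A11.T, A21.T], [A12.T, A22.T]]))

# Addition of two matrices with the same partition also acts blockwise.
B11, B12 = rng.integers(-3, 4, (2, 2)), rng.integers(-3, 4, (2, 2))
B21, B22 = rng.integers(-3, 4, (1, 2)), rng.integers(-3, 4, (1, 2))
B = np.block([[B11, B12], [B21, B22]])
assert np.array_equal(A + B, np.block([[A11 + B11, A12 + B12],
                                       [A21 + B21, A22 + B22]]))
print("blockwise transpose and addition verified")
```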