
Math 20F, 2015SS1 / TA: Jor-el Briones / Sec: A01 / Handout

Homogeneous systems (1.5)

Homogeneous systems are linear systems of the form Ax = 0, where 0 is the zero vector.

Given a system Ax = b, suppose x = α + t1α1 + t2α2 + ... + tkαk is a solution (in parametric form) to the system for any values of t1, t2, ..., tk. Then α is a solution to the system Ax = b (seen by setting t1 = ... = tk = 0), and α1, α2, ..., αk are solutions to the homogeneous system Ax = 0.

Terms you should know

The zero solution (trivial solution): The zero solution is the 0 vector (a vector with all entries equal to 0), which is always a solution to the homogeneous system.

Particular solution: Given a system Ax = b, suppose x = α + t1α1 + t2α2 + ... + tkαk is a solution (in parametric form) to the system. Then α is the particular solution to the system, and the other terms constitute the solution to the associated homogeneous system Ax = 0.
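To see the split between the particular solution and the homogeneous part numerically, here is a small NumPy sketch. NumPy is not part of the handout, and the matrix A, the vector b, and the solutions below are made up purely for illustration.

    import numpy as np

    # Hypothetical system Ax = b (matrix and vectors made up for illustration).
    A = np.array([[1., 2., 1.],
                  [0., 1., 1.]])
    b = np.array([3., 1.])

    alpha  = np.array([1., 1., 0.])   # particular solution: A @ alpha equals b
    alpha1 = np.array([1., -1., 1.])  # homogeneous solution: A @ alpha1 equals 0

    print(A @ alpha)    # [3. 1.]
    print(A @ alpha1)   # [0. 0.]

    # alpha + t*alpha1 solves Ax = b for every choice of t.
    for t in (0.0, 2.0, -5.0):
        print(np.allclose(A @ (alpha + t * alpha1), b))   # True each time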

Properties of a Homogeneous System

1. A homogeneous system is ALWAYS consistent, since the zero solution, aka the trivial solution, is always a solution to that system.

2. A homogeneous system with at least one free variable has infinitely many solutions.

3. A homogeneous system with more unknowns than equations has infinitely many solutions.

4. If α1, α2, ..., αk are solutions to a homogeneous system, then ANY linear combination of α1, α2, ..., αk is also a solution to the homogeneous system.
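As a quick numerical check of properties 3 and 4, here is a small NumPy sketch; the matrix and the two solutions u and v are made up for illustration and are not from the handout.

    import numpy as np

    # Hypothetical homogeneous system with more unknowns (3) than equations (2).
    A = np.array([[1., 2., -1.],
                  [2., 4., -2.]])

    u = np.array([1., 0., 1.])   # A @ u gives the zero vector
    v = np.array([0., 1., 2.])   # A @ v gives the zero vector

    # Property 4: any linear combination of solutions is again a solution.
    c1, c2 = 3.0, -7.0
    print(A @ (c1 * u + c2 * v))   # [0. 0.]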

Important theorems to know

Theorem. (Chapter 1, Theorem 6) Given a consistent system Ax = b, suppose α is a solution to the system. Then the solution set of Ax = b is the set of all vectors of the form x = α + t1α1 + t2α2 + ... + tkαk, where y = t1α1 + t2α2 + ... + tkαk is the parametric form of the solution to the homogeneous system Ax = 0.

Linear Independence and Dependence (1.7)

Terms you should know

Linearly Dependent: A set of vectors is linearly dependent if at least one vector of the set can be expressed as a linear combination of the other vectors in that set. Note that this is equivalent to the homogeneous system having a non-zero solution.

Linearly independent: A set of vectors is linearly independent if the set is NOT linearly dependent. So NONE of the vectors may be written as a linear combination of any of the others. Note that this is equivalent to the homogeneous system having ONLY the zero solution.
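One way to test independence numerically (not something the handout requires) is to compare the rank of the matrix whose columns are the vectors with the number of vectors; the vectors below are made up for illustration.

    import numpy as np

    # Hypothetical vectors, stacked as the columns of a matrix A.
    v1 = np.array([1., 0., 2.])
    v2 = np.array([0., 1., 1.])
    v3 = np.array([2., 3., 7.])   # v3 = 2*v1 + 3*v2, so the set is dependent

    A = np.column_stack([v1, v2, v3])

    # Ax = 0 has only the zero solution exactly when rank(A) equals the number of columns.
    rank = np.linalg.matrix_rank(A)
    print("independent" if rank == A.shape[1] else "dependent")   # dependent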

Important theorems to know

Theorem. (Chapter 1, Theorem 8) Let v1, v2, ..., vk be vectors in R^n. If k > n, then v1, v2, ..., vk are linearly dependent.

Theorem. (Chapter 1, Theorem 9) If a set S = {v1, v2, ..., vk} contains the zero vector, then the set is linearly dependent.
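Both theorems can be sanity-checked the same way; the specific vectors below are arbitrary choices, not examples from the text.

    import numpy as np

    # Theorem 8: three vectors in R^2 (k = 3 > n = 2) are automatically dependent.
    A = np.column_stack([[1., 2.], [3., 4.], [5., 6.]])
    print(np.linalg.matrix_rank(A) < A.shape[1])   # True -> dependent

    # Theorem 9: a set that contains the zero vector is dependent.
    B = np.column_stack([[1., 0., 0.], np.zeros(3)])
    print(np.linalg.matrix_rank(B) < B.shape[1])   # True -> dependent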

Matrix Operations and Inverses (2.1-2.2)

Terms you should know

Identity matrix (I): An identity matrix I is a square matrix with 1's along the main diagonal (the entries in the ith row and ith column) and 0's for every other entry. Examples:

[1 0]    [1 0 0]    [1 0 0 0]
[0 1]    [0 1 0]    [0 1 0 0]
         [0 0 1]    [0 0 1 0]
                    [0 0 0 1]

Invertibility: A matrix A is invertible if there is a matrix B such that AB = I and BA = I, where I is an identity matrix.

Transpose: Let A be some m × n matrix. The transpose of A, which we write as A^T, is the n × m matrix where the entry in the ith column and jth row of A^T is equal to the entry in the ith row and jth column of A. Example:

    [1 2]            [1 3 5]
A = [3 4]      A^T = [2 4 6]
    [5 6]

    [ 1  2  3  4]          [ 1  5  9 13]
B = [ 5  6  7  8]    B^T = [ 2  6 10 14]
    [ 9 10 11 12]          [ 3  7 11 15]
    [13 14 15 16]          [ 4  8 12 16]

Matrix Multiplication

If A is an m × n matrix and B is an n × p matrix with columns b1, b2, ..., bp, then the matrix product AB is the m × p matrix whose columns are Ab1, Ab2, ..., Abp.

AB = A[b1 b2 ... bp] = [Ab1 Ab2 ... Abp]

NOTE: Matrices do NOT work exactly like numbers. You cannot "divide by" a matrix, for example. Keep in mind the following facts:

1. AB ≠ BA in general (sometimes the two are equal, but most of the time they are not).

2. If AB = AC, it is not true in general that B = C (you can't just cancel out terms all the time).

3. If the product AB is the zero matrix, you CANNOT in general say that either A = 0 or B = 0.
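Here is a short NumPy sketch of the column-by-column definition and of fact 1; the matrices are made up for illustration.

    import numpy as np

    # Small matrices chosen for illustration.
    A = np.array([[1., 2.],
                  [3., 4.]])
    B = np.array([[0., 1.],
                  [1., 0.]])

    # Column-by-column view: column j of AB is A times column j of B.
    AB_cols = np.column_stack([A @ B[:, j] for j in range(B.shape[1])])
    print(np.allclose(AB_cols, A @ B))   # True

    # Fact 1: multiplication is not commutative in general.
    print(np.allclose(A @ B, B @ A))     # False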

Important theorems to know

Theorem. (Chapter 2, Theorem 1) Let A, B, and C be matrices of the same size, and let r and s be numbers. Then the following hold:

1. A + B = B + A
2. (A + B) + C = A + (B + C)
3. A + 0 = A
4. r(A + B) = rA + rB
5. (r + s)A = rA + sA
6. r(sA) = (rs)A

Theorem. (Chapter 2, Theorem 2) Let A be an m × n matrix, and let B and C be matrices with sizes such that the following multiplications are valid. Then:

1. A(BC) = (AB)C
2. A(B + C) = AB + AC
3. (B + C)A = BA + CA
4. r(AB) = (rA)B = A(rB) for any number r
5. I_m A = A = A I_n
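These identities are easy to spot-check numerically; the sizes and the random matrices below are arbitrary choices, not part of the handout.

    import numpy as np

    # Random matrices with compatible sizes (sizes chosen arbitrarily).
    rng = np.random.default_rng(0)
    A = rng.standard_normal((2, 3))
    B = rng.standard_normal((3, 4))
    C = rng.standard_normal((4, 2))

    print(np.allclose(A @ (B @ C), (A @ B) @ C))   # property 1: associativity
    print(np.allclose(np.eye(2) @ A, A))           # I_m A = A  (property 5)
    print(np.allclose(A @ np.eye(3), A))           # A I_n = A  (property 5)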

Theorem. (Chapter 2, Theorem 3) Let A and B denote matrices whose sizes allow the following operations to be valid. Then the following statements are true.

1. (A^T)^T = A
2. (A + B)^T = A^T + B^T
3. For any number r, (rA)^T = rA^T
4. (AB)^T = B^T A^T
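The transpose rules, in particular the order reversal in property 4, can be verified the same way; the random matrices below are arbitrary.

    import numpy as np

    # Matrices with sizes that make AB defined (chosen arbitrarily).
    rng = np.random.default_rng(1)
    A = rng.standard_normal((2, 3))
    B = rng.standard_normal((3, 4))

    print(np.allclose((A @ B).T, B.T @ A.T))   # property 4: the order reverses
    print(np.allclose((A + A).T, A.T + A.T))   # property 2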

Theorem. (Chapter 2, Theorem 5) If A is an invertible n × n matrix, then for each and every b in R^n, the equation Ax = b has the unique solution x = A^-1 b.

Theorem. (Chapter 2, Theorem 6)

1. If A is an invertible matrix, then its inverse A^-1 is also invertible, and (A^-1)^-1 = A.

2. If A and B are n × n invertible matrices, then so is AB, and the inverse of AB is the product of the inverses of A and B in the reverse order. That is, (AB)^-1 = B^-1 A^-1.

3. If A is an invertible matrix, then so is A^T, and the inverse of A^T is the transpose of A^-1. That is, (A^T)^-1 = (A^-1)^T.
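As a last sketch, here are Theorem 5 and part 2 of Theorem 6 checked numerically; the matrices A and B and the vector b are made up for illustration.

    import numpy as np

    # An invertible 2 x 2 matrix and a right-hand side, chosen for illustration.
    A = np.array([[2., 1.],
                  [1., 1.]])
    b = np.array([3., 2.])

    x = np.linalg.inv(A) @ b        # Theorem 5: x = A^-1 b
    print(np.allclose(A @ x, b))    # True

    B = np.array([[1., 2.],
                  [0., 1.]])
    # Theorem 6, part 2: (AB)^-1 = B^-1 A^-1 (note the reversed order).
    print(np.allclose(np.linalg.inv(A @ B), np.linalg.inv(B) @ np.linalg.inv(A)))  # True

In actual computations you would usually call np.linalg.solve(A, b) rather than forming A^-1 explicitly, but the inverse makes the theorem statement easy to see.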