
Final Exam Outline

1.1, 1.2 Scalars and vectors in Rn. The magnitude of a vector. Row and column vectors. Vector addition. Scalar multiplication. The zero vector. Vector subtraction. Distance in Rn. Unit vectors. Linear combinations. The dot product. The angle between two vectors. Orthogonal vectors. The CBS and triangle inequalities. Theorem 1.5.
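The dot product, angle formula and the two inequalities above can be checked numerically; a minimal Python sketch (function names are illustrative, assuming the standard definitions):

```python
import math

def dot(u, v):
    # Dot product of two vectors in R^n.
    return sum(a * b for a, b in zip(u, v))

def norm(v):
    # Magnitude ||v|| = sqrt(v . v).
    return math.sqrt(dot(v, v))

u, v = [1.0, 2.0, 2.0], [3.0, 0.0, 4.0]

# Angle between u and v: cos(theta) = (u . v) / (||u|| ||v||).
theta = math.acos(dot(u, v) / (norm(u) * norm(v)))

# Cauchy-Bunyakovsky-Schwarz inequality: |u . v| <= ||u|| ||v||.
assert abs(dot(u, v)) <= norm(u) * norm(v)

# Triangle inequality: ||u + v|| <= ||u|| + ||v||.
w = [a + b for a, b in zip(u, v)]
assert norm(w) <= norm(u) + norm(v)
```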

2.1, 2.2 Systems of linear equations. Consistent and inconsistent systems. The coefficient and augmented matrices. Elementary row operations. The echelon form of a matrix. Gaussian elimination. Equivalent matrices. The reduced row echelon form. Gauss-Jordan elimination. Leading and free variables. The rank of a matrix. The rank theorem. Homogeneous systems. Theorem 2.2.
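Gauss-Jordan elimination can be sketched directly; a minimal Python version (the helper name `rref` is illustrative), reducing an augmented matrix to reduced row echelon form:

```python
def rref(M):
    # Gauss-Jordan elimination: reduce M to reduced row echelon form.
    A = [row[:] for row in M]
    rows, cols = len(A), len(A[0])
    pivot_row = 0
    for col in range(cols):
        # Find a pivot in this column at or below pivot_row.
        pr = next((r for r in range(pivot_row, rows) if abs(A[r][col]) > 1e-12), None)
        if pr is None:
            continue  # free column: no pivot here
        A[pivot_row], A[pr] = A[pr], A[pivot_row]
        # Scale the pivot row so the leading entry is 1.
        p = A[pivot_row][col]
        A[pivot_row] = [x / p for x in A[pivot_row]]
        # Eliminate every other entry in the pivot column.
        for r in range(rows):
            if r != pivot_row and abs(A[r][col]) > 1e-12:
                f = A[r][col]
                A[r] = [a - f * b for a, b in zip(A[r], A[pivot_row])]
        pivot_row += 1
        if pivot_row == rows:
            break
    return A

# Augmented matrix of x + y = 3, 2x - y = 0; the RREF encodes x = 1, y = 2.
R = rref([[1.0, 1.0, 3.0], [2.0, -1.0, 0.0]])
```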

2.3 The span of a set of vectors. Linear dependence and independence. Determining whether a set {v1, . . . , vk} of vectors in Rn is linearly dependent or independent.
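The independence test reduces to counting pivots: put the vectors in the rows of a matrix and row-reduce. A small Python sketch of that idea (the function name is illustrative):

```python
def is_independent(vectors):
    # The vectors are independent iff forward elimination on the matrix
    # whose rows are the vectors produces a pivot in every row.
    A = [list(map(float, v)) for v in vectors]
    pivots = 0
    for col in range(len(A[0])):
        pr = next((r for r in range(pivots, len(A)) if abs(A[r][col]) > 1e-12), None)
        if pr is None:
            continue
        A[pivots], A[pr] = A[pr], A[pivots]
        for r in range(pivots + 1, len(A)):
            f = A[r][col] / A[pivots][col]
            A[r] = [a - f * b for a, b in zip(A[r], A[pivots])]
        pivots += 1
    return pivots == len(A)

# Dependent: the third vector is the sum of the first two.
dep = is_independent([[1, 0, 1], [0, 1, 1], [1, 1, 2]])   # False
```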

3.1, 3.2 Matrix operations: scalar multiplication, matrix addition, matrix multiplication. The zero and identity matrices. Partitioned matrices. The outer product form. The standard unit vectors. The transpose of a matrix. A system of linear equations in matrix form. Algebraic properties of the transpose.
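The key algebraic property of the transpose, (AB)^T = B^T A^T, can be verified on a small example; a Python sketch with illustrative helper names:

```python
def matmul(A, B):
    # Product of two matrices given as lists of rows.
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def transpose(A):
    # Rows become columns and vice versa.
    return [list(col) for col in zip(*A)]

A = [[1, 2], [3, 4]]
B = [[0, 1], [1, 1]]

# (AB)^T = B^T A^T -- note the reversal of the factors.
lhs = transpose(matmul(A, B))
rhs = matmul(transpose(B), transpose(A))
assert lhs == rhs
```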

3.3 The inverse of a matrix. Invertible (nonsingular) matrices. The uniqueness of the solution to Ax = b when A is invertible. The inverse of a nonsingular 2 × 2 matrix. Elementary matrices and EROs. Equivalent conditions for invertibility. Using Gauss-Jordan elimination to invert a nonsingular matrix. Theorems 3.7, 3.9.
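The 2 × 2 inverse formula and the uniqueness of the solution to Ax = b are easy to illustrate; a minimal Python sketch (names illustrative):

```python
def inverse_2x2(A):
    # Inverse of [[a, b], [c, d]] is (1/(ad - bc)) * [[d, -b], [-c, a]],
    # defined exactly when det = ad - bc != 0 (A nonsingular).
    (a, b), (c, d) = A
    det = a * d - b * c
    if det == 0:
        raise ValueError("matrix is singular")
    return [[d / det, -b / det], [-c / det, a / det]]

A = [[2.0, 1.0], [5.0, 3.0]]   # det = 2*3 - 1*5 = 1
Ainv = inverse_2x2(A)

# When A is invertible, Ax = b has the unique solution x = A^{-1} b.
b = [1.0, 1.0]
x = [Ainv[0][0] * b[0] + Ainv[0][1] * b[1],
     Ainv[1][0] * b[0] + Ainv[1][1] * b[1]]
```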

3.5 Subspaces of Rn. {0} and Rn as subspaces of Rn. The subspace span(v1, . . . , vk). The row, column and null spaces of a matrix. Basis and dimension. Using Gaussian elimination to find bases for subspaces of the form span(v1, . . . , vk) and for null(A). The nullity of a matrix. Characterizations of rank: rank(A) = dim(col(A)) = dim(row(A)). The rank theorem. The Fundamental Theorem of Invertible Matrices. Coordinates with respect to a basis. Theorems 3.23-3.26, 3.27 (parts), 3.28 and 3.29.
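The rank theorem, rank(A) + nullity(A) = n (the number of columns), can be checked by counting pivots; a Python sketch (helper name illustrative):

```python
def rank(M):
    # Rank = number of pivots after forward (Gaussian) elimination.
    A = [list(map(float, row)) for row in M]
    pivots = 0
    for col in range(len(A[0])):
        pr = next((r for r in range(pivots, len(A)) if abs(A[r][col]) > 1e-12), None)
        if pr is None:
            continue
        A[pivots], A[pr] = A[pr], A[pivots]
        for r in range(pivots + 1, len(A)):
            f = A[r][col] / A[pivots][col]
            A[r] = [a - f * b for a, b in zip(A[r], A[pivots])]
        pivots += 1
    return pivots

A = [[1, 2, 3], [2, 4, 6], [1, 0, 1]]   # row 2 = 2 * row 1, so rank < 3
n = len(A[0])
# Rank theorem: rank(A) + nullity(A) = n.
nullity = n - rank(A)
```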

4.1, 4.3 Eigenvalues, eigenvectors and eigenspaces. The characteristic polynomial. Finding a basis for an eigenspace.
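For a 2 × 2 matrix the characteristic polynomial is det(A − λI) = λ² − tr(A)λ + det(A), so the eigenvalues come from the quadratic formula; a Python sketch assuming real eigenvalues (names illustrative):

```python
import math

def eigenvalues_2x2(A):
    # Roots of lambda^2 - tr(A) lambda + det(A); assumes the
    # discriminant is nonnegative (real eigenvalues).
    (a, b), (c, d) = A
    tr, det = a + d, a * d - b * c
    root = math.sqrt(tr * tr - 4 * det)
    return (tr - root) / 2, (tr + root) / 2

A = [[2.0, 1.0], [1.0, 2.0]]
lo, hi = eigenvalues_2x2(A)   # eigenvalues 1 and 3

# An eigenvector for lambda = 1 solves (A - I)v = 0, e.g. v = (1, -1):
v = [1.0, -1.0]
Av = [A[0][0] * v[0] + A[0][1] * v[1],
      A[1][0] * v[0] + A[1][1] * v[1]]
# Av equals 1 * v, confirming v spans part of the eigenspace for lambda = 1.
```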

4.2 Determinants. Cofactor expansions. The Laplace expansion theorem. Determinants of triangular matrices. Properties of determinants. Cramer’s rule. The classical adjoint of a matrix. Computing the inverse using the classical adjoint. Theorems 4.2, 4.3 (parts), 4.11 and 4.12.
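Cofactor (Laplace) expansion along the first row translates directly into a recursion; a Python sketch (function name illustrative), also checking the triangular-matrix rule:

```python
def det(A):
    # Laplace (cofactor) expansion along the first row.
    n = len(A)
    if n == 1:
        return A[0][0]
    total = 0
    for j in range(n):
        # Minor: delete row 0 and column j; the sign alternates as (-1)^j.
        minor = [row[:j] + row[j + 1:] for row in A[1:]]
        total += (-1) ** j * A[0][j] * det(minor)
    return total

# Determinant of a triangular matrix = product of its diagonal entries.
T = [[2, 7, 1], [0, 3, 5], [0, 0, 4]]
d = det(T)   # 2 * 3 * 4 = 24
```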

4.4 Similar matrices. Diagonalization of matrices. Theorems 4.23-4.25.
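Diagonalization A = PDP⁻¹ is equivalent to AP = PD, which avoids computing an inverse when checking by hand; a Python sketch using the example matrix below (eigenvalues worked out by hand, names illustrative):

```python
def matmul(A, B):
    # Product of two square matrices given as lists of rows.
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

# A = [[2, 1], [1, 2]] has eigenvalues 1 and 3 with eigenvectors
# (1, -1) and (1, 1); the eigenvectors form the columns of P.
A = [[2, 1], [1, 2]]
P = [[1, 1], [-1, 1]]
D = [[1, 0], [0, 3]]

# A is diagonalizable: AP = PD (equivalently A = P D P^{-1}).
assert matmul(A, P) == matmul(P, D)
```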

5.1 Orthogonal sets in Rn. Orthogonal and orthonormal bases. Orthogonal matrices. Theorems 5.1-5.8.
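A matrix with orthonormal columns is orthogonal, so Q^T Q = I and Q⁻¹ = Q^T; a Python check on a rotation matrix (helper names illustrative):

```python
import math

def transpose(A):
    return [list(col) for col in zip(*A)]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

# A rotation matrix has orthonormal columns, hence is orthogonal.
t = math.pi / 6
Q = [[math.cos(t), -math.sin(t)], [math.sin(t), math.cos(t)]]

# Q^T Q = I (up to floating-point rounding), so Q^{-1} = Q^T.
I = matmul(transpose(Q), Q)
assert all(abs(I[i][j] - (1 if i == j else 0)) < 1e-12
           for i in range(2) for j in range(2))
```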

5.2 The orthogonal complement W⊥ of a subspace W. The orthogonal projection. Theorem 5.9, parts (a), (c) and (d); Theorem 5.11 (the Orthogonal Decomposition Theorem).
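With an orthogonal basis {u1, . . . , uk} for W, the projection is proj_W(v) = Σ ((v·ui)/(ui·ui)) ui, and v − proj_W(v) lies in W⊥; a Python sketch (names illustrative):

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def proj(v, basis):
    # Orthogonal projection of v onto W = span(basis), assuming the
    # basis vectors are orthogonal: proj_W(v) = sum((v.u / u.u) * u).
    w = [0.0] * len(v)
    for u in basis:
        c = dot(v, u) / dot(u, u)
        w = [wi + c * ui for wi, ui in zip(w, u)]
    return w

# W = span{(1,0,0), (0,1,0)} (the xy-plane), v = (3, 4, 5).
basis = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]]
v = [3.0, 4.0, 5.0]
w = proj(v, basis)
perp = [vi - wi for vi, wi in zip(v, w)]

# Orthogonal Decomposition Theorem: v = w + perp, w in W, perp in W-perp.
assert all(abs(dot(perp, u)) < 1e-12 for u in basis)
```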

5.3 The Gram-Schmidt procedure.
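The Gram-Schmidt procedure subtracts from each vector its projections onto the vectors already produced, then normalizes; a Python sketch assuming the input vectors are linearly independent (function name illustrative):

```python
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def gram_schmidt(vectors):
    # Produce an orthonormal basis for span(vectors).
    basis = []
    for v in vectors:
        w = list(v)
        for u in basis:
            # u is already a unit vector, so the projection coefficient
            # is just w . u.
            c = dot(w, u)
            w = [wi - c * ui for wi, ui in zip(w, u)]
        n = math.sqrt(dot(w, w))
        basis.append([wi / n for wi in w])
    return basis

q1, q2 = gram_schmidt([[3.0, 4.0], [1.0, 0.0]])
# q1 and q2 are orthogonal unit vectors spanning R^2.
```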

5.4 The eigenvalues and eigenvectors of real, symmetric matrices. Orthogonal diagonalization of real, symmetric matrices. Theorems 5.17-5.19.
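A real symmetric matrix has orthogonal eigenvectors for distinct eigenvalues, giving an orthogonal diagonalization Q^T A Q = D; a Python check on a small example worked by hand (names illustrative):

```python
import math

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

# Symmetric A has eigenvalues 3 and 5 with orthogonal eigenvectors
# (1, -1) and (1, 1); normalizing them gives an orthogonal Q.
A = [[4.0, 1.0], [1.0, 4.0]]
s = 1 / math.sqrt(2)
Q = [[s, s], [-s, s]]
Qt = [[s, -s], [s, s]]

# Orthogonal diagonalization: Q^T A Q = D, diagonal up to rounding.
D = matmul(Qt, matmul(A, Q))
assert abs(D[0][1]) < 1e-12 and abs(D[1][0]) < 1e-12
```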