Basics of Linear Algebra
Jos and Sophia

Vectors
● Linear algebra definition: a list of numbers with a magnitude and a direction.
  ○ Magnitude: for a = [4, 3], |a| = sqrt(4^2 + 3^2) = 5
  ○ Direction: the angle in which the vector points
● Computer science definition: a list of numbers.
  ○ Example: Heights = [60, 68, 72, 67]

Dot Product of Vectors
● Formula: a · b = |a| × |b| × cos(θ) (a numerical sketch of this formula follows the Span section below)
● Definition: multiplication of two vectors which results in a scalar value
● In the diagram:
  ○ |a| is the magnitude (length) of vector a
  ○ |b| is the magnitude of vector b
  ○ θ is the angle between a and b

Matrix
● Definition:
  ○ a) A matrix is an arrangement of numbers into rows and columns.
  ○ b) A matrix is an m × n array of scalars from a given field F.
● Matrix elements: the individual values in the matrix are called entries.
● Matrix dimensions: the number of rows and columns of the matrix, in that order.

Multiplication of Matrices
● The multiplication of two matrices is built from dot products of rows of the 1st matrix with columns of the 2nd (a sketch of the dimension rule follows the Span section below).
  ○ Ex. (1, 2, 3) · (7, 9, 11) = 1×7 + 2×9 + 3×11 = 58
● Result matrix dimensions
  ○ Notation: (rows, columns)
  ○ Columns of the 1st matrix must equal the rows of the 2nd matrix
  ○ The result matrix has the number of rows of the 1st matrix and the number of columns of the 2nd matrix
  ○ Ex. 3 × 4 · 5 × 3
    ■ Does not work (the inner dimensions differ)
  ○ Ex. 5 × 3 · 3 × 4
    ■ Does work
    ■ Result: 5 × 4

Dot Product Application
● Application: ray tracing program
  ○ Quickly create an image with lower quality
  ○ A "refinement rendering pass" then occurs
    ■ Removes the jagged edges
  ○ The dot product is used to calculate
    ■ The intersection between a ray and a sphere
    ■ The length to the intersection points
● Application: forward propagation
  ○ Input matrix × weight matrix = prediction matrix
● Figure: http://immersivemath.com/ila/ch03_dotproduct/ch03.html#fig_dp_ray_tracer

Projections
One important use of dot products is in projections.
● Scalar projection (component): the scalar projection of b onto a is the length of the segment AB shown in the figure; comp_a(b) = (a · b) / |a|.
● Vector projection: the vector with this length that begins at the point A and points in the same direction as a (or the opposite direction if the scalar projection is negative); proj_a(b) = ((a · b) / |a|^2) a.
Thus, the scalar projection of b onto a is the magnitude of the vector projection of b onto a. (A short sketch of both formulas follows the Span section below.)

Transpose
● A matrix is flipped over its diagonal
● It switches the row and column indices of the matrix
  ○ Denoted by a superscript "T", as in A^T
● Usually square matrices are transposed
● Importance: the transpose of a matrix can reveal properties of the transformation
  ○ Ex. symmetric matrices (A = A^T)
  ○ Ex. the identity matrix

Rank
● Definition: the maximum number of linearly independent vectors among the rows (equivalently, the columns) of a given matrix
● Found by reducing the matrix to echelon form
  ○ The number of nonzero rows when the matrix is in echelon form equals the rank
  ○ Ex. a matrix A whose echelon form has 2 nonzero rows has rank 2 (a numerical sketch follows the Span section below)

Vector Space
● Vector space: a vector space V is a set that is closed under finite vector addition and scalar multiplication, where the two operations satisfy a set of conditions (called axioms), such as commutativity and associativity of vector addition, the existence of a zero vector and of additive inverses, and the distributive laws for scalar multiplication.

Span
● Definition: let v1, v2, …, vk be vectors in a vector space V. The vectors v1, v2, ..., vk span the vector space V provided that every vector in V is a linear combination of these k vectors.
● Example:
  ○ S = {(0,1,1), (1,0,1), (1,1,0)}
  ○ S spans R^3 since every vector in R^3 can be written as a linear combination of vectors in S (checked numerically in the sketch below).
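The dot-product formula a · b = |a| × |b| × cos(θ) from the slides above can be checked numerically. A minimal sketch, assuming NumPy is available; the vector b is a made-up example, not from the slides:

    import numpy as np

    a = np.array([4.0, 3.0])                     # |a| = 5, as in the Vectors slide
    b = np.array([1.0, 2.0])                     # made-up second vector

    # Component form of the dot product: sum of element-wise products.
    dot_components = np.dot(a, b)                # 4*1 + 3*2 = 10

    # Geometric form: |a| * |b| * cos(theta), with theta measured independently
    # as the difference of the two vectors' polar angles.
    theta = np.arctan2(b[1], b[0]) - np.arctan2(a[1], a[0])
    dot_geometric = np.linalg.norm(a) * np.linalg.norm(b) * np.cos(theta)

    print(dot_components)                        # 10.0
    print(round(dot_geometric, 10))              # 10.0, same value up to rounding

Both forms agree, which is exactly what the formula on the slide states.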
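The dimension rule from the Multiplication of Matrices slide (columns of the 1st matrix must equal rows of the 2nd, and the result has the 1st matrix's rows and the 2nd matrix's columns) can be illustrated the same way. A sketch assuming NumPy; the 5 × 3 and 3 × 4 matrices are filled with arbitrary placeholder values:

    import numpy as np

    # Row-times-column example from the slide: (1, 2, 3) . (7, 9, 11) = 58.
    row = np.array([1, 2, 3])
    col = np.array([7, 9, 11])
    print(row @ col)                             # 58

    A = np.arange(15).reshape(5, 3)              # arbitrary 5 x 3 matrix
    B = np.arange(12).reshape(3, 4)              # arbitrary 3 x 4 matrix

    # 5 x 3 times 3 x 4 works: the inner dimensions (3 and 3) match.
    C = A @ B
    print(C.shape)                               # (5, 4): rows of A, columns of B

    # 3 x 4 times 5 x 3 does not work: the inner dimensions (4 and 5) differ.
    try:
        B @ A
    except ValueError as err:
        print("3 x 4 times 5 x 3 fails:", err)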
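The scalar and vector projection formulas given in the Projections slide reduce to two lines of code. A sketch assuming NumPy; the specific vectors a and b are arbitrary examples:

    import numpy as np

    a = np.array([3.0, 0.0])                     # vector being projected onto
    b = np.array([2.0, 2.0])                     # vector being projected

    # Scalar projection (component of b along a): (a . b) / |a|
    scalar_proj = np.dot(a, b) / np.linalg.norm(a)            # 2.0

    # Vector projection: the scalar projection times the unit vector a/|a|,
    # i.e. ((a . b) / |a|^2) * a
    vector_proj = (np.dot(a, b) / np.dot(a, a)) * a            # [2. 0.]

    # The magnitude of the vector projection equals |scalar projection|.
    print(scalar_proj, vector_proj, np.linalg.norm(vector_proj))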
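The Transpose and Rank slides can also be checked numerically: NumPy exposes the transpose as the .T attribute and the rank through np.linalg.matrix_rank. The matrix below is made up so that its echelon form has two nonzero rows, matching the rank = 2 example:

    import numpy as np

    A = np.array([[1, 2, 3],
                  [2, 4, 6],                     # 2 x row 1, so it adds no new direction
                  [0, 1, 1]])

    # Transpose: row and column indices swap, so an m x n matrix becomes n x m.
    print(A.T.shape)                             # (3, 3) here; a non-square A shows the swap
    print(np.array_equal(A, A.T))                # False: this A is not symmetric

    # Rank: the number of linearly independent rows, i.e. the number of
    # nonzero rows left after reducing to echelon form.
    print(np.linalg.matrix_rank(A))              # 2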
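The Span example above, S = {(0,1,1), (1,0,1), (1,1,0)} spanning R^3, amounts to saying that a linear system with these vectors as columns is solvable for every right-hand side. A sketch assuming NumPy; the target vector v is an arbitrary example:

    import numpy as np

    # Columns are the vectors of S.
    S = np.array([[0, 1, 1],
                  [1, 0, 1],
                  [1, 1, 0]], dtype=float)

    # Full rank (3) means the columns are independent and span R^3.
    print(np.linalg.matrix_rank(S))              # 3

    # Any vector v in R^3 is then a linear combination S @ c of the columns.
    v = np.array([2.0, -1.0, 5.0])               # arbitrary target vector
    c = np.linalg.solve(S, v)                    # coefficients of the combination
    print(np.allclose(S @ c, v))                 # True

Because the three vectors also turn out to be linearly independent, S is in fact a basis for R^3, which connects to the Basis slides that follow.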
Subspaces
● Definition: vector spaces may be formed from subsets of other vector spaces. These are called subspaces.
● A subspace of a vector space V is a subset H of V that has three properties:
  1. The zero vector of V is in H.
  2. For each u and v in H, u + v is in H. (In this case we say H is closed under vector addition.)
  3. For each u in H and each scalar c, cu is in H. (In this case we say H is closed under scalar multiplication.)
● If the subset H satisfies these three properties, then H itself is a vector space.
● A shortcut for determining subspaces:
  ○ THEOREM 1: If v1, …, vp are in a vector space V, then Span{v1, …, vp} is a subspace of V.
  ○ Proof: verify properties 1, 2 and 3 of the definition of a subspace.

Basis
● Definition: let V be a vector space. A linearly independent spanning set for V is called a basis.
● Equivalently, a subset S ⊂ V is a basis for V if any vector v ∈ V is uniquely represented as a linear combination v = r1 v1 + r2 v2 + · · · + rk vk, where v1, ..., vk are distinct vectors from S and r1, ..., rk ∈ R.

Bases for R^n
● Theorem: every basis for the vector space R^n consists of n vectors.
● Theorem: for any n vectors v1, v2, ..., vn ∈ R^n, the following conditions are equivalent:
  (i) {v1, v2, ..., vn} is a basis for R^n;
  (ii) {v1, v2, ..., vn} is a spanning set for R^n;
  (iii) {v1, v2, ..., vn} is a linearly independent set.

Inverse (of a matrix)
● Definition: an n × n matrix A is invertible if there is a matrix A^-1 (the inverse of A) such that A A^-1 = A^-1 A = I, the identity matrix.
● Important properties of the inverse matrix include:
  ○ (A^-1)^-1 = A
  ○ (AB)^-1 = B^-1 A^-1
  ○ (A^T)^-1 = (A^-1)^T

Resources Used
https://web.stanford.edu/class/nbio228-01/handouts/Ch4_Linear_Algebra.pdf
https://www.mathsisfun.com/algebra/vectors-dot-product.html
http://immersivemath.com/ila/ch03_dotproduct/ch03.html
https://chortle.ccsu.edu/VectorLessons/vmch13/vmch13_14.html
https://www.cliffsnotes.com/study-guides/algebra/linear-algebra/real-euclidean-vector-spaces/the-rank-of-a-matrix
https://www.khanacademy.org/math/precalculus/precalc-matrices/intro-to-matrices/a/intro-to-matrices
https://www.math.tamu.edu/~yvorobet/MATH304-504/Lect2-06web.pdf
https://math.oregonstate.edu/home/programs/undergrad/CalculusQuestStudyGuides/vcalc/dotprod/dotprod.html
http://www2.math.uconn.edu/~troby/math2210f16/LT/sec4_1.pdf
https://math.mit.edu/~gs/linearalgebra/ila0205.pdf
http://www.math.toronto.edu/gscott/WhatVS.pdf
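As a closing example, the inverse-matrix properties listed in the Inverse slide can be verified numerically. A minimal sketch assuming NumPy; the matrices A and B are arbitrary invertible examples, not from the slides:

    import numpy as np

    A = np.array([[2.0, 1.0],
                  [1.0, 3.0]])
    B = np.array([[0.0, 1.0],
                  [4.0, 2.0]])

    A_inv = np.linalg.inv(A)

    # Defining property: A times its inverse is the identity matrix.
    print(np.allclose(A @ A_inv, np.eye(2)))                    # True

    # (A^-1)^-1 = A
    print(np.allclose(np.linalg.inv(A_inv), A))                 # True

    # (AB)^-1 = B^-1 A^-1  (note the order reverses)
    print(np.allclose(np.linalg.inv(A @ B),
                      np.linalg.inv(B) @ np.linalg.inv(A)))     # True

    # (A^T)^-1 = (A^-1)^T
    print(np.allclose(np.linalg.inv(A.T), A_inv.T))             # True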