
Basics from linear algebra

Definition. A vector space is a set $V$ together with an operation of addition $+ : V \times V \to V$, written $\vec{v} + \vec{w}$ for $\vec{v}, \vec{w} \in V$, and an operation of multiplication by a scalar $\cdot : \mathbb{R} \times V \to V$, written $r\vec{v}$ for $r \in \mathbb{R}$ and $\vec{v} \in V$, such that the following hold:

(1) $\vec{v} + (\vec{w} + \vec{u}) = (\vec{v} + \vec{w}) + \vec{u}$ for all $\vec{v}, \vec{w}, \vec{u} \in V$.
(2) $\vec{v} + \vec{w} = \vec{w} + \vec{v}$ for all $\vec{v}, \vec{w} \in V$.
(3) There exists an element $\vec{0} \in V$ such that $\vec{0} + \vec{v} = \vec{v} + \vec{0} = \vec{v}$ for every $\vec{v} \in V$. (One can prove that if an element $\vec{0}$ with this property exists, then it is unique.)
(4) For every $\vec{v} \in V$ there exists an element $\vec{w} \in V$ such that $\vec{v} + \vec{w} = \vec{w} + \vec{v} = \vec{0}$. Again, one can show that for a given $\vec{v}$ an element $\vec{w}$ with this property is unique; it is denoted $\vec{w} = -\vec{v}$.
(5) $1 \cdot \vec{v} = \vec{v}$ for every $\vec{v} \in V$.
(6) $r(\vec{v} + \vec{w}) = r\vec{v} + r\vec{w}$ for every $r \in \mathbb{R}$ and all $\vec{v}, \vec{w} \in V$.
(7) $(r + s)\vec{v} = r\vec{v} + s\vec{v}$ for all $r, s \in \mathbb{R}$ and every $\vec{v} \in V$.
(8) $r(s\vec{v}) = (rs)\vec{v}$ for all $r, s \in \mathbb{R}$ and every $\vec{v} \in V$.

Elements $\vec{v}$ of a vector space $V$ are called vectors.

Examples:
(1) If $n \ge 1$ is an integer, then the Euclidean space $\mathbb{R}^n$, with the standard operations of addition and multiplication by a scalar, is a vector space.
(2) The set $M_{n,n}(\mathbb{R})$ of all $n \times n$ matrices with entries in $\mathbb{R}$, with the standard operations of matrix addition and multiplication by a scalar, is a vector space.
(3) If $X$ is a nonempty set, then the set $F(X, \mathbb{R})$ of all functions $f : X \to \mathbb{R}$, with pointwise addition and pointwise multiplication by a scalar, is a vector space. That is, for $f, g : X \to \mathbb{R}$ the function $f + g : X \to \mathbb{R}$ is defined by $(f+g)(x) = f(x) + g(x)$ for all $x \in X$; similarly, for $r \in \mathbb{R}$ and $f : X \to \mathbb{R}$ the function $rf : X \to \mathbb{R}$ is defined by $(rf)(x) := r f(x)$ for $x \in X$.

Basic properties of vector spaces. Let $V$ be a vector space. Then:
(1) $0 \cdot \vec{v} = \vec{0}$ for every $\vec{v} \in V$.
(2) $(-1) \cdot \vec{v} = -\vec{v}$ for all $\vec{v} \in V$.

Definition. Let $V$ be a vector space and let $\vec{v}_1, \dots, \vec{v}_m \in V$ be $m$ vectors in $V$ (where $m \ge 1$).
We say that $\vec{v}_1, \dots, \vec{v}_m$ are linearly independent in $V$ if whenever $c_1, \dots, c_m \in \mathbb{R}$ are such that $c_1\vec{v}_1 + \dots + c_m\vec{v}_m = \vec{0}$, then $c_1 = \dots = c_m = 0$. The vectors $\vec{v}_1, \dots, \vec{v}_m$ are linearly dependent if they are not linearly independent. Thus $\vec{v}_1, \dots, \vec{v}_m$ are linearly dependent if and only if there exist $c_1, \dots, c_m \in \mathbb{R}$ such that $c_1\vec{v}_1 + \dots + c_m\vec{v}_m = \vec{0}$ but $c_i \ne 0$ for some $i$.

Examples:
(1) The vectors $\vec{v}_1 = (0,1,3)$, $\vec{v}_2 = (-1,1,2) \in \mathbb{R}^3$ are linearly independent in $\mathbb{R}^3$.
(2) The vectors $\vec{v}_1 = (0,1,3)$, $\vec{v}_2 = (-1,1,2)$, $\vec{v}_3 = (-2,3,7) \in \mathbb{R}^3$ are linearly dependent in $\mathbb{R}^3$. Indeed, $1 \cdot \vec{v}_1 + 2\vec{v}_2 + (-1)\vec{v}_3 = (0,0,0) = \vec{0}$.
(3) The vectors $\vec{v}_1 = (0,1,3)$, $\vec{v}_2 = (0,0,0) \in \mathbb{R}^3$ are linearly dependent in $\mathbb{R}^3$. Indeed, $0 \cdot \vec{v}_1 + 1 \cdot \vec{v}_2 = (0,0,0) = \vec{0}$ and $1 \ne 0$.
(4) The functions $x$, $x^2$, $5x^3$ are linearly independent in $F(\mathbb{R}, \mathbb{R})$ (try to prove this fact).

Recall that the Euclidean space $\mathbb{R}^n$ is also equipped with the dot-product operation
$$(x^1, \dots, x^n) \cdot (y^1, \dots, y^n) = x^1 y^1 + \dots + x^n y^n.$$
Recall that for $\vec{x} = (x^1, \dots, x^n) \in \mathbb{R}^n$ the norm or length of $\vec{x}$ is
$$\|\vec{x}\| := \sqrt{\vec{x} \cdot \vec{x}} = \sqrt{(x^1)^2 + \dots + (x^n)^2}.$$
Thus we always have $\|\vec{x}\| \ge 0$ and, moreover, $\|\vec{x}\| = 0$ if and only if $\vec{x} = \vec{0}$.

A system of vectors $\vec{v}_1, \dots, \vec{v}_m \in \mathbb{R}^n$ is called orthogonal if $\vec{v}_i \cdot \vec{v}_j = 0$ for all $i \ne j$, $1 \le i, j \le m$.

Fact: Let $\vec{v}_1, \dots, \vec{v}_m \in \mathbb{R}^n$ be an orthogonal system of vectors such that $\vec{v}_i \ne \vec{0}$ for $i = 1, \dots, m$. Then the vectors $\vec{v}_1, \dots, \vec{v}_m$ are linearly independent.

Proof. Suppose $c_1, \dots, c_m \in \mathbb{R}$ are such that
$$c_1\vec{v}_1 + \dots + c_m\vec{v}_m = \vec{0}.$$
Let $i \in \{1, \dots, m\}$ be arbitrary and take the dot-product of the above equation with $\vec{v}_i$. Then
$$(c_1\vec{v}_1 + \dots + c_m\vec{v}_m) \cdot \vec{v}_i = \vec{0} \cdot \vec{v}_i = 0$$
$$c_1(\vec{v}_1 \cdot \vec{v}_i) + \dots + c_m(\vec{v}_m \cdot \vec{v}_i) = 0.$$
Because $\vec{v}_1, \dots, \vec{v}_m$ is, by assumption, an orthogonal system, all terms $\vec{v}_j \cdot \vec{v}_i$ in the above sum are $0$ except for the one with $j = i$.
Thus we get
$$c_i(\vec{v}_i \cdot \vec{v}_i) = c_i\|\vec{v}_i\|^2 = 0.$$
Since, again by assumption, $\vec{v}_i \ne \vec{0}$, we have $\|\vec{v}_i\| > 0$. Therefore from $c_i\|\vec{v}_i\|^2 = 0$ we get $c_i = 0$. Since $i \in \{1, \dots, m\}$ was arbitrary, we conclude that $c_1 = \dots = c_m = 0$. Thus $\vec{v}_1, \dots, \vec{v}_m$ are linearly independent, as claimed.

Definition. Let $V$ be a vector space and let $W \subseteq V$ be a subset. The subset $W$ is called a linear subspace of $V$ if it satisfies the following properties:
(1) $\vec{0} \in W$.
(2) Whenever $\vec{v} \in W$ and $r \in \mathbb{R}$, then $r\vec{v} \in W$.
(3) For every $\vec{v}, \vec{w} \in W$ we have $\vec{v} + \vec{w} \in W$.

If $W$ is a linear subspace of $V$, we write $W \le V$. Note that if $W \le V$ then $W$ is itself a vector space, with the operations of addition and multiplication by a scalar restricted from $V$.

Examples:
(1) The set $W = \{(x, y) \in \mathbb{R}^2 \mid y = 3x\}$ is a linear subspace of $\mathbb{R}^2$.
(2) The set $W = \{(x, y) \in \mathbb{R}^2 \mid y = 3x + 1\}$ is not a linear subspace of $\mathbb{R}^2$.
(3) The set $W = \{f : \mathbb{R} \to \mathbb{R} : f(3) = 0\}$ is a linear subspace of $F(\mathbb{R}, \mathbb{R})$.
(4) The set $W = \{f : \mathbb{R} \to \mathbb{R} : f(3) = 2f(5)\}$ is a linear subspace of $F(\mathbb{R}, \mathbb{R})$.
(5) The set $W = \{f : \mathbb{R} \to \mathbb{R} : f \text{ is continuous}\}$ is a linear subspace of $F(\mathbb{R}, \mathbb{R})$.
(6) If $A \in M_{2,2}(\mathbb{R})$ is a $2 \times 2$ matrix, then
$$\ker(A) := \left\{(x^1, x^2) \in \mathbb{R}^2 \,\middle|\, A\begin{bmatrix} x^1 \\ x^2 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \end{bmatrix}\right\}$$
is a linear subspace of $\mathbb{R}^2$.
(7) Let $V$ be a vector space and let $S \subseteq V$ be a nonempty subset. The span of $S$ is defined as
$$\mathrm{Span}(S) := \{r_1\vec{v}_1 + \dots + r_n\vec{v}_n \mid n \ge 1;\ \vec{v}_1, \dots, \vec{v}_n \in S;\ r_1, \dots, r_n \in \mathbb{R}\}.$$
(Note that $n$ in the above definition is not fixed, so that $\mathrm{Span}(S)$ consists of all finite linear combinations of elements of $S$.) Then $\mathrm{Span}(S)$ is a linear subspace of $V$.
(8) For $S = \{(0,1,2), (0,0,-1)\} \subseteq \mathbb{R}^3$, try to prove that $\mathrm{Span}(S) = \{(0, y, z) \mid y, z \in \mathbb{R} \text{ are arbitrary}\}$.

Definition. Let $V$ be a vector space. A collection of vectors $\vec{v}_1, \dots, \vec{v}_n \in V$ is called a basis of $V$ if the vectors $\vec{v}_1, \dots, \vec{v}_n$ are linearly independent and $\mathrm{Span}(\vec{v}_1, \dots, \vec{v}_n) = V$.

Fact.
A collection of vectors $\vec{v}_1, \dots, \vec{v}_n \in V$ is a basis of $V$ if and only if for every $\vec{v} \in V$ there exists a unique $n$-tuple of real numbers $c_1, \dots, c_n$ such that $c_1\vec{v}_1 + \dots + c_n\vec{v}_n = \vec{v}$.

Basic properties of bases:
(1) If $\vec{v}_1, \dots, \vec{v}_n \in V$ and $\vec{w}_1, \dots, \vec{w}_m \in V$ are bases of $V$, then $n = m$. For this reason, if a vector space $V$ admits a finite basis $\vec{v}_1, \dots, \vec{v}_n \in V$, then the number $n$ is called the dimension of $V$ and denoted $n = \dim V$. If a vector space $V$ does not admit a finite basis, we set $\dim V := \infty$.
(2) If $\vec{v}_1, \dots, \vec{v}_m \in V$ is a linearly independent collection of vectors, then $\vec{v}_1, \dots, \vec{v}_m$ is a basis of the linear subspace $\mathrm{Span}(\vec{v}_1, \dots, \vec{v}_m)$.
(3) If $\dim V = n < \infty$ and $\vec{v}_1, \dots, \vec{v}_m \in V$ is a linearly independent collection of vectors, then $m \le n$ and there exist vectors $\vec{v}_{m+1}, \dots, \vec{v}_n \in V$ such that $\vec{v}_1, \dots, \vec{v}_m, \vec{v}_{m+1}, \dots, \vec{v}_n$ is a basis of $V$.
(4) If $\dim V = n < \infty$ and $W \le V$, then $\dim W \le n$.

Examples:
(1) $\dim \mathbb{R}^n = n$, and $\vec{e}_1, \dots, \vec{e}_n$ is a basis of $\mathbb{R}^n$, where $\vec{e}_i = (0, \dots, 1, \dots, 0)$ with the $1$ occurring in the $i$-th position.
(2) $\dim F(X, \mathbb{R}) = |X|$, the cardinality of the set $X$. In particular, $\dim F(X, \mathbb{R}) < \infty$ if and only if $X$ is a finite set.
(3) $\dim M_{n,n}(\mathbb{R}) = n^2$.
(4) Let $\vec{v}_1, \dots, \vec{v}_n \in \mathbb{R}^n$ be $n$ vectors in $\mathbb{R}^n$ and let $A = [\vec{v}_1 \mid \vec{v}_2 \mid \dots \mid \vec{v}_n]$ be the $n \times n$ matrix whose $i$-th column is the vector $\vec{v}_i$. Then $\vec{v}_1, \dots, \vec{v}_n$ is a basis of $\mathbb{R}^n$ if and only if $\det(A) \ne 0$.
(5) Let $\vec{v}_1, \dots, \vec{v}_n \in \mathbb{R}^n$ be an orthogonal system of vectors such that $\vec{v}_i \ne \vec{0}$ for $i = 1, \dots, n$. Then the vectors $\vec{v}_1, \dots, \vec{v}_n$ form a basis of $\mathbb{R}^n$.

Definition. Let $V$ and $W$ be vector spaces. A function $T : V \to W$ is called a linear map if for every $\vec{v}_1, \vec{v}_2 \in V$ and $r_1, r_2 \in \mathbb{R}$ we have $T(r_1\vec{v}_1 + r_2\vec{v}_2) = r_1 T(\vec{v}_1) + r_2 T(\vec{v}_2)$.

Basic facts:
(1) If $T : V \to W$ is a linear map, then $T(\vec{0}) = \vec{0}$.
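The linear independence examples and the determinant criterion for a basis of $\mathbb{R}^n$ can be checked numerically. A minimal sketch in Python with NumPy, using the vectors $\vec{v}_1 = (0,1,3)$, $\vec{v}_2 = (-1,1,2)$, $\vec{v}_3 = (-2,3,7)$ from the examples (the rank-based test is a standard numerical stand-in for the definition of independence):

```python
import numpy as np

# The vectors from the examples.
v1 = np.array([0.0, 1.0, 3.0])
v2 = np.array([-1.0, 1.0, 2.0])
v3 = np.array([-2.0, 3.0, 7.0])

# The columns of a matrix are linearly independent iff the rank
# equals the number of columns (here: 2 columns, rank 2).
A = np.column_stack([v1, v2])
print(np.linalg.matrix_rank(A))    # 2 -> v1, v2 independent

# Adding v3 does not raise the rank: the three vectors are dependent.
B = np.column_stack([v1, v2, v3])
print(np.linalg.matrix_rank(B))    # 2 < 3 -> dependent

# The dependency exhibited in example (2): 1*v1 + 2*v2 + (-1)*v3 = 0.
print(v1 + 2 * v2 - v3)            # [0. 0. 0.]

# Basis criterion (example (4) above): n vectors in R^n form a basis
# iff det != 0; here det(B) = 0, so v1, v2, v3 are not a basis of R^3.
print(np.isclose(np.linalg.det(B), 0.0))   # True
```

Note that `matrix_rank` and `det` work with floating-point tolerances, which is why the determinant is compared with `np.isclose` rather than tested for exact equality with zero.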