Linear Algebra II: vector spaces

Math Tools for Neuroscience (NEU 314) Spring 2016

Jonathan Pillow, Princeton Neuroscience Institute & Psychology

Lecture 3 (Tuesday 2/9): accompanying notes/slides and discussion items

• Up now on Piazza:
  • Chapter 2 of Wallisch et al. (Matlab for Neuroscientists)
  • homework 0 (Matlab basics)

• This week in lab: do the interactive lab (“lab 1”, linked from the website) and then work on homework 0.

today's topics

• linear projection
• orthogonality
• linear independence / dependence
• vector space / subspace
• basis / orthonormal basis

Recall that the inner product (dot product) of two vectors is the sum of the pairwise products of their components: ⃗v · ⃗w ≡ Σn vn wn. Note that the result is a scalar.

This operation has an equivalent geometric definition (general proof a bit tricky):

⃗v · ⃗w ≡ ||⃗v|| ||⃗w|| cos(φvw),

where φvw is the angle between the two vectors. Thus, the inner product of two perpendicular vectors is 0, the inner product of two parallel vectors is the product of their norms, and the inner product of a vector with itself is the square of its norm.
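A quick numerical check of the two equivalent definitions in Matlab (a sketch; the example vectors are arbitrary):

    % two example vectors in R^3
    v = [1; 2; 2];
    w = [3; 0; 4];

    % inner product as the sum of pairwise products of components
    ip1 = sum(v .* w);                            % equivalently: dot(v, w)

    % geometric version: product of the norms times cosine of the angle
    phi = acos((v' * w) / (norm(v) * norm(w)));   % angle between v and w
    ip2 = norm(v) * norm(w) * cos(phi);

    % ip1 and ip2 agree up to floating-point error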

The inner product is distributive over addition: ⃗v · (⃗w + ⃗y) = ⃗v · ⃗w + ⃗v · ⃗y. The symmetry of the definition means that the operation is also commutative (i.e., order doesn't matter): ⃗v · ⃗w = ⃗w · ⃗v. Despite this symmetry, it is often useful to think of one of the vectors in an inner product as an operator, and the other as an input. For example, the inner product of any vector ⃗v with the vector ⃗w = (1/N, 1/N, 1/N, ..., 1/N) gives the average of the components of ⃗v. The inner product of vector ⃗v with the vector ⃗w = (1, 0, 0, ..., 0) is the first component, v1.
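Both “operator” examples are one-liners in Matlab (a sketch with an arbitrary input vector):

    v = [4; 8; 6];
    N = numel(v);

    w_avg = ones(N, 1) / N;    % the averaging vector (1/N, ..., 1/N)
    avg = w_avg' * v;          % same value as mean(v)

    w_first = [1; 0; 0];       % operator that picks out the first component
    v1 = w_first' * v;         % same value as v(1)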

Furthermore, the inner product of a vector ⃗v with a unit vector û has a nice geometric interpretation. The cosine equation implies that the inner product is the length of the component of ⃗v lying along the line in the direction of û. This component, which is written as (⃗v · û)û, is referred to as the projection of the vector onto the line. The difference (or residual) vector, ⃗v − (⃗v · û)û, is the component of ⃗v perpendicular to the line. Note that the residual vector is always perpendicular to the projection vector, and that their sum is ⃗v [prove].

[Figure: a vector ⃗v, a unit vector û, and the component of ⃗v in the direction of û.]

linear projection
• intuitively, dropping a vector down onto a linear surface at a right angle
• if û is a unit vector, the length of the projection of ⃗v is ⃗v · û
• for a non-unit vector ⃗u, the length of the projection is (⃗v · ⃗u)/||⃗u|| (see the sketch below)
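A minimal Matlab sketch of the projection and its length; the example vectors are arbitrary, and the non-unit-vector formula divides by ||⃗u|| as in the last bullet:

    v = [3; 4];
    u = [2; 0];               % not a unit vector
    uhat = u / norm(u);       % unit vector along u

    len1 = v' * uhat;             % length of projection (unit-vector form)
    len2 = (v' * u) / norm(u);    % same length from the non-unit form

    proj = (v' * uhat) * uhat;    % the projection of v onto the line through u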

orthogonality
• two vectors are orthogonal (or “perpendicular”) if their inner product is zero
• any vector ⃗v can be decomposed into its component along ⃗u (the projection) and its residual (orthogonal) component, as verified below

[Figure: decomposition of ⃗v into a component in the direction of ⃗u and a component orthogonal to ⃗u.]
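A Matlab sketch of this decomposition, checking both that the residual is orthogonal and that the two parts sum back to ⃗v:

    v = [3; 4];
    u = [1; 1];
    uhat = u / norm(u);

    v_par  = (v' * uhat) * uhat;   % component along u (the projection)
    v_perp = v - v_par;            % residual component

    abs(v_perp' * uhat) < 1e-12          % inner product ~ 0: orthogonal
    norm((v_par + v_perp) - v) < 1e-12   % the parts sum to v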

linear combination
• scaling and summing applied to a group of vectors (sketched below)

The set of all linear combinations of a given set of vectors is clearly a vector space [verify].
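A minimal Matlab sketch of “scaling and summing”, with arbitrary example vectors and coefficients; stacking the vectors as matrix columns makes the combination a single matrix-vector product:

    v1 = [1; 0; 1];
    v2 = [0; 1; 1];
    alpha = [2; -1];                       % scaling coefficients

    combo  = alpha(1)*v1 + alpha(2)*v2;    % scale and sum directly
    combo2 = [v1 v2] * alpha;              % same result as a matrix-vector product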

Working backwards, a set of vectors is said to span a vector space if one can write any vector in the vector space as a linear combination of the set. A spanning set can be redundant: for example, if two of the vectors are identical, or are scaled copies of each other. This redundancy is formalized by defining linear independence. A set of vectors {⃗v1, ⃗v2, ..., ⃗vM} is linearly independent if (and only if) the only solution to the equation

Σn αn ⃗vn = 0

is αn = 0 (for all n).

linear independence / dependence
• a group of vectors is linearly dependent if one can be written as a linear combination of the others
• otherwise, linearly independent (one numerical test is sketched below)
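One convenient numerical test, offered as a sketch rather than the course's prescribed method: stack the vectors as columns of a matrix and compare its rank with the number of vectors.

    v1 = [1; 0; 0];
    v2 = [0; 1; 0];
    v3 = [1; 1; 0];          % v3 = v1 + v2, so this set is linearly dependent

    V = [v1 v2 v3];
    isIndependent = (rank(V) == size(V, 2));   % false for this example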

A basis for a vector space is a linearly independent spanning set. For example, consider the plane of this page. One vector is not enough to span the plane. Scalar multiples of this vector will trace out a line (which is a subspace), but cannot “get off the line” to cover the rest of the plane. But two vectors are sufficient to span the entire plane. Bases are not unique: any two vectors will do, as long as they don't lie along the same line. Three vectors are redundant: one can always be written as a linear combination of the other two. In general, the vector space R^N requires a basis of size N.
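For instance, given any three vectors in the plane, one can solve for the combination of two (independent) ones that produces the third; a Matlab sketch with arbitrary example vectors:

    v1 = [1; 0];
    v2 = [1; 1];
    v3 = [3; 2];              % any third vector in R^2

    c = [v1 v2] \ v3;         % solve [v1 v2] * c = v3 for the coefficients

    norm(c(1)*v1 + c(2)*v2 - v3) < 1e-12   % v3 is a combination of v1 and v2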

Geometrically, the basis vectors define a set of coordinate axes for the space (although they need not be perpendicular). The standard basis is the set of unit vectors that lie along the axes of the space:

ê1 = (1, 0, 0, ..., 0),  ê2 = (0, 1, 0, ..., 0),  ...,  êN = (0, 0, ..., 0, 1).
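In Matlab, the standard basis vectors are the columns of the identity matrix, and any vector is the sum of its components times those columns; a sketch:

    N = 4;
    E = eye(N);               % columns of E are e1, ..., eN
    v = [5; -2; 0; 7];

    vrec = zeros(N, 1);       % rebuild v as sum_n v(n) * e_n
    for n = 1:N
        vrec = vrec + v(n) * E(:, n);
    end
    % vrec equals v exactly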


vector space
• set of all points that can be obtained by linear combinations of some set of “basis” vectors

[Figures: a 1D vector space (a subspace of R^2) spanned by a single basis vector; two different (orthonormal) bases for the same 2D vector space.]


basis
• set of vectors that can “span” (form via linear combination) all points in a vector space

orthonormal basis

• basis composed of orthogonal unit vectors (checked numerically below)
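With the basis vectors as columns of a matrix B, orthonormality amounts to B'*B equaling the identity; a quick Matlab check on an example basis:

    B = [1 1; 1 -1] / sqrt(2);     % candidate orthonormal basis for R^2

    % unit-length, mutually orthogonal columns  <=>  B'*B = I
    isOrthonormal = norm(B'*B - eye(2)) < 1e-12;   % true here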