Def) Vector space V over a scalar field F

• Vector addition: V×V → V, and scalar multiplication: F×V → V, plus 8 axioms
  – 1) Commutativity of addition
  – 2) Associativity of addition
  – 3) Identity: there exists an element of V, denoted by 0, such that v + 0 = v for any element v of V.
  – 4) Inverse: for each element v of V, there exists an element, which we denote as −v, such that v + (−v) = 0.
  – 5) Scalar associativity: (ab)v = a(bv)
  – 6) 1v = v
  – 7) a(v+u) = av + au
  – 8) (a+b)v = av + bv

Uniqueness of the identity

Proposition: Let v, w be elements of V such that v + w = v. Then w = 0.
Proof: v + w = v ⇒ (−v) + (v + w) = (−v) + v ⇒ 0 + w = 0 ⇒ w = 0. Q.E.D.
This proposition implies that the identity is unique.

Proposition: Let v, w be elements of V such that v + w = 0. Then w = −v.
Proof: v + w = 0 ⇒ (−v) + (v + w) = (−v) + 0 ⇒ 0 + w = −v ⇒ w = −v. Q.E.D.
This proposition implies that for each v, its inverse is unique.

Basic algebra: group

• A group is defined as a set of elements G and an operation, denoted by ·, for which the following properties are satisfied:
  – For any elements a, b in the set, a·b is in the set.
  – The associative law is satisfied; that is, for a, b, c in the set, (a·b)·c = a·(b·c).
  – There is an identity element, e, in the set such that a·e = e·a = a for all a in the set.
  – For each element a in the set, there is an inverse element a⁻¹ in the set satisfying a·a⁻¹ = a⁻¹·a = e.

Group: example

• A set of non-singular n×n matrices of real numbers, with matrix multiplication.
• Note: the operation does not have to be commutative for a group.
• Example of a non-group: the set of non-negative integers, with +.

Unique identity? Unique inverse of each element?

• Suppose a·x = a. Then a⁻¹·a·x = a⁻¹·a = e, so x = e.
• x·a = a leads to x = e in the same way, so the identity is unique.

• Suppose a·x = e. Then a⁻¹·a·x = a⁻¹·e = a⁻¹, so x = a⁻¹.
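The matrix-group example and the uniqueness arguments above can be sketched numerically. This is a minimal check, assuming 2×2 invertible real matrices under matrix multiplication; the helper functions are our own, not from the notes:

```python
# Sketch: the set of invertible 2x2 real matrices forms a group under
# matrix multiplication; we spot-check closure, associativity, inverses,
# and the lack of commutativity on concrete (non-singular) matrices.

def matmul(a, b):
    return [[a[0][0]*b[0][0] + a[0][1]*b[1][0], a[0][0]*b[0][1] + a[0][1]*b[1][1]],
            [a[1][0]*b[0][0] + a[1][1]*b[1][0], a[1][0]*b[0][1] + a[1][1]*b[1][1]]]

def inv(a):
    det = a[0][0]*a[1][1] - a[0][1]*a[1][0]   # nonzero for a non-singular matrix
    return [[ a[1][1]/det, -a[0][1]/det],
            [-a[1][0]/det,  a[0][0]/det]]

E = [[1.0, 0.0], [0.0, 1.0]]                  # identity element e
A = [[2.0, 1.0], [1.0, 1.0]]
B = [[0.0, 1.0], [1.0, 3.0]]
C = [[1.0, 2.0], [0.0, 1.0]]

assoc_ok = matmul(matmul(A, B), C) == matmul(A, matmul(B, C))
inverse_ok = matmul(A, inv(A)) == E           # a . a^{-1} = e
commutative = matmul(A, B) == matmul(B, A)    # generally False: not an Abelian group
```

The `commutative` flag coming out False illustrates the note above: a group need not be commutative.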

• If the operation is commutative, the group is an Abelian group.
  – The set of m×n real matrices, with +.
  – The set of integers, with +.

Application?

• In channel coding (for error correction or error detection).

Algebra: field

• A field is a set of two or more elements closed under two operations, + (addition) and ∗ (multiplication), with the following properties:
  – F is an Abelian group under addition.
  – The set F−{0} is an Abelian group under multiplication, where 0 denotes the identity under addition.
  – The distributive law is satisfied: (α+β)∗γ = α∗γ + β∗γ.

Immediately following properties

Proposition: α∗β = 0 implies α = 0 or β = 0.
Proposition: For any non-zero α, α∗0 = 0.
Proof: α∗0 + α = α∗0 + α∗1 = α∗(0 + 1) = α∗1 = α; therefore α∗0 = 0.
Proposition: 0∗0 = 0.
Proof: For a non-zero α, its additive inverse −α is non-zero. 0∗0 = (α + (−α))∗0 = α∗0 + (−α)∗0 = 0 + 0 = 0.

Examples:
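The smallest example of these axioms is the two-element field {0, 1}, with exclusive OR as + and AND as ∗. A minimal sketch, assuming Python's `^` and `&` operators for those two operations, checks the field properties exhaustively:

```python
# Sketch: verify field properties exhaustively for {0, 1} with
# XOR as + and AND as *, i.e. the finite field GF(2).

F = [0, 1]
add = lambda a, b: a ^ b      # exclusive OR plays the role of +
mul = lambda a, b: a & b      # AND plays the role of *

add_commutes = all(add(a, b) == add(b, a) for a in F for b in F)
distributive = all(mul(add(a, b), c) == add(mul(a, c), mul(b, c))
                   for a in F for b in F for c in F)
# Every element has an additive inverse (here, each element is its own inverse):
add_inverses = all(any(add(a, b) == 0 for b in F) for a in F)
# F - {0} = {1} is a (trivial) Abelian group under *, with identity 1:
mul_identity = mul(1, 1) == 1
```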

• The set of real numbers
• The set of complex numbers
• Later, finite fields (Galois fields) will be studied for channel coding
  – E.g., {0,1} with + (exclusive OR), ∗ (AND)

Vector space

• A vector space V over a given field F is a set of elements (called vectors) closed under an operation + called vector addition. There is also an operation ∗ called scalar multiplication, which operates on an element of F (called a scalar) and an element of V to produce an element of V. The following properties are satisfied:
  – V is an Abelian group under +. Let 0 denote the additive identity.
  – For every v, w in V and every α, β in F, we have
    • (α∗β)∗v = α∗(β∗v)
    • (α+β)∗v = α∗v + β∗v
    • α∗(v+w) = α∗v + α∗w
    • 1∗v = v

Examples of vector space

• Rⁿ over R
• Cⁿ over C

• L2 over R, L2 over C

Subspace

Let V be a vector space and S ⊂ V. If S is also a vector space with the same operations as V, then S is called a subspace of V.
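An equivalent working test is closure under linear combinations: v, w ∈ S implies av + bw ∈ S. A minimal numerical sketch, using the plane z = x + y in R³ as our own hypothetical example of S:

```python
# Sketch: check the subspace criterion  v, w in S  =>  a*v + b*w in S
# for S = {(x, y, z) in R^3 : z = x + y}, a plane through the origin.

def in_S(v):
    x, y, z = v
    return z == x + y

def comb(a, v, b, w):
    # the linear combination a*v + b*w, componentwise
    return tuple(a*vi + b*wi for vi, wi in zip(v, w))

v = (1.0, 2.0, 3.0)           # in S, since 3 = 1 + 2
w = (4.0, -1.0, 3.0)          # in S, since 3 = 4 - 1
closed = in_S(comb(2.0, v, -3.0, w))
contains_zero = in_S((0.0, 0.0, 0.0))   # a subspace must contain the identity 0
```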

S is a subspace iff v, w ∈ S ⇒ av + bw ∈ S for all scalars a, b.

Linear independence of vectors

Def)

A set of vectors v1, v2, ..., vn ∈ V is linearly independent iff a1v1 + a2v2 + ... + anvn = 0 implies a1 = a2 = ... = an = 0.

Basis

Consider a vector space V over F (a field). We say that a set (finite or infinite) B ⊂ V is a basis if

* for every finite subset B0 ⊂ B, the vectors in B0 are linearly independent, and
* for every x ∈ V, it is possible to choose a1, ..., an ∈ F and v1, ..., vn ∈ B such that x = a1v1 + ... + anvn.

The sums in the above definition are all finite because, without additional structure, the axioms of a vector space do not permit us to meaningfully speak about an infinite sum of vectors.

Vector space

A set of vectors v1, v2, ..., vn ∈ V is said to span V if every vector u ∈ V is a linear combination of v1, v2, ..., vn.
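Independence and spanning can both be checked concretely in R². This sketch uses our own example pair v1 = (1, 1), v2 = (1, −1): independence via a nonzero 2×2 determinant, and spanning by explicitly solving for the coefficients of an arbitrary u:

```python
# Sketch in R^2: v1, v2 are linearly independent iff det([v1; v2]) != 0,
# and two independent vectors span R^2.

def det2(v, w):
    return v[0]*w[1] - v[1]*w[0]

v1, v2 = (1.0, 1.0), (1.0, -1.0)
independent = det2(v1, v2) != 0

# Solving a1*v1 + a2*v2 = (u1, u2) gives a1 = (u1+u2)/2, a2 = (u1-u2)/2:
u = (3.0, 7.0)
a1, a2 = (u[0] + u[1]) / 2, (u[0] - u[1]) / 2
reconstructed = (a1*v1[0] + a2*v2[0], a1*v1[1] + a2*v2[1])
spans = reconstructed == u
```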

Example: Rⁿ

Finite dimensional vector space

• A vector space V is finite dimensional if

there is a finite set of vectors u1, u2, …, un that span V.

Finite dimensional vector space

Let V be a finite dimensional vector space. Then

• If v1, v2, ..., vm are linearly independent but do not span V, then V has a basis with n vectors (n > m) that includes v1, v2, ..., vm.

• If v1, v2, ..., vm span V but are linearly dependent, then a subset of v1, v2, ..., vm is a basis for V with n vectors (n < m).

• Every basis of V contains the same number of vectors.

Dimension of a finite dimensional vector space.

Example: Rⁿ and its Basis Vectors

Inner product space: for length and angle

Vector space V over C. An inner product is a mapping V×V → C such that
1) <u, v> = <v, u>*
2) <αu + βv, w> = α<u, w> + β<v, w> (consequently, <w, αu + βv> = α*<w, u> + β*<w, v>)
3) <u, u> ≥ 0, with equality if and only if u = 0.

Example: Rⁿ
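For Rⁿ the standard inner product is <u, v> = Σ uᵢvᵢ, and conjugation in properties 1)–2) becomes trivial. A minimal sketch checking the three properties on our own sample vectors:

```python
# Sketch: the standard inner product <u, v> = sum u_i v_i on R^n,
# checking properties 1)-3) on concrete vectors.

def ip(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

u, v, w = (1.0, 2.0, 3.0), (4.0, 0.0, -1.0), (2.0, 2.0, 5.0)
a, b = 2.0, -3.0

symmetric = ip(u, v) == ip(v, u)                 # property 1 (real case: conjugate is trivial)
au_bv = tuple(a*ui + b*vi for ui, vi in zip(u, v))
linear = ip(au_bv, w) == a*ip(u, w) + b*ip(v, w) # property 2: linearity in the first argument
positive = ip(u, u) > 0                          # property 3, for u != 0
```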


From the inner product follow the notions of orthogonality, norm, and (distance) metric.

Def) Orthonormal set and projection theorem

Def) A non-empty subset S of an inner product space is said to be orthonormal iff 1) ∀x ∈ S, <x, x> = 1, and 2) if x, y ∈ S and x ≠ y, then <x, y> = 0.

Projection onto a finite dimensional subspace

Def) If S is a subspace of an inner product space V, and u ∈ V, the projection of u on S is defined to be a vector u_S ∈ S such that u − u_S is orthogonal to all vectors in S.

Projection Theorem (Gallager Thm 5.1)

Let S be an n-dimensional subspace of an inner product space V and assume that φ1, ..., φn is an orthonormal basis of S. Then any u ∈ V may be decomposed as u = u_S + u⊥S, where u_S ∈ S and <u⊥S, s> = 0 for all s ∈ S. Furthermore, u_S is uniquely determined by u_S = Σ_{j=1}^{n} <u, φj> φj.

Projection onto a finite dimensional subspace

From this decomposition follows the Pythagorean theorem: ||u||² = ||u_S||² + ||u⊥S||².
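The projection formula and the Pythagorean identity can be checked numerically. A sketch in R³, with our own orthonormal basis of a two-dimensional subspace S:

```python
import math

# Sketch: project u onto S = span{phi1, phi2} via u_S = sum_j <u, phi_j> phi_j,
# then check orthogonality of u - u_S to S and the Pythagorean identity.

def ip(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

r = 1 / math.sqrt(2)
phi1 = (r, r, 0.0)            # orthonormal basis of a 2-dimensional subspace S
phi2 = (0.0, 0.0, 1.0)
u = (3.0, 4.0, 5.0)

coeffs = [ip(u, phi) for phi in (phi1, phi2)]
u_S = tuple(sum(c * phi[i] for c, phi in zip(coeffs, (phi1, phi2))) for i in range(3))
u_perp = tuple(ui - si for ui, si in zip(u, u_S))

orthogonal = all(abs(ip(u_perp, phi)) < 1e-12 for phi in (phi1, phi2))
pythagoras = math.isclose(ip(u, u), ip(u_S, u_S) + ip(u_perp, u_perp))
coeff_energy_ok = sum(c * c for c in coeffs) <= ip(u, u) + 1e-12  # ||u_S||^2 <= ||u||^2
```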

Norm bounds: 0 ≤ ||u_S||² ≤ ||u||², with equality on the right iff u ∈ S and equality on the left iff u is orthogonal to all vectors in S.

Bessel's inequality: 0 ≤ Σ_{j=1}^{n} |<u, φj>|² ≤ ||u||², with equality on the right iff u ∈ S and equality on the left iff u is orthogonal to all vectors in S.

Least squared error property: ||u − u_S|| ≤ ||u − s||, ∀s ∈ S.

Gram–Schmidt orthonormalization

Consider linearly independent s1, ..., sn ∈ V, an inner product space.

We can construct an orthonormal set {φ1, ..., φn} ⊂ V so that

span{s1, ..., sn} = span{φ1, ..., φn}.

Gram-Schmidt Orthog. Procedure

Examples
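A minimal sketch of the Gram-Schmidt procedure in Rⁿ (the classical algorithm; function names and example vectors are our own): each s_k has its projection onto the earlier φ's subtracted, and the remainder is normalized.

```python
import math

# Sketch of Gram-Schmidt in R^n: orthogonalize, then normalize, in order.

def ip(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

def gram_schmidt(vectors):
    phis = []
    for s in vectors:
        # remove the component of s lying in span{phi_1, ..., phi_{k-1}}
        for phi in phis:
            c = ip(s, phi)
            s = [si - c * pi for si, pi in zip(s, phi)]
        norm = math.sqrt(ip(s, s))      # nonzero when the inputs are independent
        phis.append([si / norm for si in s])
    return phis

phis = gram_schmidt([[1.0, 1.0, 0.0], [1.0, 0.0, 1.0]])
orthonormal = all(
    math.isclose(ip(phis[i], phis[j]), 1.0 if i == j else 0.0, abs_tol=1e-12)
    for i in range(2) for j in range(2))
```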

• Euclidean space Rⁿ (over R)
• Cⁿ (over C, over R)
• L2 of real functions (over R)
• L2 of complex functions (over C, over R)
• Real random variables (over R)
• … more abstract examples…

Application: detection in Gaussian noise (Appendix A.2)

Application: linear least square estimation of random parameter X

Y1 = h1 X + W1
Y2 = h2 X + W2

X and the Wi are uncorrelated. E(X) = E(Wi) = 0.
Linear least square estimation of X.

Application: linear least square estimation of random parameters

Y = HX + W

Xi and Wj are uncorrelated. E(Xi) = E(Wj) = 0, ∀i, j.
Design a linear least square estimator x̂(y) = Ay s.t.

E{ Σ_{i=1}^{n} (Xi − x̂i(Y))² } is minimized.

Application: linear least square estimation of random parameters


Concept of sufficient information: projection onto the direction of H. (Start the discussion with real random variables first.)