
Homogeneous equations

1. Homogeneous equations:

Ex 1: Consider the system:

$$x_1 + 2x_2 = 0, \qquad x_1 + 2x_3 = 0, \qquad x_2 - x_3 = 0,$$

or, as a matrix equation:

$$\begin{bmatrix} 1 & 2 & 0 \\ 1 & 0 & 2 \\ 0 & 1 & -1 \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ 0 \end{bmatrix} = \mathbf{0}. \qquad (3)$$

Homogeneous equation: $A\mathbf{x} = \mathbf{0}$. There is always at least one solution: $\mathbf{x} = \mathbf{0}$. Other solutions are called nontrivial solutions.

Theorem 1: A nontrivial solution of (3) exists iff [if and only if] the system has at least one free variable in its row-reduced form. The same is true for any homogeneous system of equations.

Proof: If there are no free variables, there is only one solution, and that must be the trivial solution. Conversely, if there is a free variable, it can be given a nonzero value, which yields a nontrivial solution.

Ex 2: Reduce the system above (row reduction as before):

$$\left[\begin{array}{ccc|c} 1 & 2 & 0 & 0 \\ 1 & 0 & 2 & 0 \\ 0 & 1 & -1 & 0 \end{array}\right] \rightarrow \left[\begin{array}{ccc|c} 1 & 2 & 0 & 0 \\ 0 & 1 & -1 & 0 \\ 0 & 0 & 0 & 0 \end{array}\right] \Rightarrow$$

B"# #B œ!à B #$ B œ!à !œ!Þ

Note that $x_3$ is a free variable (its column has no pivot); hence the general solution is

$$x_2 = x_3; \qquad x_1 = -2x_2 = -2x_3.$$

B#B#"$ x œœÔ×ÔBB#$ × œBß$ Ô " × ÕØÕBB$$ Ø Õ " Ø

This is the parametric vector form of the solution.

With $x_3$ arbitrary, the solution set is a straight line through the origin in the direction $(-2, 1, 1)$.
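The computation in Ex 2 can be checked numerically. A minimal sketch using numpy (the name `null_direction` is ours, chosen for illustration):

```python
import numpy as np

# Coefficient matrix of the homogeneous system in Ex 1
A = np.array([[1, 2, 0],
              [1, 0, 2],
              [0, 1, -1]], dtype=float)

# The parametric solution found above: x = x3 * (-2, 1, 1)
null_direction = np.array([-2.0, 1.0, 1.0])

# Every multiple of null_direction solves A x = 0
for x3 in (1.0, -3.5, 10.0):
    x = x3 * null_direction
    assert np.allclose(A @ x, 0)

# Rank 2 < 3 unknowns confirms there is a free variable
assert np.linalg.matrix_rank(A) == 2
```

The rank count matches the row reduction above: two pivot columns, one free variable.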

Theorem 2: A homogeneous system always has a nontrivial solution if the number of equations is less than the number of unknowns.

Pf: If we perform row reduction on the system, then the reduced augmented matrix has the form

$$\left[\begin{array}{cccccc|c} 1 & a_{12} & a_{13} & \cdots & & & 0 \\ 0 & 0 & 1 & a_{24} & \cdots & & 0 \\ 0 & 0 & 0 & 1 & a_{35} & \cdots & 0 \\ 0 & 0 & 0 & 0 & 0 & \cdots & \vdots \\ & & & \vdots & & & 0 \end{array}\right]$$

with the remaining rows zero on the left side. If the number of equations is less than the number of unknowns, then not every column can contain a leading 1, so there are free variables. By the previous theorem, there are nontrivial solutions.

1. Inhomogeneous equations:

[we should briefly mention the relationship between homogeneous and inhomogeneous equations:]

Consider the general system

$$A\mathbf{x} = \mathbf{b}. \qquad (1)$$

Suppose $\mathbf{x}_p$ is a particular solution of (1), so $A\mathbf{x}_p = \mathbf{b}$. If $\mathbf{x}$ is any other solution of (1), we still have $A\mathbf{x} = \mathbf{b}$. Subtracting the two equations: $A\mathbf{x} - A\mathbf{x}_p = \mathbf{0} \Rightarrow A(\mathbf{x} - \mathbf{x}_p) = \mathbf{0}$.

So $\mathbf{v} = \mathbf{x} - \mathbf{x}_p$ satisfies the homogeneous equation. Generally:

Theorem 1: If $\mathbf{x}_p$ is a particular solution of (1), then for any other solution $\mathbf{x}$, we have that $\mathbf{v} = \mathbf{x} - \mathbf{x}_p$ solves the homogeneous equation (i.e., the one with $\mathbf{b} = \mathbf{0}$). Thus every solution $\mathbf{x}$ of (1) can be written $\mathbf{x} = \mathbf{x}_p + \mathbf{v}$, where $\mathbf{v}$ is a solution of the homogeneous equation.
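This decomposition is easy to illustrate numerically. A sketch (the matrix is the one from Ex 1; the right-hand side $\mathbf{b}$ is example data chosen here, not from the notes):

```python
import numpy as np

# An inhomogeneous system A x = b (b chosen for illustration)
A = np.array([[1, 2, 0],
              [1, 0, 2],
              [0, 1, -1]], dtype=float)
b = np.array([3.0, 1.0, 1.0])

# A particular solution x_p (found via least squares, since A is singular;
# b was chosen consistent, so the residual is zero)
xp, *_ = np.linalg.lstsq(A, b, rcond=None)
assert np.allclose(A @ xp, b)

# Any homogeneous solution v (a multiple of the null direction (-2, 1, 1))
v = 5.0 * np.array([-2.0, 1.0, 1.0])
assert np.allclose(A @ v, 0)

# Then x_p + v also solves A x = b
assert np.allclose(A @ (xp + v), b)
```

Varying the multiple in `v` sweeps out the whole solution set $\mathbf{x}_p + \mathbf{v}$.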

2. Application: Network flows

Traffic pattern at Drummond Square: Quantities in cars/min. What are the flows on the inside streets? One equation for each node:

BBBœ%!"$%

B"#  B œ #!!

BBBœ"!!#$&

BBœ'!% &

Augmented matrix and row reduction:

$$\left[\begin{array}{ccccc|c} 1 & 0 & 1 & -1 & 0 & 40 \\ 1 & 1 & 0 & 0 & 0 & 200 \\ 0 & 1 & -1 & 0 & 1 & 100 \\ 0 & 0 & 0 & 1 & -1 & 60 \end{array}\right] \rightarrow \left[\begin{array}{ccccc|c} 1 & 0 & 1 & -1 & 0 & 40 \\ 0 & 1 & -1 & 1 & 0 & 160 \\ 0 & 1 & -1 & 0 & 1 & 100 \\ 0 & 0 & 0 & 1 & -1 & 60 \end{array}\right]$$

$$\rightarrow \left[\begin{array}{ccccc|c} 1 & 0 & 1 & -1 & 0 & 40 \\ 0 & 1 & -1 & 1 & 0 & 160 \\ 0 & 0 & 0 & 1 & -1 & 60 \\ 0 & 0 & 0 & 1 & -1 & 60 \end{array}\right] \rightarrow \left[\begin{array}{ccccc|c} 1 & 0 & 1 & 0 & -1 & 100 \\ 0 & 1 & -1 & 0 & 1 & 100 \\ 0 & 0 & 0 & 1 & -1 & 60 \\ 0 & 0 & 0 & 0 & 0 & 0 \end{array}\right]$$

So:

B"$& œ"!!B B

B#$& œ"!!B B

Bœ'!Bß%& where BßB$& are free. Constraint: if for example all flows have to be positive; then we require B !3 for all 3 . Therefore:

$$x_3, x_5 \geq 0$$

$$-100 \leq x_3 - x_5 \leq 100$$

$$-60 \leq x_5 \quad \text{(automatically satisfied, since } x_5 \geq 0\text{)}$$

This corresponds to a region in the $(x_3, x_5)$ plane, which can be plotted if desired.

If they closed off roads $x_3$ and $x_5$, then we would have $x_3 = x_5 = 0$, so that

$$x_1 = 100, \quad x_2 = 100, \quad x_4 = 60;$$

note that the traffic flow then becomes uniquely determined.
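The network computation can be reproduced with a short numpy script; the sign conventions follow the node equations as reconstructed above, and the helper name `solve_with_free` is ours:

```python
import numpy as np

# Node equations for the Drummond Square network (cars/min):
#   x1 + x3 - x4 = 40,  x1 + x2 = 200,  x2 - x3 + x5 = 100,  x4 - x5 = 60
A = np.array([[1, 0, 1, -1, 0],
              [1, 1, 0, 0, 0],
              [0, 1, -1, 0, 1],
              [0, 0, 0, 1, -1]], dtype=float)
b = np.array([40.0, 200.0, 100.0, 60.0])

def solve_with_free(x3, x5):
    """General solution with free variables x3, x5 (formulas derived above)."""
    x1 = 100 - x3 + x5
    x2 = 100 + x3 - x5
    x4 = 60 + x5
    return np.array([x1, x2, x3, x4, x5])

# Closing streets x3 and x5 forces the unique flow (100, 100, 0, 60, 0)
assert np.allclose(A @ solve_with_free(0.0, 0.0), b)
assert np.allclose(solve_with_free(0.0, 0.0), [100, 100, 0, 60, 0])

# Any other choice of the free variables also satisfies the system
assert np.allclose(A @ solve_with_free(30.0, 10.0), b)
```

The asserts confirm both the general parametric solution and the uniquely determined flow after the closure.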

Definition 1: A collection of vectors $\mathbf{v}_1, \mathbf{v}_2, \ldots, \mathbf{v}_n$ is linearly independent if no vector in the collection is a linear combination of the others.

Equivalently,

Definition 2: A collection of vectors $\mathbf{v}_1, \ldots, \mathbf{v}_n$ is linearly independent if the only way we can have $c_1\mathbf{v}_1 + c_2\mathbf{v}_2 + \cdots + c_n\mathbf{v}_n = \mathbf{0}$ is if all of the $c_i = 0$.

Equivalence of the definitions: Def 1 $\Rightarrow$ Def 2:

If no vector is a linear combination of the others, then if

$$c_1\mathbf{v}_1 + c_2\mathbf{v}_2 + \cdots + c_n\mathbf{v}_n = \mathbf{0},$$

we will show that $c_1, \ldots, c_n$ also have to be 0.

Proof: Suppose not (for contradiction). Without loss of generality, assume $c_1 \neq 0$ (the proof works the same way otherwise). Then we have

$$\mathbf{v}_1 = -\frac{c_2}{c_1}\mathbf{v}_2 - \cdots - \frac{c_n}{c_1}\mathbf{v}_n,$$

contradicting that no vector is a combination of the others. Thus the $c_i$ all have to be 0, as desired.

Note: If $S_2$ is a collection of vectors and $S_1$ is a subcollection of $S_2$, then:

If $S_2$ is linearly independent

$\Rightarrow$ no vector in $S_2$ is a linear combination of the others

$\Rightarrow$ no vector in $S_1$ is a linear combination of the others (since every vector in $S_1$ is also in $S_2$)

$\Rightarrow$ $S_1$ is linearly independent.

Logically equivalent [contrapositive]

If $S_1$ is linearly dependent (i.e., not independent)

$\Rightarrow$ $S_2$ is linearly dependent.

[These are stated more formally in the book as theorems.]

Theorem 2: Let $S = \{\mathbf{v}_1, \ldots, \mathbf{v}_n\}$ be a collection of vectors in $\mathbb{R}^m$. Then $S$ is linearly dependent if and only if one of the vectors $\mathbf{v}_i$ is a linear combination of the previous ones $\mathbf{v}_1, \ldots, \mathbf{v}_{i-1}$.

Proof: ($\Rightarrow$) If $S$ is linearly dependent, then there is a set of constants $c_i$, not all 0, such that

$$c_1\mathbf{v}_1 + \cdots + c_n\mathbf{v}_n = \mathbf{0}.$$

Let $c_k$ be the last non-zero coefficient. Then all the coefficients after $c_k$ are zero, and

$$c_1\mathbf{v}_1 + c_2\mathbf{v}_2 + \cdots + c_{k-1}\mathbf{v}_{k-1} + c_k\mathbf{v}_k = \mathbf{0}$$

$$\rightarrow \mathbf{v}_k = -\frac{c_1}{c_k}\mathbf{v}_1 - \frac{c_2}{c_k}\mathbf{v}_2 - \cdots - \frac{c_{k-1}}{c_k}\mathbf{v}_{k-1},$$

i.e., one of the vectors is a linear combination of the previous ones. ($\Leftarrow$) Obvious.

3. Checking for linear independence:
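Theorem 2's criterion can be turned into an incremental check: scan the vectors left to right and flag the first one whose addition does not increase the rank. A sketch (the helper name `first_dependent_index` is ours):

```python
import numpy as np

def first_dependent_index(vectors):
    """Return the first index i (0-based) such that v_i is a linear
    combination of v_0, ..., v_{i-1}; return None if independent."""
    for i in range(1, len(vectors) + 1):
        M = np.column_stack(vectors[:i]).astype(float)
        # rank < i means the newest column added no new direction
        if np.linalg.matrix_rank(M) < i:
            return i - 1
    return None

# (1,0) = ((1,1) + (1,-1)) / 2, so the third vector is flagged
assert first_dependent_index([[1, 1], [1, -1], [1, 0]]) == 2
assert first_dependent_index([[1, 0], [0, 1]]) is None
```

This mirrors the proof: the flagged vector plays the role of $\mathbf{v}_k$ with the last non-zero coefficient.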

Example 2: Consider the vectors

$$\mathbf{v}_1 = \begin{bmatrix} 1 \\ 1 \end{bmatrix}, \quad \mathbf{v}_2 = \begin{bmatrix} 1 \\ -1 \end{bmatrix}, \quad \mathbf{v}_3 = \begin{bmatrix} 1 \\ 0 \end{bmatrix}.$$

Are they linearly independent?

$$c_1\mathbf{v}_1 + c_2\mathbf{v}_2 + c_3\mathbf{v}_3 = \mathbf{0} \Rightarrow$$

$$c_1 + c_2 + c_3 = 0$$

$$c_1 - c_2 = 0.$$

Row reduce the augmented matrix:

"""l! ”•""! l!

"""l! Ä ”•!#" l!

"" "l! Ä ”•!""Î#l!

Conclude: there is a free variable ($c_3$, since the third column has no pivot). By the theorem on homogeneous equations there is a nontrivial solution, so the $c_i$ need not all be 0.

Thus not all $c_i$ must be $0 \Rightarrow$ the vectors are not linearly independent.

[Note that if the number of vectors is greater than the size of the vectors, this will always happen.]
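Back-substitution in the reduced system of Example 2 produces an explicit nontrivial relation: taking $c_3 = 2$ gives $c_2 = -c_3/2 = -1$ and $c_1 = -c_2 - c_3 = -1$. A quick check:

```python
import numpy as np

v1 = np.array([1.0, 1.0])
v2 = np.array([1.0, -1.0])
v3 = np.array([1.0, 0.0])

# Back-substituting c3 = 2 into the reduced system above:
#   c2 = -c3/2 = -1,  c1 = -c2 - c3 = -1
c1, c2, c3 = -1.0, -1.0, 2.0
assert np.allclose(c1 * v1 + c2 * v2 + c3 * v3, 0)

# Equivalently, v3 is a combination of the others: v3 = (v1 + v2) / 2
assert np.allclose(v3, 0.5 * (v1 + v2))
```

So the dependence is visible both as a vanishing combination and as one vector written in terms of the others, matching Definitions 2 and 1 respectively.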

More generally:

Theorem 3: In $\mathbb{R}^n$, if we have more than $n$ vectors, they cannot be linearly independent.

From above we have:

Algorithm: To check whether vectors are linearly independent, form a matrix with them as columns, and row reduce. (a) If the reduced matrix has free variables (i.e., there exists a non-pivot column), then they are not independent. (b) If there are no free variables (i.e., every column is a pivot column), they are independent.
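The algorithm can be sketched as a short function (the helper name `are_independent` is ours; pivots are counted via the matrix rank, which equals the number of pivot columns):

```python
import numpy as np

def are_independent(vectors):
    """Vectors are independent iff the matrix with them as columns
    has a pivot in every column, i.e. rank == number of vectors."""
    M = np.column_stack(vectors).astype(float)
    return np.linalg.matrix_rank(M) == M.shape[1]

# Example 2: three vectors in R^2 -> dependent (more vectors than entries,
# as in Theorem 3)
assert not are_independent([[1, 1], [1, -1], [1, 0]])

# Two of them alone are independent
assert are_independent([[1, 1], [1, -1]])
```

Using the rank avoids doing the row reduction by hand, but it is the same pivot count as in steps (a) and (b).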