Information Measures & Information Diagrams
Reading for this lecture: CMR articles Yeung & Anatomy
Lecture 19: Natural Computation & Self-Organization, Physics 256B (Winter 2014); Jim Crutchfield
Monday, March 10, 2014

Information Measures ...
One random variable: X ∼ Pr(x). Use paradigm: one-shot sampling.
Recall information theory quantity:
Entropy: H[X]
Information Measures ...

Two random variables: X ∼ Pr(x), Y ∼ Pr(y), (X, Y) ∼ Pr(x, y)

Recall information theory quantities:
Joint entropy: H[X, Y]
Conditional entropies: H[X|Y], H[Y|X]
Mutual information: I[X; Y]
Information metric: d(X, Y)
Information Measures ... Event Space Relationships of Information Quantifiers:

[Venn diagram, built up label by label: circles H(X) and H(Y) inside H(X, Y); the regions H(X|Y) and H(Y|X), their overlap I(X; Y), and the symmetric difference d(X, Y).]

Information Measures ...
Just a cute mnemonic?
No, much more structure there.

Information Measures ...
Three random variables: X ∼ Pr(x), Y ∼ Pr(y), Z ∼ Pr(z), (X, Y, Z) ∼ Pr(x, y, z)

Information theory quantities:
Joint entropy: H[X, Y, Z]
Conditional entropies: H[X|Y,Z], H[Y|X,Z], H[Z|X,Y], H[X,Y|Z], H[X,Z|Y], H[Y,Z|X]
Conditional mutual information: I[X; Y|Z]
Mutual information? I[X; Y; Z]?
Information metric?
How to visualize? A roadmap for information relationships?

Information Measures ...
N random variables? (X₁, X₂, ..., X_N) ∼ Pr(x₁, x₂, ..., x_N)
Agenda:
Mechanics of information measures for N variables
Graphical information diagram for processes
Information Measures ... The Shannon Information Measures: entropy, conditional entropy, mutual information, and conditional mutual information
All information measures can be expressed as linear combinations of entropies. E.g.,
H[X|Y] = H[X, Y] − H[Y]
I[X; Y] = H[X] + H[Y] − H[X, Y]
d(X, Y) = H[X|Y] + H[Y|X]
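As a quick numerical check (not from the slides), these linear combinations can be computed directly from a small joint distribution; the distribution here is a hypothetical example.

```python
# Sketch: two-variable measures as linear combinations of plain entropies.
from math import log2

# A hypothetical joint distribution Pr(x, y), for illustration only.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

def H(dist):
    """Shannon entropy (bits) of a distribution given as {outcome: prob}."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

def marginal(joint, axis):
    """Marginalize the joint distribution onto one coordinate."""
    out = {}
    for outcome, p in joint.items():
        out[outcome[axis]] = out.get(outcome[axis], 0.0) + p
    return out

H_XY = H(joint)
H_X = H(marginal(joint, 0))
H_Y = H(marginal(joint, 1))

H_X_given_Y = H_XY - H_Y          # H[X|Y] = H[X,Y] - H[Y]
H_Y_given_X = H_XY - H_X          # H[Y|X] = H[X,Y] - H[X]
I_XY = H_X + H_Y - H_XY           # I[X;Y] = H[X] + H[Y] - H[X,Y]
d_XY = H_X_given_Y + H_Y_given_X  # d(X,Y) = H[X|Y] + H[Y|X]
print(H_X_given_Y, I_XY, d_XY)
```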
Information Measures ... The Shannon Information Measures ...
There is no fundamental difference between the entropy H and the mutual information I.
In fact, entropy can be introduced as a “self-mutual information”: H[X]=I[X; X]
So, there’s really a single quantity being referenced.
Information Measures ... The Shannon Information Measures ... I-Measure: I and H form a signed measure over event space.
Set variables for event space: X̃ corresponds to the random variable X; Ỹ corresponds to the random variable Y.
Universal set: Ω = X̃ ∪ Ỹ
Over the σ-field of atoms:
F = { X̃ ∪ Ỹ, X̃, Ỹ, X̃ ∩ Ỹ, X̃ ∩ Ỹᶜ, X̃ᶜ ∩ Ỹ, (X̃ ∪ Ỹ)ᶜ, ∅ }
Real-valued measure for atoms: μ* ∈ ℝ
Information Measures ... The Shannon Information Measures ... I-Measure, defined by the mappings:
μ*(X̃ ∪ Ỹ) = H[X, Y]
μ*(X̃) = H[X]
μ*(Ỹ) = H[Y]
μ*(X̃ ∩ Ỹ) = I[X; Y]
μ*(X̃ − Ỹ) = H[X|Y]
μ*(Ỹ − X̃) = H[Y|X]
μ*((X̃ ∩ Ỹ)ᶜ) = H[X|Y] + H[Y|X]
μ*(∅) = 0
where: X̃ − Ỹ = X̃ ∩ Ỹᶜ and Ỹ − X̃ = X̃ᶜ ∩ Ỹ

Information Measures ... The Shannon Information Measures: I-Measure ... Roadmap: information measures to set-theoretic operations
H and I → μ*
, → ∪
; → ∩
| → −
Information Measures ... The Shannon Information Measures: I-Measure ...
For example,
I[X; Y] = H[X] + H[Y] − H[X, Y]
or
μ*(X̃ ∩ Ỹ) = μ*(X̃) + μ*(Ỹ) − μ*(X̃ ∪ Ỹ)
map to the set identity
X̃ ∩ Ỹ = X̃ + Ỹ − (X̃ ∪ Ỹ)   (Notation ambiguous.)
That is, [Venn sketch: the intersection region equals the two discs added, minus the union.]
Information Measures ... The Shannon Information Measures: I-Measure ...
Benefits:
Extends to N variables
Makes explicit the underlying structure of information theory
Graphical “calculus” of information identities
Often easier to work with, sometimes not
Often easier to discover new quantities: e.g., N-variable mutual informations ... stay tuned
Information Measures ... The Shannon Information Measures: I-Measure for N variables:
Definition: F = the σ-field generated by W = { X̃ᵢ : i = 1, ..., N }
(Drop tildes on set variables!)
Universal set: Ω = ∪ᵢ₌₁ᴺ Xᵢ
Atoms: A ∈ F : A = ∩ᵢ₌₁ᴺ Yᵢ, where Yᵢ = Xᵢ or Xᵢᶜ
‖A‖ = 2ᴺ − 1   (max number of atoms)
‖F‖ = 2^(2ᴺ − 1)   (max number of sets of atoms)
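The atom count is easy to check by brute force: each atom picks Xᵢ or Xᵢᶜ for every i, and the all-complements choice lies outside the universal set. A small sketch (the labels are illustrative):

```python
# Sketch: enumerating the atoms of the sigma-field for N set variables.
# Each atom intersects either X_i or its complement X_i^c for every i;
# the all-complements intersection lies outside Omega, leaving 2^N - 1 atoms.
from itertools import product

def atoms(N):
    labels = []
    for choice in product((True, False), repeat=N):
        if not any(choice):      # intersection of all complements: excluded
            continue
        labels.append(" ∩ ".join(
            f"X{i}" if inc else f"X{i}^c" for i, inc in enumerate(choice)))
    return labels

for N in (2, 3, 4):
    print(N, len(atoms(N)))      # 3, 7, 15 atoms
```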
Information Measures ... The Shannon Information Measures: Theorem: I-Measure for N random variables:
For G, G′, G″ ⊆ W = { Xᵢ : i = 1, ..., N }:

μ*( ∪_{X∈G} X ) ≡ H[X, X ∈ G]
μ*( ∪_{X∈G} X − ∪_{Y∈G′} Y ) ≡ H[X, X ∈ G | Y, Y ∈ G′]
μ*( (∪_{X∈G} X) ∩ (∪_{Y∈G′} Y) ) ≡ I[X, X ∈ G; Y, Y ∈ G′]
μ*( (∪_{X∈G} X) ∩ (∪_{Y∈G′} Y) − ∪_{Z∈G″} Z ) ≡ I[X, X ∈ G; Y, Y ∈ G′ | Z, Z ∈ G″]
μ* is the unique measure on F consistent with the Shannon information measures.
Information Measures ... The Shannon Information Measure ... I-Diagram: Venn-like graphic for I-measure relations
Entropy ~ area
Conditioning ~ area removal
Mutual information ~ intersection
Signed measure: areas can represent μ* < 0.
However, an area is zero if μ*(A) = 0, A ⊂ Ω.
Information Measures ... The Shannon Information Measure ... I-Diagram for two random variables:
X ∼ Pr(x), Y ∼ Pr(y), (X, Y) ∼ Pr(x, y)
Seven information measures: H[X], H[Y], H[X, Y], H[X|Y], I[X; Y], H[Y|X], H[X|Y] + H[Y|X]
Three atoms: H[X|Y], I[X; Y], H[Y|X]
Information Measures ... Event Space Relationships of Information Quantifiers:

[The two-variable I-diagram again, built up label by label: H(X), H(Y), the enclosing H(X, Y), the regions H(X|Y) and H(Y|X), the overlap I(X; Y), and the symmetric difference d(X, Y).]
Information Measures ... The I-Diagram ... Three random variables: X ∼ Pr(x), Y ∼ Pr(y), Z ∼ Pr(z), (X, Y, Z) ∼ Pr(x, y, z)
Information measures: H[X], H[Y], H[Z], ..., I[X; Y; Z], ..., H[X, Y, Z]
7 atomic information measures.

Information Measures ... Information diagram for three random variables:
[Three overlapping circles H[X], H[Y], H[Z], built up atom by atom:]
H[X|Y,Z], H[Y|X,Z], H[Z|X,Y]
I[X; Y|Z], I[X; Z|Y], I[Y; Z|X]
I[X; Y; Z]
Information Measures ... Information diagram for three random variables ... 3-way mutual information is symmetric:
From diagram:
I[X; Y; Z] = I[X; Y] − I[X; Y|Z]
           = I[Y; Z] − I[Y; Z|X]
           = I[X; Z] − I[X; Z|Y]
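A numerical sanity check of this symmetry (a sketch, not from the lecture): all three expressions agree on an arbitrary random joint distribution over three binary variables.

```python
# Sketch: I[X;Y] - I[X;Y|Z] is invariant under permuting X, Y, Z.
import random
from math import log2

random.seed(0)
outcomes = [(x, y, z) for x in (0, 1) for y in (0, 1) for z in (0, 1)]
weights = [random.random() for _ in outcomes]
total = sum(weights)
joint = {o: w / total for o, w in zip(outcomes, weights)}

def H(axes):
    """Entropy (bits) of the marginal over the given coordinate axes."""
    marg = {}
    for o, p in joint.items():
        key = tuple(o[a] for a in axes)
        marg[key] = marg.get(key, 0.0) + p
    return -sum(p * log2(p) for p in marg.values() if p > 0)

def I2(a, b):                 # I[A;B]
    return H([a]) + H([b]) - H([a, b])

def I2c(a, b, c):             # I[A;B|C]
    return H([a, c]) + H([b, c]) - H([a, b, c]) - H([c])

vals = [I2(0, 1) - I2c(0, 1, 2),
        I2(1, 2) - I2c(1, 2, 0),
        I2(0, 2) - I2c(0, 2, 1)]
print(vals)  # three equal numbers: I[X;Y;Z]
```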
Information Measures ... Information diagram for three random variables ... Three-way mutual information can be negative! Consider:
X ∼ Pr(X = 0) = Pr(X = 1) = 1/2
Y ∼ Pr(Y = 0) = Pr(Y = 1) = 1/2
X ⊥ Y
Z = (X + Y) mod 2
Recall RRXOR Process:
X_{t+2} = (X_{t+1} + X_t) mod 2,
X_t and X_{t+1} ∼ Bernoulli(1/2)
Information Measures ... Information diagram for three random variables ... Three-way mutual information can be negative! Calculate:
I[X; Y] = 0
I[X; Y|Z] = H[X|Z] − H[X|Y, Z] = 1 − 0 = 1
I[X; Y; Z] = I[X; Y] − I[X; Y|Z] = −1
The Shannon information measure is a signed measure!
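The XOR calculation can be reproduced directly (a sketch; the joint distribution is the one defined above, with four equally likely triples):

```python
# Sketch: X, Y fair and independent, Z = (X + Y) mod 2; co-information = -1 bit.
from math import log2

joint = {}
for x in (0, 1):
    for y in (0, 1):
        joint[(x, y, (x + y) % 2)] = 0.25   # four equally likely triples

def H(axes):
    """Entropy (bits) of the marginal over the given coordinate axes."""
    marg = {}
    for o, p in joint.items():
        key = tuple(o[a] for a in axes)
        marg[key] = marg.get(key, 0.0) + p
    return -sum(p * log2(p) for p in marg.values() if p > 0)

I_XY = H([0]) + H([1]) - H([0, 1])                            # = 0
I_XY_given_Z = H([0, 2]) + H([1, 2]) - H([0, 1, 2]) - H([2])  # = 1
I_XYZ = I_XY - I_XY_given_Z                                   # = -1
print(I_XY, I_XY_given_Z, I_XYZ)
```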
Information Measures ... Information diagram for three random variables ... Three-way mutual information can be negative!

[I-diagram with numerical atoms for the XOR example: H[X] = H[Y] = H[Z] = 1; I[X; Y] = I[Y; Z] = I[X; Z] = 0; the three conditional-entropy atoms H[X|Y,Z], H[Y|X,Z], H[Z|X,Y] are 0, the three conditional-mutual-information atoms are 1, and the central atom I[X; Y; Z] = −1.]

Information Measures ... Information diagram for three random variables ... Markov chain: X → Y → Z
Consequence of shielding: I[X; Z|Y] = 0

[I-diagram for the Markov chain, built up atom by atom: H(X), H(Y), H(Z); H[X|Y,Z], H[Y|X,Z], H[Z|X,Y]; I[X; Y|Z], I[Y; Z|X]; I[X; Y; Z]. The I[X; Z|Y] atom vanishes.]

All areas positive ⟹ all info measures positive (or zero).
Information Measures ... Process I-diagrams:
Process has an infinite number of RVs!
Pr(X↔) = Pr(..., X₋₂, X₋₁, X₀, X₁, X₂, ...)
Rather: Pr(X↔) = Pr(←X, →X)
Past as a composite random variable: ←X
Future as a composite random variable: →X
Information Measures ... Process I-diagrams ...
Start with 2-variable I-diagram and whittle down:
Information measures: H[←X], H[→X], H[←X, →X], H[←X|→X], H[→X|←X], I[←X; →X], H[→X|←X] + H[←X|→X]

There are 3 = 2² − 1 atomic information measures: H[→X|←X], H[←X|→X], I[←X; →X]
Information Measures ... Process I-diagrams ...

[Process I-diagram, built up: two overlapping regions H[←X] (past) and H[→X] (future); the atoms H[←X|→X] and H[→X|←X], and the overlap E = I[←X; →X], the excess entropy.]

What is H[←X|→X] + H[→X|←X]?
Recall the distance: d(X, Y) = H[X|Y] + H[Y|X]
So H[→X|←X] + H[←X|→X] is the distance between the past and the future!
Information Measures ... Process I-diagrams ...
Nice picture (intuitive), but caution!
H[←X] and H[→X] are infinite for positive-entropy-rate processes.
Rather, work with finite-L quantities, e.g., H[←X^L] and H[→X^L].
Then take the limit L → ∞.
Information Measures ... Process I-diagrams ...
However, we do know that:
H[→X|←X] = lim_{L→∞} H[→X^L|←X]
More to the point:
H[→X^L|←X] ∝ h_μ L
More accurate: a foliated I-diagram ...
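A minimal illustration of the linear growth (not from the slides), using the simplest stationary process: for an IID biased coin the past conditions away nothing, so H[→X^L|←X] equals the block entropy H(L) = h_μ L exactly. The bias p = 0.7 is an arbitrary illustrative choice.

```python
# Sketch: for an IID biased coin the length-L block entropy grows exactly
# linearly, H(L) = h_mu * L, matching the foliated-diagram picture in which
# each successive symbol contributes one layer of h_mu.
from itertools import product
from math import log2

p = 0.7                                   # illustrative bias (an assumption)
h_mu = -(p * log2(p) + (1 - p) * log2(1 - p))

def block_entropy(L):
    """H(L): entropy (bits) of the length-L word distribution."""
    total = 0.0
    for word in product((0, 1), repeat=L):
        pr = 1.0
        for s in word:
            pr *= p if s == 1 else 1 - p
        total -= pr * log2(pr)
    return total

for L in (1, 2, 4, 8):
    print(L, block_entropy(L), h_mu * L)  # columns agree
```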
Information Measures ... Process I-diagrams ...

[Foliated I-diagram: the overlap is E = I[←X; →X]; the flanking areas H[←X^L|→X] ∝ h_μ L and H[→X^L|←X] ∝ h_μ L are foliated into layers of h_μ, one per symbol.]

Information Measures ... The Shannon Information Measure ...
For N > 3 random variables:
Higher dimensions with hyperspheres, or
Asymmetric and/or disconnected blobs in the plane.
Information Measures ... The Shannon Information Measure ...
For N random variables:
~2^N independent areas
N > 3 cannot be done with circles in the plane
Information Measures ... The Shannon Information Measure ... For N = 4 random variables: (X, Y, Z, W) ∼ Pr(x, y, z, w)
[Diagram: three circles plus a “sausage” in 2D, labeled X, Y, Z, W.]
Information Measures ... The Shannon Information Measure ... For N = 4 random variables:
[Diagram: four ellipses in 2D, labeled X, Y, Z, W.]
Information Measures ... The Shannon Information Measure ... For N = 4 random variables:
[Diagram: four spheres in 3D, labeled X, Y, Z, W.]
Information Measures ... Multivariate Mutual Information(s): (X₁, X₂, ..., X_N) ∼ Pr(x₁, x₂, ..., x_N)
What is N-way mutual information?
At least three different generalizations of 2-way.
1. Shared information to which all variables contribute:
I[X; Y] = H[X] + H[Y] − H[X, Y]
2. MI as relative entropy between the joint and the product of its marginals:
I[X; Y] = D( Pr(x, y) ‖ Pr(x) Pr(y) )
3. Joint entropy minus all (single-variable) unshared information:
I[X; Y] = H[X, Y] − H[X|Y] − H[Y|X]
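For two variables the three definitions coincide; a sketch checking this numerically (the joint distribution is hypothetical):

```python
# Sketch: the three two-variable definitions of mutual information agree.
from math import log2

joint = {(0, 0): 0.5, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.3}
px = {x: sum(p for (a, _), p in joint.items() if a == x) for x in (0, 1)}
py = {y: sum(p for (_, b), p in joint.items() if b == y) for y in (0, 1)}

def H(dist):
    return -sum(p * log2(p) for p in dist.values() if p > 0)

H_XY, H_X, H_Y = H(joint), H(px), H(py)

# 1. Inclusion-exclusion on entropies.
mi_1 = H_X + H_Y - H_XY
# 2. Relative entropy between the joint and the product of its marginals.
mi_2 = sum(p * log2(p / (px[x] * py[y])) for (x, y), p in joint.items())
# 3. Joint entropy minus both unshared (conditional) entropies.
mi_3 = H_XY - (H_XY - H_Y) - (H_XY - H_X)

print(mi_1, mi_2, mi_3)  # all equal
```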
Information Measures ... Multivariate Mutual Information(s) ... Notation:
X_{0:N} = { X₀, X₁, ..., X_{N−1} }
Universal set over the variable indices: Ω_N = { 0, 1, ..., N−1 }
Power set: P(N) = 2^{Ω_N}
Complement of A ∈ P(N): Ā = Ω_N \ A
Index set: A ⊆ Ω_N, i ∈ A
X_A = { X_i : i ∈ A }
Information Measures ... Multivariate Mutual Information(s) ... Joint entropy:
H[X_{0:N}] = − Σ_{x_{0:N}} Pr(x_{0:N}) log₂ Pr(x_{0:N})

[Four-variable I-diagram, circles labeled X₁ ... X₄ (alias X, Y, Z, W).]
[Same diagram with shading: gray level = number of times each atom is counted.]

Information Measures ... Multivariate Mutual Information(s) ... Multivariate mutual information (aka Co-Information):
I[X_{0:N}] = I[X₀; X₁; ...; X_{N−1}] = − Σ_{A ∈ P(N)} (−1)^{|A|} H[X_A]

Add and subtract all subset entropies:
I[X₀; X₁; X₂] = H[X₀] + H[X₁] + H[X₂]
              − H[X₀, X₁] − H[X₀, X₂] − H[X₁, X₂]
              + H[X₀, X₁, X₂]
Information Measures ... Multivariate Mutual Information(s) ... Multivariate mutual information ...

I[X_{0:N}] = H[X_{0:N}] − Σ_{A ∈ P(N), 0 < |A| < N} I[X_A | X_Ā]
where, e.g.:
I[X_{1,3,4} | X_{0,2}] = I[X₁; X₃; X₄ | X₀, X₂]
I[X_{1} | X_{0,2}] = H[X₁ | X₀, X₂]

Information Measures ... Multivariate mutual information ... Properties:
1. Common information to which all variables contribute.
2. Can be negative.
3. I[X_{0:N}] = 0 if any two variables are completely independent (pairwise independent conditioned on any subset).

Information Measures ... Multivariate Mutual Information(s) ... Total correlation (aka Multi-information):
T[X_{0:N}] = Σ Pr(x_{0:N}) log₂ ( Pr(x_{0:N}) / (Pr(x₀) ··· Pr(x_{N−1})) )
           = Σ_{A ∈ P(N), |A| = 1} H[X_A] − H[X_{0:N}]

Total correlation ... Properties:
1. T[X_{0:N}] ≥ 0
2. X₀ ⊥ X₁, ..., X_{N−1} ⟹ T[X_{0:N}] = T[X_{1:N}]
3. Compares individuals to the entire set.
4. Includes redundant correlations.

Information Measures ... Multivariate Mutual Information(s) ... Bound Information:
B[X_{0:N}] = H[X_{0:N}] − Σ_{A ∈ P(N), |A| = 1} H[X_A | X_Ā]
MI as joint entropy minus all (single-variable) unshared information.
Properties:
1. B[X_{0:N}] ≥ 0
2. X₀ ⊥ X₁, ..., X_{N−1} ⟹ B[X_{0:N}] = B[X_{1:N}]

Information Measures ... Multivariate Informations: Residual Entropy (Anti-Mutual Information):
R[X_{0:N}] = H[X_{0:N}] − B[X_{0:N}] = Σ_{A ∈ P(N), |A| = 1} H[X_A | X_Ā]
Information in individual variables that is not shared in any way: total randomness localized to an individual variable and so not correlated to that in its peers.

Information Measures ... Multivariate Informations: Local Exogenous Information (Very Mutual Information):
W[X_{0:N}] = B[X_{0:N}] + T[X_{0:N}] = Σ_{A ∈ P(N), |A| = 1} I[X_A; X_Ā]
Information in each variable that comes from its peers. Discounts for randomness produced locally.

Reading for next lecture: CMR article Anatomy
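As a closing numerical cross-check (not from the slides), the four multivariate measures evaluated on the XOR triple, using the definitions above:

```python
# Sketch: total correlation T, bound information B, residual entropy R,
# and local exogenous information W for the XOR triple (all in bits).
from math import log2

joint = {(x, y, (x + y) % 2): 0.25 for x in (0, 1) for y in (0, 1)}
N = 3

def H(axes):
    """Entropy (bits) of the marginal over the given coordinate axes."""
    marg = {}
    for o, p in joint.items():
        key = tuple(o[a] for a in axes)
        marg[key] = marg.get(key, 0.0) + p
    return -sum(p * log2(p) for p in marg.values() if p > 0)

all_axes = list(range(N))
H_joint = H(all_axes)
singles = [H([i]) for i in range(N)]
# H[X_i | X_rest] = H[X_{0:N}] - H[X_rest]
cond_singles = [H_joint - H([j for j in all_axes if j != i]) for i in range(N)]

T = sum(singles) - H_joint       # total correlation: 1.0
B = H_joint - sum(cond_singles)  # bound information: 2.0
R = H_joint - B                  # residual entropy: 0.0
W = B + T                        # local exogenous information: 3.0
print(T, B, R, W)
```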