Information Measures & Information Diagrams

Lecture 19: Natural Computation & Self-Organization, Physics 256B (Winter 2014); Jim Crutchfield. Monday, March 10, 2014.

Reading for this lecture: CMR articles; Yeung & Anatomy.

Information Measures

One random variable: X ∼ Pr(x).
Use paradigm: one-shot sampling.
Recall the information-theoretic quantity:
  Entropy: H[X]

Information Measures ...

Two random variables: X ∼ Pr(x), Y ∼ Pr(y), (X, Y) ∼ Pr(x, y).
Recall the information-theoretic quantities:
  Joint entropy: H[X, Y]
  Conditional entropies: H[X|Y], H[Y|X]
  Mutual information: I[X; Y]
  Information metric: d(X, Y)

Information Measures ...

Event-space relationships of the information quantifiers: picture H(X) and H(Y) as two overlapping regions of the event space.
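As a concrete check on these definitions, here is a minimal sketch that computes each two-variable quantity directly from a joint distribution. The pmf `pxy` is a made-up example; any small joint distribution works.

```python
import math

# Hypothetical joint distribution Pr(x, y); any small pmf works here.
pxy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

def entropy(p):
    """Shannon entropy in bits of a pmf given as {outcome: probability}."""
    return -sum(q * math.log2(q) for q in p.values() if q > 0)

# Marginals Pr(x) and Pr(y).
px, py = {}, {}
for (x, y), q in pxy.items():
    px[x] = px.get(x, 0.0) + q
    py[y] = py.get(y, 0.0) + q

H_x, H_y, H_xy = entropy(px), entropy(py), entropy(pxy)
H_x_given_y = H_xy - H_y            # conditional entropy H[X|Y]
H_y_given_x = H_xy - H_x            # conditional entropy H[Y|X]
I_xy = H_x + H_y - H_xy             # mutual information I[X;Y]
d_xy = H_x_given_y + H_y_given_x    # information metric d(X, Y)
```

Note that every derived quantity here is a linear combination of the three entropies H[X], H[Y], and H[X, Y].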
The two overlapping circles carve the total region H(X, Y) into three parts: the part of H(X) outside H(Y) is H(X|Y); the part of H(Y) outside H(X) is H(Y|X); and their overlap is I(X; Y). The region outside the overlap, H(X|Y) together with H(Y|X), is the information metric d(X; Y).

Just a cute mnemonic? No, much more structure there.

Information Measures ...
Three random variables: X ∼ Pr(x), Y ∼ Pr(y), Z ∼ Pr(z), (X, Y, Z) ∼ Pr(x, y, z).
Information-theoretic quantities:
  Joint entropy: H[X, Y, Z]
  Conditional entropies:
    H[X|Y, Z], H[Y|X, Z], H[Z|X, Y]
    H[X, Y|Z], H[X, Z|Y], H[Y, Z|X]
  Conditional mutual information: I[X; Y|Z]
  Mutual information? I[X; Y; Z]?
  Information metric?
How to visualize? A roadmap for information relationships?

Information Measures ...
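The question marks are genuine. One candidate definition of a three-way mutual information, I[X; Y; Z] = I[X; Y] − I[X; Y|Z], can come out negative. A minimal sketch, assuming the standard XOR example (X and Y independent fair bits, Z = X ⊕ Y):

```python
import math
from collections import defaultdict

# Hypothetical example: X, Y independent fair bits, Z = X XOR Y.
pxyz = {(x, y, x ^ y): 0.25 for x in (0, 1) for y in (0, 1)}

def marginal(p, idxs):
    """Marginal pmf over the coordinates in idxs."""
    m = defaultdict(float)
    for outcome, q in p.items():
        m[tuple(outcome[i] for i in idxs)] += q
    return m

def entropy(p):
    return -sum(q * math.log2(q) for q in p.values() if q > 0)

H_x   = entropy(marginal(pxyz, (0,)))
H_y   = entropy(marginal(pxyz, (1,)))
H_z   = entropy(marginal(pxyz, (2,)))
H_xy  = entropy(marginal(pxyz, (0, 1)))
H_xz  = entropy(marginal(pxyz, (0, 2)))
H_yz  = entropy(marginal(pxyz, (1, 2)))
H_xyz = entropy(pxyz)

I_xy = H_x + H_y - H_xy                   # I[X;Y] = 0: X and Y are independent
I_xy_given_z = H_xz + H_yz - H_z - H_xyz  # I[X;Y|Z] = 1 bit: given Z, Y determines X
co_info = I_xy - I_xy_given_z             # candidate I[X;Y;Z] = -1 bit: negative!
```

Conditioning on Z *creates* a bit of dependence between X and Y, so the central region of the three-circle diagram carries a negative value; this is why μ* below is only a signed measure.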
N random variables?

(X1, X2, ..., XN) ∼ Pr(x1, x2, ..., xN)

Agenda:
  Mechanics of information measures for N variables
  Graphical information diagram for processes

Information Measures ...

The Shannon information measures: entropy, conditional entropy, mutual information, and conditional mutual information.
All information measures can be expressed as linear combinations of entropies. E.g.,
  H[X|Y] = H[X, Y] − H[Y]
  I[X; Y] = H[X] + H[Y] − H[X, Y]
  d(X, Y) = H[X|Y] + H[Y|X]

Information Measures ...

There is no fundamental difference between entropy H and mutual information I. In fact, entropy can be introduced as a "self-mutual information":
  H[X] = I[X; X]
So there's really a single quantity being referenced.

Information Measures ...

The I-Measure: I and H form a signed measure over the event space.
Set variables for the event space: X̃ corresponds to the random variable X; Ỹ corresponds to the random variable Y.
Universal set: Ω = X̃ ∪ Ỹ
Over the σ-field of atoms:
  F = { X̃ ∪ Ỹ, X̃, Ỹ, X̃ ∩ Ỹ, X̃ ∩ Ỹᶜ, X̃ᶜ ∩ Ỹ, (X̃ ∩ Ỹ)ᶜ, ∅ }
A real-valued measure on the atoms: μ* ∈ ℝ.
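The eight sets listed in F are exactly the unions of the three nonempty atoms. A quick enumeration sketch confirms the counts, which generalize to 2^N − 1 atoms and 2^(2^N − 1) sets for N variables:

```python
from itertools import product

def sigma_field_sizes(N):
    """Count atoms and sets of the σ-field generated by N set variables inside Ω."""
    # An atom chooses, per variable, either X̃_i (bit 1) or its complement (bit 0);
    # the all-complement choice lies outside Ω = ∪ X̃_i, so it is the empty atom.
    atoms = [bits for bits in product((0, 1), repeat=N) if any(bits)]
    return len(atoms), 2 ** len(atoms)  # every member of F is a union of atoms

assert sigma_field_sizes(2) == (3, 8)    # the eight sets listed above
assert sigma_field_sizes(3) == (7, 128)
```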
The I-measure μ* is defined by the mappings:
  μ*(X̃ ∪ Ỹ) = H[X, Y]
  μ*(X̃) = H[X]
  μ*(Ỹ) = H[Y]
  μ*(X̃ ∩ Ỹ) = I[X; Y]
  μ*(X̃ − Ỹ) = H[X|Y]
  μ*(Ỹ − X̃) = H[Y|X]
  μ*((X̃ ∩ Ỹ)ᶜ) = H[X|Y] + H[Y|X]
  μ*(∅) = 0
where X̃ − Ỹ = X̃ ∩ Ỹᶜ and Ỹ − X̃ = X̃ᶜ ∩ Ỹ.

Information Measures ...

The Shannon Information Measures: I-Measure ...

Roadmap: information measures map to set-theoretic operations:
  H and I → μ*
  , → ∪
  ; → ∩
  | → −

Information Measures ...

For example,
  I[X; Y] = H[X] + H[Y] − H[X, Y]
or
  μ*(X̃ ∩ Ỹ) = μ*(X̃) + μ*(Ỹ) − μ*(X̃ ∪ Ỹ)
maps to the set identity
  X̃ ∩ Ỹ = X̃ + Ỹ − (X̃ ∪ Ỹ)   (Notation ambiguous: meaningful only under μ*.)
That is, in the Venn picture, the overlap equals the two circles counted separately minus their union.

Information Measures ...

Benefits of the I-measure:
  Extends to N variables
  Makes explicit the underlying structure of information theory
  Gives a graphical "calculus" of information identities
  Often easier to work with, sometimes not
  Often easier to discover new quantities: e.g., N-variable mutual informations ... stay tuned

Information Measures ...

The I-Measure for N variables:
Definition: F = the σ-field generated by W = {X̃i : i = 1, ..., N}. (Drop the tildes on set variables from here on.)
Universal set: Ω = ∪ᵢ₌₁ᴺ Xi
Atoms: A ∈ F with A = ∩ᵢ₌₁ᴺ Yi, where Yi = Xi or Xiᶜ
  ||A|| = 2ᴺ − 1  (maximum number of atoms)
  ||F|| = 2^(2ᴺ − 1)  (maximum number of sets of atoms)

Information Measures ...
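The mapping table can be checked numerically: assign μ* to the three nonempty atoms and recover every composite region by additivity alone. A sketch, assuming an arbitrary made-up joint pmf:

```python
import math

# Hypothetical joint pmf for (X, Y); any pmf exhibits the same additivity.
pxy = {(0, 0): 0.5, (0, 1): 0.25, (1, 1): 0.25}

def entropy(p):
    return -sum(q * math.log2(q) for q in p.values() if q > 0)

px, py = {}, {}
for (x, y), q in pxy.items():
    px[x] = px.get(x, 0.0) + q
    py[y] = py.get(y, 0.0) + q
H_x, H_y, H_xy = entropy(px), entropy(py), entropy(pxy)

# μ* on the three nonempty atoms of Ω = X̃ ∪ Ỹ:
mu = {
    "X-Y": H_xy - H_y,        # μ*(X̃ − Ỹ) = H[X|Y]
    "X&Y": H_x + H_y - H_xy,  # μ*(X̃ ∩ Ỹ) = I[X;Y]
    "Y-X": H_xy - H_x,        # μ*(Ỹ − X̃) = H[Y|X]
}

# Additivity of the signed measure recovers every composite region:
assert abs(mu["X-Y"] + mu["X&Y"] - H_x) < 1e-12   # μ*(X̃) = H[X]
assert abs(mu["X&Y"] + mu["Y-X"] - H_y) < 1e-12   # μ*(Ỹ) = H[Y]
assert abs(sum(mu.values()) - H_xy) < 1e-12       # μ*(X̃ ∪ Ỹ) = H[X, Y]
assert abs(mu["X-Y"] + mu["Y-X"]
           - (H_xy - mu["X&Y"])) < 1e-12          # μ*((X̃ ∩ Ỹ)ᶜ) = d(X, Y)
```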
The Shannon Information Measures: I-Measure for N random variables ...

Theorem. For G, G′, G″ ⊆ W = {Xi : i = 1, ..., N}:
  μ*( ∪_{X∈G} X ) = H[X, X ∈ G]
  μ*( ∪_{X∈G} X − ∪_{Y∈G′} Y ) = H[X, X ∈ G | Y, Y ∈ G′]
  μ*( ∪_{X∈G} X ∩ ∪_{Y∈G′} Y ) = I[X, X ∈ G; Y, Y ∈ G′]
  μ*( ∪_{X∈G} X ∩ ∪_{Y∈G′} Y − ∪_{Z∈G″} Z ) = I[X, X ∈ G; Y, Y ∈ G′ | Z, Z ∈ G″]
μ* is the unique measure on F consistent with the Shannon information measures.
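The theorem pins μ* down uniquely: its values on the atoms follow from joint entropies alone, by inclusion-exclusion (Möbius inversion over subsets). A sketch for N = 3, assuming the XOR example again; the central atom comes out negative, which is exactly why μ* is only a *signed* measure:

```python
import math
from itertools import product

# Hypothetical example: X, Y independent fair bits, Z = X XOR Y.
p = {(x, y, x ^ y): 0.25 for x in (0, 1) for y in (0, 1)}
N = 3
FULL = frozenset(range(N))

def H(subset):
    """Joint entropy (bits) of the variables indexed by `subset`."""
    marg = {}
    for outcome, q in p.items():
        key = tuple(outcome[i] for i in sorted(subset))
        marg[key] = marg.get(key, 0.0) + q
    return -sum(q * math.log2(q) for q in marg.values() if q > 0)

H_omega = H(FULL)

def f(U):
    """μ* of the part of Ω outside every X̃_j with j ∉ U, i.e. the sum of μ*
    over atoms built only from variables in U. By the theorem: H[Ω] − H[X_{Uᶜ}]."""
    return H_omega - H(FULL - U)

def mu_atom(S):
    """μ* of the atom (∩_{i∈S} X̃_i) ∩ (∩_{j∉S} X̃_jᶜ), via Möbius inversion of f."""
    members = sorted(S)
    total = 0.0
    for bits in product((0, 1), repeat=len(members)):
        V = frozenset(m for m, b in zip(members, bits) if b)
        total += (-1) ** (len(S) - len(V)) * f(V)
    return total

# Consistency check: summing atoms that touch ∪_{i∈T} X̃_i recovers H[X_T].
atoms = [frozenset(i for i in range(N) if bits[i])
         for bits in product((0, 1), repeat=N) if any(bits)]
for T in atoms:  # nonempty subsets double as index sets T
    assert abs(sum(mu_atom(S) for S in atoms if S & T) - H(T)) < 1e-9
```

Here the pairwise-overlap atoms carry μ* = 1 bit (e.g., I[X; Y|Z]), the single-variable atoms carry 0 (e.g., H[X|Y, Z]), and the central atom X̃ ∩ Ỹ ∩ Z̃ carries −1 bit.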
