2.2 Shannon's Information Measures

Shannon's information measures:

  • Entropy
  • Conditional entropy
  • Mutual information
  • Conditional mutual information

Definition 2.13 The entropy H(X) of a random variable X is defined as

    H(X) = -\sum_x p(x) \log p(x).

  • Convention: the summation is taken over S_X, the support of X.
  • When the base of the logarithm is \alpha, write H(X) as H_\alpha(X).
  • Entropy measures the uncertainty of a discrete random variable.
  • The unit for entropy is
      bit  if \alpha = 2
      nat  if \alpha = e
      D-it if \alpha = D
  • A bit in information theory is different from a bit in computer science.
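As a concrete illustration of Definition 2.13 (not part of the original slides), here is a minimal Python sketch, assuming the distribution is given as a list of probabilities; the helper name entropy is our own:

    import math

    def entropy(p, base=2.0):
        # H(X) = -sum_x p(x) log p(x), summing over the support S_X only,
        # i.e. zero-probability outcomes are skipped.
        return -sum(px * math.log(px, base) for px in p if px > 0)

    print(entropy([0.3, 0.7]))           # ~0.8813 bits
    print(entropy([0.5, 0.5]))           # 1.0 bit: a fair coin
    print(entropy([0.5, 0.5], math.e))   # ~0.6931 nats

Changing the base argument changes only the unit, since \log_\alpha x = \ln x / \ln \alpha.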
Remark H(X) depends only on the distribution of X but not on the actual values taken by X; hence we also write H(p_X).

Example Let X and Y be random variables with \mathcal{X} = \mathcal{Y} = \{0, 1\}, and let

    p_X(0) = 0.3, p_X(1) = 0.7   and   p_Y(0) = 0.7, p_Y(1) = 0.3.

Although p_X \neq p_Y, H(X) = H(Y) (see the numeric check after the next list).

Entropy as Expectation

  • Convention:

        E g(X) = \sum_x p(x) g(x),

    where the summation is over S_X.
  • Linearity:

        E[f(X) + g(X)] = E f(X) + E g(X).

    See Problem 5 for details.
  • Can write

        H(X) = -E \log p(X) = -\sum_x p(x) \log p(x).

  • In probability theory, when E g(X) is considered, g(x) usually depends only on the value of x but not on p(x); here g(x) = -\log p(x) depends on the distribution itself.
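The following sketch (ours, same assumptions as before) computes entropy via the expectation form -E \log p(X) and confirms H(X) = H(Y) for the example above; both equal h_b(0.3) \approx 0.881 bits:

    import math

    def entropy_via_expectation(p, base=2.0):
        # H(X) = -E[log p(X)]: average the surprisal -log p(x)
        # with weights p(x), over the support only.
        return sum(px * -math.log(px, base) for px in p if px > 0)

    p_X = [0.3, 0.7]   # p_X(0) = 0.3, p_X(1) = 0.7
    p_Y = [0.7, 0.3]   # p_Y(0) = 0.7, p_Y(1) = 0.3

    print(entropy_via_expectation(p_X))   # ~0.8813
    print(entropy_via_expectation(p_Y))   # ~0.8813: equal, although p_X != p_Y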
Binary Entropy Function

  • For 0 \le \gamma \le 1, define the binary entropy function

        h_b(\gamma) = -\gamma \log \gamma - (1 - \gamma) \log(1 - \gamma),

    with the convention 0 \log 0 = 0, justified by L'Hôpital's rule: \lim_{a \to 0} a \log a = 0.
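A small sketch (ours) of h_b, making the 0 \log 0 = 0 convention explicit at the endpoints:

    import math

    def binary_entropy(gamma, base=2.0):
        # h_b(gamma) = -gamma log gamma - (1 - gamma) log(1 - gamma),
        # with h_b(0) = h_b(1) = 0 by the convention 0 log 0 = 0.
        if gamma in (0.0, 1.0):
            return 0.0
        return -gamma * math.log(gamma, base) - (1 - gamma) * math.log(1 - gamma, base)

    print(binary_entropy(0.0))   # 0.0, by the convention
    print(binary_entropy(0.3))   # ~0.8813, matching the example above
    print(binary_entropy(0.5))   # 1.0: h_b peaks at gamma = 1/2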
