Homework Set #1: Properties of Entropy, Mutual Information and Divergence

1. Entropy of functions of a random variable. Let X be a discrete random variable. Show that the entropy of a function of X is less than or equal to the entropy of X by justifying the following steps:

(a) H(X, g(X)) = H(X) + H(g(X)|X)
(b)            = H(X).
(c) H(X, g(X)) = H(g(X)) + H(X|g(X))
(d)            ≥ H(g(X)).

Thus H(g(X)) ≤ H(X).

Solution: Entropy of functions of a random variable.

(a) H(X, g(X)) = H(X) + H(g(X)|X) by the chain rule for entropies.
(b) H(g(X)|X) = 0, since for any particular value of X, g(X) is fixed; hence H(g(X)|X) = Σ_x p(x) H(g(X)|X = x) = Σ_x 0 = 0.
(c) H(X, g(X)) = H(g(X)) + H(X|g(X)), again by the chain rule.
(d) H(X|g(X)) ≥ 0, with equality iff X is a function of g(X), i.e., g(·) is one-to-one. Hence H(X, g(X)) ≥ H(g(X)).

Combining parts (b) and (d), we obtain H(X) ≥ H(g(X)).

2. Example of joint entropy. Let p(x, y) be given by

             Y = 0    Y = 1
    X = 0     1/3      1/3
    X = 1      0       1/3

Find
(a) H(X), H(Y).
(b) H(X|Y), H(Y|X).
(c) H(X, Y).
(d) H(Y) − H(Y|X).
(e) I(X; Y).

Solution: Example of joint entropy.

(a) H(X) = (2/3) log(3/2) + (1/3) log 3 = 0.918 bits = H(Y).
(b) H(X|Y) = (1/3) H(X|Y = 0) + (2/3) H(X|Y = 1) = 0.667 bits = H(Y|X).
(c) H(X, Y) = 3 × (1/3) log 3 = 1.585 bits.
(d) H(Y) − H(Y|X) = 0.251 bits.
(e) I(X; Y) = H(Y) − H(Y|X) = 0.251 bits.

3. Bytes. The entropy H_a(X) = −Σ_x p(x) log_a p(x) is expressed in bits if the logarithm is to the base 2 and in bytes if the logarithm is to the base 256. What is the relationship of H_2(X) to H_256(X)?

Solution: Bytes.

    H_2(X) = −Σ_x p(x) log_2 p(x)
           = −Σ_x p(x) log_256 p(x) / log_256 2          (a)
           = (−1 / log_256 2) Σ_x p(x) log_256 p(x)
           = H_256(X) / log_256 2,                        (b)

where (a) comes from the change-of-base property of logarithms, log_2 p(x) = log_256 p(x) / log_256 2, and (b) follows from the definition of H_256(X). Since log_256 2 = 1/8, we get H_2(X) = 8 H_256(X).

4. Two looks. Here is a statement about pairwise independence and joint independence. Let X, Y1, and Y2 be binary random variables. If I(X; Y1) = 0 and I(X; Y2) = 0, does it follow that I(X; Y1, Y2) = 0?
(a) Yes or no?
(b) Prove or provide a counterexample.
(c) If I(X; Y1) = 0 and I(X; Y2) = 0 in the above problem, does it follow that I(Y1; Y2) = 0? In other words, if Y1 is independent of X, and if Y2 is independent of X, is it true that Y1 and Y2 are independent?

Solution: Two looks.

(a) The answer is "no".
(b) At first the conjecture seems reasonable enough: if Y1 gives you no information about X, and Y2 gives you no information about X, why should the two of them together give any information? But remember, it is NOT the case that I(X; Y1, Y2) = I(X; Y1) + I(X; Y2). The chain rule for information says instead that I(X; Y1, Y2) = I(X; Y1) + I(X; Y2|Y1), which gives us reason to be skeptical about the conjecture.
This problem is reminiscent of the well-known fact in probability that pairwise independence of three random variables is not sufficient to guarantee that all three are mutually independent. I(X; Y1) = 0 is equivalent to saying that X and Y1 are independent, and similarly for X and Y2. But just because X is pairwise independent of each of Y1 and Y2, it does not follow that X is independent of the vector (Y1, Y2).
Here is a simple counterexample. Let Y1 and Y2 be independent fair coin flips, and let X = Y1 XOR Y2. X is pairwise independent of both Y1 and Y2, but obviously not independent of the vector (Y1, Y2), since X is uniquely determined once you know (Y1, Y2). (A numerical check of this counterexample appears after part (c) below.)
(c) Again the answer is "no". Y1 and Y2 can be arbitrarily dependent on each other and both still be independent of X. For example, let Y1 = Y2 be two observations of the same fair coin flip, and let X be an independent fair coin flip. Then I(X; Y1) = I(X; Y2) = 0 because X is independent of both Y1 and Y2. However, I(Y1; Y2) = H(Y1) − H(Y1|Y2) = H(Y1) = 1 bit.
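The counterexample in part (b) is easy to verify numerically. Below is a small Python sketch, added here as an illustration rather than as part of the original solution set; the helpers marginal and mutual_information are ad hoc names, and the computation simply enumerates the joint pmf of (Y1, Y2, X) with X = Y1 XOR Y2 and evaluates the mutual informations from their definitions.

    from itertools import product
    from math import log2

    # Joint pmf of (Y1, Y2, X): Y1, Y2 i.i.d. fair coin flips, X = Y1 XOR Y2.
    pmf = {}
    for y1, y2 in product([0, 1], repeat=2):
        pmf[(y1, y2, y1 ^ y2)] = 0.25

    def marginal(pmf, idx):
        """Marginal pmf of the coordinates listed in idx."""
        out = {}
        for outcome, p in pmf.items():
            key = tuple(outcome[i] for i in idx)
            out[key] = out.get(key, 0.0) + p
        return out

    def mutual_information(pmf, idx_a, idx_b):
        """I(A; B) in bits, where A and B are the coordinate groups idx_a and idx_b."""
        pa, pb = marginal(pmf, idx_a), marginal(pmf, idx_b)
        pab = marginal(pmf, idx_a + idx_b)
        return sum(p * log2(p / (pa[k[:len(idx_a)]] * pb[k[len(idx_a):]]))
                   for k, p in pab.items())

    # Coordinates: 0 -> Y1, 1 -> Y2, 2 -> X
    print(mutual_information(pmf, [2], [0]))      # I(X; Y1) = 0.0
    print(mutual_information(pmf, [2], [1]))      # I(X; Y2) = 0.0
    print(mutual_information(pmf, [2], [0, 1]))   # I(X; Y1, Y2) = 1.0

Running it prints 0.0, 0.0 and 1.0 bits, matching I(X; Y1) = I(X; Y2) = 0 while I(X; Y1, Y2) = 1.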
5. A measure of correlation. Let X1 and X2 be identically distributed, but not necessarily independent. Let

    ρ = 1 − H(X2|X1) / H(X1).

(a) Show that ρ = I(X1; X2) / H(X1).
(b) Show that 0 ≤ ρ ≤ 1.
(c) When is ρ = 0?
(d) When is ρ = 1?

Solution: A measure of correlation. X1 and X2 are identically distributed and ρ = 1 − H(X2|X1) / H(X1).

(a) ρ = [H(X1) − H(X2|X1)] / H(X1)
      = [H(X2) − H(X2|X1)] / H(X1)     (since H(X1) = H(X2))
      = I(X1; X2) / H(X1).
(b) Since 0 ≤ H(X2|X1) ≤ H(X2) = H(X1), we have 0 ≤ H(X2|X1) / H(X1) ≤ 1, and therefore 0 ≤ ρ ≤ 1.
(c) ρ = 0 iff I(X1; X2) = 0 iff X1 and X2 are independent.
(d) ρ = 1 iff H(X2|X1) = 0 iff X2 is a function of X1. By symmetry, X1 is also a function of X2, i.e., X1 and X2 are in one-to-one correspondence. For example, if X1 = X2 with probability 1, then ρ = 1. Similarly, if the distribution of Xi is symmetric, then X1 = −X2 with probability 1 would also give ρ = 1.

6. The value of a question. Let X ∼ p(x), x = 1, 2, ..., m. We are given a set S ⊆ {1, 2, ..., m}. We ask whether X ∈ S and receive the answer

    Y = 1, if X ∈ S,
    Y = 0, if X ∉ S.

Suppose Pr{X ∈ S} = α.
(a) Find the decrease in uncertainty H(X) − H(X|Y).
(b) Is it true that any set S with a given probability α is as good as any other?

Solution: The value of a question.

(a) H(X) − H(X|Y) = H(Y) − H(Y|X) = H(Y) = H_b(α), where H_b is the binary entropy function; the first equality holds because both sides equal I(X; Y), and H(Y|X) = 0 because Y is a deterministic function of X.
(b) Yes, since the answer depends only on α.

7. Relative entropy is not symmetric. Let the random variable X have three possible outcomes {a, b, c}. Consider two distributions on this random variable:

    Symbol   p(x)   q(x)
      a       1/2    1/3
      b       1/4    1/3
      c       1/4    1/3

Calculate H(p), H(q), D(p‖q) and D(q‖p). Verify that in this case D(p‖q) ≠ D(q‖p).

Solution: Relative entropy is not symmetric.

(a) H(p) = (1/2) log 2 + 2 × (1/4) log 4 = 1.5 bits.
(b) H(q) = 3 × (1/3) log 3 = log 3 = 1.585 bits.
(c) D(p‖q) = (1/2) log(3/2) + 2 × (1/4) log(3/4) = log 3 − 3/2 = 0.0850 bits.
(d) D(q‖p) = (1/3) log(2/3) + 2 × (1/3) log(4/3) = 5/3 − log 3 = 0.0817 bits.

D(p‖q) ≠ D(q‖p), as expected.
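The arithmetic above can be confirmed with a few lines of code. The following Python sketch is an illustration added here (not part of the original solution); H and D are ad hoc helper names that compute entropy and relative entropy in bits directly from their definitions.

    from math import log2

    # Problem 7's two distributions on {a, b, c}.
    p = {"a": 1/2, "b": 1/4, "c": 1/4}
    q = {"a": 1/3, "b": 1/3, "c": 1/3}

    def H(dist):
        """Entropy in bits."""
        return -sum(px * log2(px) for px in dist.values() if px > 0)

    def D(dist1, dist2):
        """Relative entropy D(dist1 || dist2) in bits (assumes dist1 is absolutely continuous w.r.t. dist2)."""
        return sum(px * log2(px / dist2[x]) for x, px in dist1.items() if px > 0)

    print(f"H(p)    = {H(p):.4f} bits")     # 1.5000
    print(f"H(q)    = {H(q):.4f} bits")     # 1.5850
    print(f"D(p||q) = {D(p, q):.4f} bits")  # 0.0850
    print(f"D(q||p) = {D(q, p):.4f} bits")  # 0.0817

The printed values agree with the hand calculation, and in particular D(p‖q) ≠ D(q‖p).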
8. "True or False" questions. Copy each relation and write "true" or "false". If it is true, prove it. If it is false, give a counterexample or prove that the opposite is true.

(a) H(X) ≥ H(X|Y).
(b) H(X) + H(Y) ≤ H(X, Y).
(c) Let X, Y be two independent random variables. Then H(X − Y) ≥ H(X).
(d) Let X, Y, Z be three random variables that satisfy H(X, Y) = H(X) + H(Y) and H(Y, Z) = H(Y) + H(Z). Then H(X, Y, Z) = H(X) + H(Y) + H(Z).
(e) For any X, Y, Z and any deterministic functions f, g: I(X; Y|Z) = I(X, f(X, Y); Y, g(Y, Z)|Z).

Solution to "True or False" questions.

(a) H(X) ≥ H(X|Y) is true. Proof: in class we showed that I(X; Y) ≥ 0, hence H(X) − H(X|Y) = I(X; Y) ≥ 0.
(b) H(X) + H(Y) ≤ H(X, Y) is false. In fact the opposite is true, i.e., H(X) + H(Y) ≥ H(X, Y), since I(X; Y) = H(X) + H(Y) − H(X, Y) ≥ 0.
(c) True. H(X − Y) ≥ H(X − Y|Y) = H(X|Y) = H(X). The inequality follows from the fact that conditioning reduces entropy; the first equality holds because, given Y = y, the map x ↦ x − y is a bijection, so H(X − Y|Y) = H(X|Y); the last equality uses the independence of X and Y.
(d) Let X, Y, Z be three random variables that satisfy H(X, Y) = H(X) + H(Y) and H(Y, Z) = H(Y) + H(Z). Then H(X, Y, Z) = H(X) + H(Y) + H(Z). This is false. A counterexample in the spirit of Problem 4: let X and Y be independent fair coin flips and let Z = X XOR Y. Then X and Y are independent and Y and Z are independent, so H(X, Y) = H(X) + H(Y) = 2 bits and H(Y, Z) = H(Y) + H(Z) = 2 bits, but H(X, Y, Z) = H(X, Y) = 2 bits ≠ 3 bits = H(X) + H(Y) + H(Z) (see the numerical check below).
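As a final numerical check of the counterexample in part (d), here is one more Python sketch in the same spirit as the earlier one; joint_entropy is an ad hoc helper introduced for this illustration, and the distribution is exactly the one described above (X, Y fair and independent, Z = X XOR Y).

    from itertools import product
    from math import log2

    # Counterexample for 8(d): X, Y i.i.d. fair coin flips, Z = X XOR Y.
    # Only 4 of the 8 triples (x, y, z) have positive probability.
    pmf = {(x, y, x ^ y): 0.25 for x, y in product([0, 1], repeat=2)}

    def joint_entropy(pmf, idx):
        """Entropy (in bits) of the coordinates listed in idx."""
        marg = {}
        for outcome, prob in pmf.items():
            key = tuple(outcome[i] for i in idx)
            marg[key] = marg.get(key, 0.0) + prob
        return -sum(p * log2(p) for p in marg.values() if p > 0)

    HX, HY, HZ = (joint_entropy(pmf, [i]) for i in range(3))
    print(joint_entropy(pmf, [0, 1]), HX + HY)          # H(X,Y)   = 2.0 = H(X) + H(Y)
    print(joint_entropy(pmf, [1, 2]), HY + HZ)          # H(Y,Z)   = 2.0 = H(Y) + H(Z)
    print(joint_entropy(pmf, [0, 1, 2]), HX + HY + HZ)  # H(X,Y,Z) = 2.0 != 3.0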
