
Chapter 3: More about Discrete Random Variables

Wei-Yang Lin
Department of Computer Science & Information Engineering
mailto:[email protected]

• 3.2 The binomial random variable
• 3.4 Conditional probability
• 3.5 Conditional expectation

• In many problems, the quantity of interest can be expressed in the form $Y = X_1 + \cdots + X_n$, where the $X_i$ are independent Bernoulli($p$) random variables.
• The random variable $Y$ is called a binomial($n, p$) random variable.
• Its probability mass function is

  $p_Y(k) = \binom{n}{k} p^k (1-p)^{n-k}, \quad k = 0, \ldots, n.$  (1)

Example 3.6
• A sample of radioactive material is composed of $n$ molecules.
• Each molecule has probability $p$ of emitting an alpha particle, and the particles are emitted independently.
• Show that the number of particles emitted is a sum of independent Bernoulli random variables.

Solution
• Let $X_i = 1$ if the $i$th molecule emits an alpha particle, and $X_i = 0$ otherwise.
• Then the $X_i$ are independent Bernoulli($p$), and $Y = X_1 + \cdots + X_n$ counts the number of alpha particles emitted.

Homework
• Problems 11, 12, 13, 14.

• 3.2 The binomial random variable
• 3.4 Conditional probability
• 3.5 Conditional expectation

Conditional probability mass functions
• For discrete random variables, we define the conditional probability mass functions

  $p_{X|Y}(x_i|y_j) := P(X = x_i \mid Y = y_j) = \dfrac{P(X = x_i, Y = y_j)}{P(Y = y_j)} = \dfrac{p_{XY}(x_i, y_j)}{p_Y(y_j)}$  (2)

and

  $p_{Y|X}(y_j|x_i) := P(Y = y_j \mid X = x_i) = \dfrac{P(X = x_i, Y = y_j)}{P(X = x_i)} = \dfrac{p_{XY}(x_i, y_j)}{p_X(x_i)}.$  (3)

Example 3.11
• Message $i$ is transmitted using an optical communication system.
• When light of intensity $\lambda_i$ strikes the photodetector, the number of photoelectrons generated is a Poisson($\lambda_i$) random variable.
• Find the conditional probability that the number of photoelectrons observed at the photodetector is less than 2, given that message $i$ was sent.

Solution
• Let $X$ denote the message to be sent, and let $Y$ denote the number of photoelectrons generated by the photodetector.
• The problem statement is telling us that

  $P(Y = n \mid X = i) = \dfrac{\lambda_i^n e^{-\lambda_i}}{n!}, \quad n = 0, 1, 2, \ldots$

• The conditional probability to be calculated is

  $P(Y < 2 \mid X = i) = P(Y = 0 \text{ or } Y = 1 \mid X = i)$
  $= P(Y = 0 \mid X = i) + P(Y = 1 \mid X = i)$
  $= e^{-\lambda_i} + \lambda_i e^{-\lambda_i}.$

The law of total probability
• If $Y$ is a discrete random variable, then for any value $y_j$,

  $P(Y = y_j) = \sum_i P(Y = y_j \mid X = x_i)\, P(X = x_i) = \sum_i p_{Y|X}(y_j|x_i)\, p_X(x_i).$  (4)

Example 3.15
• Radioactive samples give off alpha particles at a rate that depends on the size of the sample.
• For a sample of size $k$, suppose that the number of particles observed is a Poisson random variable $Y$ with parameter $k$.
• If the sample size is a geometric$_1(p)$ random variable $X$, find $P(Y = 0)$ and $P(X = 1 \mid Y = 0)$.

Solution (1/3)
• The problem statement is telling us that $P(Y = n \mid X = k)$ is the Poisson pmf with parameter $k$:

  $P(Y = n \mid X = k) = \dfrac{k^n e^{-k}}{n!}, \quad n = 0, 1, \ldots$

• In particular, note that $P(Y = 0 \mid X = k) = e^{-k}$.

Solution (2/3)
• Now use the law of total probability to write

  $P(Y = 0) = \sum_{k=1}^{\infty} P(Y = 0 \mid X = k)\, P(X = k)$
  $= \sum_{k=1}^{\infty} e^{-k} (1-p) p^{k-1}$
  $= \dfrac{1-p}{p} \sum_{k=1}^{\infty} \left(\dfrac{p}{e}\right)^k$
  $= \dfrac{1-p}{p} \cdot \dfrac{p/e}{1 - p/e} = \dfrac{1-p}{e-p}.$

Solution (3/3)
• Next,

  $P(X = 1 \mid Y = 0) = \dfrac{P(X = 1, Y = 0)}{P(Y = 0)}$
  $= \dfrac{P(Y = 0 \mid X = 1)\, P(X = 1)}{P(Y = 0)}$
  $= e^{-1} \cdot (1-p) \cdot \dfrac{e-p}{1-p}$
  $= 1 - \dfrac{p}{e}.$

The substitution law
• Often $Z$ is a function of $X$ and $Y$, say $Z = g(X, Y)$, and we are interested in $P(Z = z)$. By the law of total probability,

  $P(Z = z) = \sum_i P(Z = z \mid X = x_i)\, P(X = x_i) = \sum_i P(g(X, Y) = z \mid X = x_i)\, P(X = x_i).$

• We claim that

  $P(g(X, Y) = z \mid X = x_i) = P(g(x_i, Y) = z \mid X = x_i).$  (5)

• This property is known as the substitution law of conditional probability.

Example 3.17
• A random, integer-valued signal $X$ is transmitted over a channel subject to independent, additive, integer-valued noise $Y$.
• The received signal is $Z = X + Y$.
• To estimate $X$ based on the received value $Z$, the system designer wants to use the conditional pmf $p_{X|Z}$.
• Find the desired conditional pmf (a numerical sketch follows the solution).

Solution (1/3)
• Let $X$ and $Y$ be independent, discrete, integer-valued random variables with pmfs $p_X$ and $p_Y$, respectively. Then

  $p_{X|Z}(i|j) = P(X = i \mid Z = j)$
  $= \dfrac{P(X = i, Z = j)}{P(Z = j)}$
  $= \dfrac{P(Z = j \mid X = i)\, P(X = i)}{P(Z = j)}$
  $= \dfrac{P(Z = j \mid X = i)\, p_X(i)}{P(Z = j)}.$

Solution (2/3)
• Use the substitution law followed by independence to write

  $P(Z = j \mid X = i) = P(X + Y = j \mid X = i)$
  $= P(i + Y = j \mid X = i)$
  $= P(Y = j - i \mid X = i)$
  $= \dfrac{P(Y = j - i, X = i)}{P(X = i)}$
  $= \dfrac{P(Y = j - i)\, P(X = i)}{P(X = i)}$
  $= P(Y = j - i) = p_Y(j - i).$

Solution (3/3)
• Use the law of total probability to compute

  $p_Z(j) = \sum_i P(Z = j \mid X = i)\, P(X = i) = \sum_i p_Y(j - i)\, p_X(i).$

• It follows that

  $p_{X|Z}(i|j) = \dfrac{p_Y(j - i)\, p_X(i)}{\sum_k p_Y(j - k)\, p_X(k)}.$
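To make the result of Example 3.17 concrete, here is a minimal Python sketch. The pmfs chosen for the signal $X$ and the noise $Y$ are hypothetical, not from the text; the functions p_Z and p_X_given_Z simply transcribe the two formulas of Solution (3/3).

```python
# Numerical check of Example 3.17:
#   p_{X|Z}(i|j) = p_Y(j-i) p_X(i) / sum_k p_Y(j-k) p_X(k).
# The signal and noise pmfs below are hypothetical choices for illustration.

p_X = {0: 0.5, 1: 0.5}             # signal: equally likely values 0 and 1
p_Y = {-1: 0.25, 0: 0.5, 1: 0.25}  # noise: integer-valued, centered at 0

def p_Z(j):
    """pmf of Z = X + Y via the law of total probability (a discrete convolution)."""
    return sum(p_Y.get(j - i, 0.0) * p_X[i] for i in p_X)

def p_X_given_Z(i, j):
    """Conditional pmf p_{X|Z}(i|j) from the final formula of Example 3.17."""
    return p_Y.get(j - i, 0.0) * p_X[i] / p_Z(j)

# If Z = 1 is received, which signal value is more likely?
for i in p_X:
    print(f"P(X={i} | Z=1) = {p_X_given_Z(i, 1):.4f}")
```

With these pmfs, receiving $Z = 1$ gives $P(X = 1 \mid Z = 1) = 2/3$ versus $P(X = 0 \mid Z = 1) = 1/3$, which is exactly how a designer would use $p_{X|Z}$ to estimate the transmitted signal.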
Homework
• Problems 23, 24, 27, 30, 31.

• 3.2 The binomial random variable
• 3.4 Conditional probability
• 3.5 Conditional expectation

Conditional expectation
• Just as we developed expectation for discrete random variables, we can develop conditional expectation in the same way.
• This leads to the formula

  $E[g(Y) \mid X = x_i] = \sum_j g(y_j)\, p_{Y|X}(y_j|x_i).$  (6)

Example 3.19
• The random number $Y$ of alpha particles emitted by a radioactive sample is conditionally Poisson($k$) given that the sample size $X = k$.
• Find $E[Y \mid X = k]$.

Solution
• We must compute

  $E[Y \mid X = k] = \sum_n n\, P(Y = n \mid X = k),$

where

  $P(Y = n \mid X = k) = \dfrac{k^n e^{-k}}{n!}, \quad n = 0, 1, \ldots$

• Hence,

  $E[Y \mid X = k] = \sum_{n=0}^{\infty} n\, \dfrac{k^n e^{-k}}{n!} = k.$

Substitution law for conditional expectation
• We call

  $E[g(X, Y) \mid X = x_i] = E[g(x_i, Y) \mid X = x_i]$  (7)

the substitution law for conditional expectation.

Law of total probability for expectation
• In Section 3.4 we discussed the law of total probability, which shows how to compute probabilities in terms of conditional probabilities.
• We now derive the analogous formula for expectation:

  $\sum_i E[g(X, Y) \mid X = x_i]\, p_X(x_i) = \sum_i E[g(x_i, Y) \mid X = x_i]\, p_X(x_i)$
  $= \sum_i \Big[ \sum_j g(x_i, y_j)\, p_{Y|X}(y_j|x_i) \Big] p_X(x_i)$
  $= \sum_i \sum_j g(x_i, y_j)\, p_{XY}(x_i, y_j)$
  $= E[g(X, Y)].$

• Hence, the law of total probability for expectation is

  $E[g(X, Y)] = \sum_i E[g(X, Y) \mid X = x_i]\, p_X(x_i).$  (8)

Homework
• Problems 40, 41, 42.
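As a closing numerical check, the sketch below verifies the closed form of Example 3.15 and applies equation (8) to Example 3.19. The parameter value p = 0.3 and the truncation point K = 200 are arbitrary choices of ours; truncating the infinite sums is an approximation, but the geometric tail makes the error negligible here.

```python
# Numerical check of Example 3.15 and of equation (8) applied to Example 3.19.
import math

p = 0.3   # hypothetical parameter for the geometric_1(p) sample size X
K = 200   # truncation point for the infinite sums (arbitrary but more than enough)

def p_X(k):
    """geometric_1(p) pmf: P(X = k) = (1 - p) p^(k-1), k = 1, 2, ..."""
    return (1 - p) * p ** (k - 1)

# P(Y = 0) by the law of total probability, against the closed form (1-p)/(e-p).
total = sum(math.exp(-k) * p_X(k) for k in range(1, K + 1))
print(total, (1 - p) / (math.e - p))

# E[Y] by the law of total probability for expectation (8):
# E[Y | X = k] = k (Example 3.19), so E[Y] = sum_k k P(X = k) = E[X] = 1/(1-p).
E_Y = sum(k * p_X(k) for k in range(1, K + 1))
print(E_Y, 1 / (1 - p))
```

Both printed pairs agree to many decimal places, confirming that conditioning on the sample size and then averaging reproduces the unconditional quantities.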