Lecture 21: Conditional Distributions and Covariance / Correlation
Statistics 104 (Colin Rundel), April 9, 2012

6.3, 6.4 Conditional Distributions

Conditional Probability / Distributions

Let X and Y be random variables, then

Conditional probability:

  P(X = x | Y = y) = P(X = x, Y = y) / P(Y = y)
  f(x|y) = f(x, y) / f_Y(y)

Product rule:

  P(X = x, Y = y) = P(X = x | Y = y) P(Y = y)
  f(x, y) = f(x|y) f_Y(y)

Bayes' Theorem and the Law of Total Probability

Let X and Y be random variables, then

Bayes' Theorem:

  f(x|y) = f(x, y) / f_Y(y) = f(y|x) f_X(x) / f_Y(y)
  f(y|x) = f(x, y) / f_X(x) = f(x|y) f_Y(y) / f_X(x)

Law of Total Probability:

  f_X(x) = ∫_{-∞}^{∞} f(x, y) dy = ∫_{-∞}^{∞} f(x|y) f_Y(y) dy
  f_Y(y) = ∫_{-∞}^{∞} f(x, y) dx = ∫_{-∞}^{∞} f(y|x) f_X(x) dx

Example - Conditional Uniforms

Let X ∼ Unif(0, 1) and Y|X ∼ Unif(x, 1). Find f(x, y), f_Y(y), and f(x|y).
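The worked solution to this example is not in the extracted text. As a hedged check, the short Python sketch below (assuming NumPy is available) simulates the hierarchical model and compares a crude density estimate of the marginal of Y with the closed form implied by the product rule and the law of total probability above: f(x, y) = f(y|x) f_X(x) = 1/(1 - x) on 0 < x < y < 1, hence f_Y(y) = -ln(1 - y) and f(x|y) = 1 / [(1 - x)(-ln(1 - y))].

# Monte Carlo sketch (not from the original slides): check the marginal density
# of Y implied by f(x, y) = 1/(1 - x) on 0 < x < y < 1, namely f_Y(y) = -ln(1 - y).
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

x = rng.uniform(0.0, 1.0, size=n)   # X ~ Unif(0, 1)
y = rng.uniform(x, 1.0)             # Y | X = x ~ Unif(x, 1)

h = 0.01                            # bin width for a crude density estimate
for y0 in [0.1, 0.3, 0.5, 0.7, 0.9]:
    est = np.mean(np.abs(y - y0) < h / 2) / h
    print(f"y = {y0:.1f}: simulated f_Y(y) ~ {est:.3f}, -ln(1 - y) = {-np.log(1 - y0):.3f}")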
Independence and Conditional Distributions

If X and Y are independent random variables then

  f(x|y) = f_X(x)

Justification for this is straightforward:

  f(x|y) = f(x, y) / f_Y(y) = f_X(x) f_Y(y) / f_Y(y) = f_X(x)

6.3, 6.4 Conditional Expectation

Conditional Expectation

For random variables X and Y we define the conditional expectation as follows:

  E(X | Y = y) = Σ_{all x} x f(x|y)          (discrete case)
  E(X | Y = y) = ∫_{-∞}^{∞} x f(x|y) dx      (continuous case)

From deGroot, Sec. 4.7: E(Y|x) is not uniquely defined at values of x where the marginal p.f. or p.d.f. of X is 0, but such x form a set of probability 0, so the definition of E(Y|x) at such points is irrelevant. Definition 4.7.1 (Conditional Expectation/Mean): if the mean of Y exists and is finite, the conditional expectation (or conditional mean) of Y given X = x, denoted E(Y|x), is the expectation of the conditional distribution of Y given X = x. Definition 4.7.2 (Conditional Means as Random Variables): writing h(x) for E(Y|x), the symbol E(Y|X) means h(X) and is called the conditional mean of Y given X; that is, E(Y|X) is a random variable (a function of X) whose value when X = x is E(Y|x). E(X|Y) and E(X|y) are defined analogously. In particular, if we wish to predict Y using a function d(X) of X so as to minimize E([Y - d(X)]^2), then d(X) should be the conditional mean of Y given X.

Example - Family Cars (Example 4.7.2 deGroot)

A collection of 250 households was surveyed, and each household reported the number of members and the number of automobiles owned; the reported counts are in Table 4.1. Let X be the number of members in a randomly selected household and Y be the number of cars owned by that household. Since all 250 surveyed households are equally likely to be selected, P(X = x, Y = y) is the number of households with x members and y cars divided by 250; these probabilities are reported in Table 4.2 together with the marginal p.f.'s f_X(x) and f_Y(y).

Table 4.1 - Reported numbers of household members and automobiles

                              Number of members (x)
  Number of automobiles (y)     1    2    3    4    5    6    7    8
             0                 10    7    3    2    2    1    0    0
             1                 12   21   25   30   25   15    5    1
             2                  1    5   10   15   20   11    5    3
             3                  0    2    3    5    5    3    2    1

Table 4.2 - Joint p.f. f(x, y) of X and Y with marginal p.f.'s f_X(x) and f_Y(y)

   y \ x      1      2      3      4      5      6      7      8     f_Y(y)
     0      0.040  0.028  0.012  0.008  0.008  0.004  0      0      0.100
     1      0.048  0.084  0.100  0.120  0.100  0.060  0.020  0.004  0.536
     2      0.004  0.020  0.040  0.060  0.080  0.044  0.020  0.012  0.280
     3      0      0.008  0.012  0.020  0.020  0.012  0.008  0.004  0.084
   f_X(x)   0.092  0.140  0.164  0.208  0.208  0.120  0.048  0.020

The conditional p.f. of Y given X = 4 is the x = 4 column of Table 4.2 divided by f_X(4) = 0.208, namely 0.0385, 0.5769, 0.2885, 0.0962 for y = 0, 1, 2, 3. We can then calculate the expected number of cars in a household with 4 members as follows:

  E(Y | X = 4) = Σ_{y=0}^{3} y P(Y = y | X = 4) = Σ_{y=0}^{3} y P(X = 4, Y = y) / P(X = 4)
               = 0 × 0.0385 + 1 × 0.5769 + 2 × 0.2885 + 3 × 0.0962 = 1.442

Similarly, we can calculate E(Y | X = x) for the other seven values of x:

  x             1      2      3      4      5      6      7      8
  E(Y|X = x)    0.609  1.057  1.317  1.442  1.538  1.533  1.75   2
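As a check that is not part of the original slides, the following Python sketch (assuming NumPy) reproduces the E(Y | X = x) row above directly from the joint p.f. in Table 4.2, by normalizing each column to get f(y|x) and then averaging y.

# Sketch: recover f_X(x) and E(Y | X = x) from the joint p.f. in Table 4.2.
import numpy as np

# Rows are y = 0, 1, 2, 3; columns are x = 1, ..., 8.
joint = np.array([
    [0.040, 0.028, 0.012, 0.008, 0.008, 0.004, 0.000, 0.000],
    [0.048, 0.084, 0.100, 0.120, 0.100, 0.060, 0.020, 0.004],
    [0.004, 0.020, 0.040, 0.060, 0.080, 0.044, 0.020, 0.012],
    [0.000, 0.008, 0.012, 0.020, 0.020, 0.012, 0.008, 0.004],
])
y_vals = np.arange(4)             # y = 0, 1, 2, 3
f_X = joint.sum(axis=0)           # marginal p.f. of X (the f_X(x) row of Table 4.2)

cond_pf = joint / f_X             # f(y|x), one column per value of x
E_Y_given_X = y_vals @ cond_pf    # E(Y | X = x) for x = 1, ..., 8

print(np.round(f_X, 3))           # 0.092 0.140 0.164 0.208 0.208 0.120 0.048 0.020
print(np.round(E_Y_given_X, 3))   # 0.609 1.057 1.317 1.442 1.538 1.533 1.750 2.000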
Conditional Expectation as a Random Variable

Based on the previous example we can see that the value of E(Y|X) changes depending on the value of x. As such we can think of the conditional expectation as a function of the random variable X, thereby making E(Y|X) itself a random variable, which can be manipulated like any other random variable.

Law of Total Probability for Expectations:

  E[E(Y|X)] = ∫_{-∞}^{∞} E(Y|X = x) f_X(x) dx = ∫_{-∞}^{∞} ∫_{-∞}^{∞} y f(y|x) f_X(x) dy dx
            = ∫_{-∞}^{∞} ∫_{-∞}^{∞} y [f(x, y) / f_X(x)] f_X(x) dy dx = ∫_{-∞}^{∞} ∫_{-∞}^{∞} y f(x, y) dy dx
            = ∫_{-∞}^{∞} y [ ∫_{-∞}^{∞} f(x, y) dx ] dy
            = ∫_{-∞}^{∞} y f_Y(y) dy = E(Y)

Example - Family Cars, cont. (Example 4.7.2 deGroot)

We can confirm the Law of Total Probability for Expectations using the values of E(Y|X = x) and the marginal p.f.'s from the tables above:

  E[E(Y|X)] = Σ_x E(Y|X = x) f_X(x)
            = 0.609 × 0.092 + 1.057 × 0.140 + 1.317 × 0.164 + 1.442 × 0.208 + 1.538 × 0.208
              + 1.533 × 0.120 + 1.75 × 0.048 + 2 × 0.020 ≈ 1.348
  E(Y) = Σ_y y f_Y(y) = 0 × 0.100 + 1 × 0.536 + 2 × 0.280 + 3 × 0.084 = 1.348

Example - Conditional Uniforms, cont.

Let X ∼ Unif(0, 1) and Y|X ∼ Unif(x, 1). Find E(Y).
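The computation of E(Y) is not completed in the extracted text. As a hedged sketch of where the law of total probability for expectations leads here: since Y | X = x ∼ Unif(x, 1), E(Y|X) = (X + 1)/2, so E(Y) = E[(X + 1)/2] = (E(X) + 1)/2 = 3/4. The short Python check below (assuming NumPy) verifies this by simulation.

# Monte Carlo check (not from the original slides) of E(Y) = E[E(Y|X)] = 3/4
# for X ~ Unif(0, 1), Y | X = x ~ Unif(x, 1), where E(Y | X) = (X + 1) / 2.
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000

x = rng.uniform(0.0, 1.0, size=n)   # X ~ Unif(0, 1)
y = rng.uniform(x, 1.0)             # Y | X = x ~ Unif(x, 1)

print(np.mean((x + 1) / 2))         # E[E(Y|X)], close to 0.75
print(np.mean(y))                   # E(Y), close to 0.75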