Chapter 5: Multivariate Distributions
Professor Ron Fricker
Naval Postgraduate School
Monterey, California
3/15/15

Reading Assignment: Sections 5.1 – 5.12

Goals for this Chapter
• Bivariate and multivariate probability distributions
  – Bivariate normal distribution
  – Multinomial distribution
• Marginal and conditional distributions
• Independence, covariance and correlation
• Expected value & variance of a function of r.v.s
  – Linear functions of r.v.s in particular
• Conditional expectation and variance

Section 5.1: Bivariate and Multivariate Distributions
• A bivariate distribution is a probability distribution on two random variables
  – I.e., it gives the probability of the simultaneous outcome of the two random variables
• For example, when playing craps a gambler might want to know the probability that, in a simultaneous roll of two dice, each comes up 1
• Another example: in an air attack on a bunker, the probability that the bunker is not hardened and does not have SAM protection is of interest
• A multivariate distribution is a probability distribution for more than two r.v.s

Joint Probabilities
• Bivariate and multivariate distributions are joint probabilities – the probability that two or more events occur
  – It's the probability of the intersection of n ≥ 2 events: {Y1 = y1}, {Y2 = y2}, ..., {Yn = yn}
  – We'll denote the joint (discrete) probabilities as P(Y1 = y1, Y2 = y2, ..., Yn = yn)
• We'll sometimes use the shorthand notation p(y1, y2, ..., yn)
  – This is the probability that the event {Y1 = y1} and the event {Y2 = y2} and ... and the event {Yn = yn} all occur simultaneously

Section 5.2: Bivariate and Multivariate Distributions
• A simple example of a bivariate distribution arises when throwing two dice
  – Let Y1 be the number of spots on die 1
  – Let Y2 be the number of spots on die 2
• Then there are 36 possible joint outcomes (remember the mn rule: 6 x 6 = 36)
  – The outcomes (y1, y2) are (1,1), (1,2), (1,3), ..., (6,6)
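The outcome count above follows directly from the mn rule. As a quick illustrative check (a Python sketch of my own, not part of the slides), enumerating the Cartesian product of the two dice gives exactly 36 ordered pairs, in the order listed:

```python
from itertools import product

# All joint outcomes (y1, y2) for two dice, per the mn rule: 6 * 6 = 36.
outcomes = list(product(range(1, 7), range(1, 7)))

print(len(outcomes))               # 36
print(outcomes[:3], outcomes[-1])  # [(1, 1), (1, 2), (1, 3)] (6, 6)
```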
• Assuming the dice are independent, all the outcomes are equally likely, so
  p(y1, y2) = P(Y1 = y1, Y2 = y2) = 1/36,  y1 = 1, 2, ..., 6,  y2 = 1, 2, ..., 6

Bivariate PMF for the Example
[Figure: the joint pmf p(y1, y2) = 1/36 over the 6 x 6 grid of outcomes. Source: Wackerly, D.D., W.M. Mendenhall III, and R.L. Scheaffer (2008). Mathematical Statistics with Applications, 7th edition, Thomson Brooks/Cole.]

Defining Joint Probability Functions (for Discrete Random Variables)
• Definition 5.1: Let Y1 and Y2 be discrete random variables. The joint (or bivariate) probability (mass) function for Y1 and Y2 is
  p(y1, y2) = P(Y1 = y1, Y2 = y2),  −∞ < y1 < ∞,  −∞ < y2 < ∞
• Theorem 5.1: If Y1 and Y2 are discrete r.v.s with joint probability function p(y1, y2), then
  1. p(y1, y2) ≥ 0 for all y1, y2
  2. Σ_{y1, y2} p(y1, y2) = 1, where the sum is over all non-zero p(y1, y2)

Back to the Dice Example
• Find P(2 ≤ Y1 ≤ 3, 1 ≤ Y2 ≤ 2).
• Solution:

Textbook Example 5.1
• A NEX has 3 checkout counters.
  – Two customers arrive independently at the counters at different times when no other customers are present.
  – Let Y1 denote the number (of the two) who choose counter 1 and let Y2 be the number who choose counter 2.
  – Find the joint distribution of Y1 and Y2.
• Solution:

Textbook Example 5.1 Solution Continued

Joint Distribution Functions
• Definition 5.2: For any random variables (discrete or continuous) Y1 and Y2, the joint (bivariate) (cumulative) distribution function is
  F(y1, y2) = P(Y1 ≤ y1, Y2 ≤ y2),  −∞ < y1 < ∞,  −∞ < y2 < ∞
• For two discrete r.v.s Y1 and Y2,
  F(y1, y2) = Σ_{t1 ≤ y1} Σ_{t2 ≤ y2} p(t1, t2)

Back to the Dice Example
• Find F(2, 3) = P(Y1 ≤ 2, Y2 ≤ 3).
• Solution:

Textbook Example 5.2
• Back to Example 5.1. Find F(−1, 2), F(1.5, 2), and F(5, 7).
• Solution:

Properties of Joint CDFs
• Theorem 5.2: If Y1 and Y2 are random variables with joint cdf F(y1, y2), then
  1. F(−∞, −∞) = F(y1, −∞) = F(−∞, y2) = 0
  2. F(∞, ∞) = 1
  3.
If y1* ≥ y1 and y2* ≥ y2, then
     F(y1*, y2*) − F(y1*, y2) − F(y1, y2*) + F(y1, y2) ≥ 0
• Property 3 follows because
  F(y1*, y2*) − F(y1*, y2) − F(y1, y2*) + F(y1, y2) = P(y1 < Y1 ≤ y1*, y2 < Y2 ≤ y2*) ≥ 0

Joint Probability Density Functions
• Definition 5.3: Let Y1 and Y2 be continuous random variables with joint distribution function F(y1, y2). If there exists a non-negative function f(y1, y2) such that
  F(y1, y2) = ∫_{t1=−∞}^{y1} ∫_{t2=−∞}^{y2} f(t1, t2) dt2 dt1
  for all −∞ < y1 < ∞, −∞ < y2 < ∞, then Y1 and Y2 are said to be jointly continuous random variables. The function f(y1, y2) is called the joint probability density function.

Properties of Joint PDFs
• Theorem 5.3: If Y1 and Y2 are random variables with joint pdf f(y1, y2), then
  1. f(y1, y2) ≥ 0 for all y1, y2
  2. ∫_{−∞}^{∞} ∫_{−∞}^{∞} f(y1, y2) dy1 dy2 = 1
• [Figure: an illustrative joint pdf f(y1, y2), a surface in 3 dimensions over the (y1, y2) plane]

Volumes Correspond to Probabilities
• For jointly continuous random variables, volumes under the pdf surface correspond to probabilities
• E.g., for the pdf in Figure 5.2, the probability P(a1 ≤ Y1 ≤ a2, b1 ≤ Y2 ≤ b2) corresponds to the volume shown
• It's equal to
  ∫_{b1}^{b2} ∫_{a1}^{a2} f(y1, y2) dy1 dy2

Textbook Example 5.3
• Suppose a radioactive particle is located in a square with sides of unit length. Let Y1 and Y2 denote the particle's location and assume it is uniformly distributed in the square:
  f(y1, y2) = 1 for 0 ≤ y1 ≤ 1, 0 ≤ y2 ≤ 1, and 0 elsewhere
a. Sketch the pdf
b. Find F(0.2, 0.4)
c. Find P(0.1 ≤ Y1 ≤ 0.3, 0.0 ≤ Y2 ≤ 0.5)

Textbook Example 5.3 Solution

Textbook Example 5.3 Solution (cont'd)

[Figure from Wackerly et al. (2008)]
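The uniform-square probabilities in Example 5.3 can also be checked numerically. The sketch below (an illustrative Python check of my own, not part of the slides) approximates the double integrals from Definition 5.3 with a midpoint Riemann sum; for this constant density the answers reduce to rectangle areas, F(0.2, 0.4) = 0.2 × 0.4 = 0.08 and P(0.1 ≤ Y1 ≤ 0.3, 0 ≤ Y2 ≤ 0.5) = 0.2 × 0.5 = 0.1.

```python
def midpoint_double_integral(f, a1, a2, b1, b2, n=200):
    """Approximate the integral of f(y1, y2) over [a1, a2] x [b1, b2]
    with an n-by-n midpoint Riemann sum."""
    dy1, dy2 = (a2 - a1) / n, (b2 - b1) / n
    total = 0.0
    for i in range(n):
        y1 = a1 + (i + 0.5) * dy1
        for j in range(n):
            y2 = b1 + (j + 0.5) * dy2
            total += f(y1, y2) * dy1 * dy2
    return total

def f_uniform(y1, y2):
    # Joint pdf from Example 5.3: uniform on the unit square.
    return 1.0 if 0.0 <= y1 <= 1.0 and 0.0 <= y2 <= 1.0 else 0.0

# b. F(0.2, 0.4) = P(Y1 <= 0.2, Y2 <= 0.4) -> 0.08
F_02_04 = midpoint_double_integral(f_uniform, 0.0, 0.2, 0.0, 0.4)

# c. P(0.1 <= Y1 <= 0.3, 0.0 <= Y2 <= 0.5) -> 0.1
prob_c = midpoint_double_integral(f_uniform, 0.1, 0.3, 0.0, 0.5)
```

The same integrator works for any bivariate pdf whose support is encoded in the function itself (returning 0 outside the support region).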
Textbook Example 5.4
• Gasoline is stored on a FOB in a bulk tank.
  – Let Y1 denote the proportion of the tank available at the beginning of the week after restocking.
  – Let Y2 denote the proportion of the tank that is dispensed over the week.
  – Note that Y1 and Y2 must be between 0 and 1, and Y2 must be less than or equal to Y1.
  – Let the joint pdf be
    f(y1, y2) = 3y1 for 0 ≤ y2 ≤ y1 ≤ 1, and 0 elsewhere
a. Sketch the pdf
b. Find P(0 ≤ Y1 ≤ 0.5, 0.25 ≤ Y2)

Textbook Example 5.4 Solution
• First, sketching the pdf:

[Figure from Wackerly et al. (2008)]

Textbook Example 5.4 Solution
• Now, to solve for the probability:

Calculating Probabilities from Multivariate Distributions
• Finding probabilities from a bivariate pdf requires double integration over the proper region of distributional support
• For multivariate distributions, the idea is the same – you just have to integrate over n dimensions:
  P(Y1 ≤ y1, Y2 ≤ y2, ..., Yn ≤ yn) = F(y1, y2, ..., yn)
    = ∫_{−∞}^{y1} ∫_{−∞}^{y2} ··· ∫_{−∞}^{yn} f(t1, t2, ..., tn) dtn ... dt1

Section 5.2 Homework
• Do problems 5.1, 5.4, 5.5, 5.9

Section 5.3: Marginal and Conditional Distributions
• Marginal distributions connect the concept of (bivariate) joint distributions to univariate distributions (such as those in Chapters 3 & 4)
  – As we'll see, in the discrete case, the name "marginal distribution" follows from summing across rows or down columns of a table
• Conditional distributions are what arise when, in a joint distribution, we fix the value of one of the random variables
  – The name follows from traditional lexicon like "The distribution of Y1, conditional on Y2 = y2, is ..."

An Illustrative Example of Marginal Distributions
• Remember the dice tossing experiment of the previous section,
where we defined
  – Y1 to be the number of spots on die 1
  – Y2 to be the number of spots on die 2
• There are 36 possible joint outcomes (y1, y2), which are (1,1), (1,2), (1,3), ..., (6,6)
• Assuming the dice are independent, the joint distribution is
  p(y1, y2) = P(Y1 = y1, Y2 = y2) = 1/36,  y1 = 1, 2, ..., 6,  y2 = 1, 2, ..., 6

An Illustrative Example (cont'd)
• Now, we might want to know the probability of the univariate event P(Y1 = y1)
• Given that all of the joint events are mutually exclusive, we have that
  P(Y1 = y1) = Σ_{y2=1}^{6} p(y1, y2)
  – For example,
    P(Y1 = 1) = p1(1) = Σ_{y2=1}^{6} p(1, y2)
      = p(1,1) + p(1,2) + p(1,3) + p(1,4) + p(1,5) + p(1,6)
      = 1/36 + 1/36 + 1/36 + 1/36 + 1/36 + 1/36 = 1/6

An Illustrative Example (cont'd)
• In tabular form:

                          y1
    y2  |   1     2     3     4     5     6  | Total
   -----+------------------------------------+------
     1  | 1/36  1/36  1/36  1/36  1/36  1/36 | 1/6
     2  | 1/36  1/36  1/36  1/36  1/36  1/36 | 1/6
     3  | 1/36  1/36  1/36  1/36  1/36  1/36 | 1/6
     4  | 1/36  1/36  1/36  1/36  1/36  1/36 | 1/6
     5  | 1/36  1/36  1/36  1/36  1/36  1/36 | 1/6
     6  | 1/36  1/36  1/36  1/36  1/36  1/36 | 1/6
   -----+------------------------------------+------
  Total | 1/6   1/6   1/6   1/6   1/6   1/6  |

• The body of the table is the joint distribution of Y1 and Y2: p(y1, y2)
• The column totals give the marginal distribution of Y1: p1(y1)
• The row totals give the marginal distribution of Y2: p2(y2)

Defining Marginal Probability Functions
• Definition 5.4a: Let Y1 and Y2 be discrete random variables with pmf p(y1, y2).
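The row/column-total idea behind marginal distributions is easy to check in code. Below is a small sketch (illustrative Python of my own, not from the slides) that builds the dice joint pmf, recovers the marginal p1(y1) by summing over y2 exactly as in the table above, and also evaluates the earlier unsolved dice query P(2 ≤ Y1 ≤ 3, 1 ≤ Y2 ≤ 2):

```python
from fractions import Fraction

# Joint pmf of two independent fair dice: p(y1, y2) = 1/36 for y1, y2 in 1..6.
p = {(y1, y2): Fraction(1, 36) for y1 in range(1, 7) for y2 in range(1, 7)}

# Marginal pmf of Y1: sum the joint pmf over y2 (the column totals in the table).
p1 = {y1: sum(p[(y1, y2)] for y2 in range(1, 7)) for y1 in range(1, 7)}
# Each p1[y1] is 6 * (1/36) = 1/6, matching the table's "Total" row.

# Earlier example: P(2 <= Y1 <= 3, 1 <= Y2 <= 2) sums four cells of 1/36.
prob = sum(pr for (y1, y2), pr in p.items() if 2 <= y1 <= 3 and 1 <= y2 <= 2)

print(p1[1])  # 1/6
print(prob)   # 1/9
```

Using exact Fraction arithmetic keeps the results as the same fractions that appear in the table, with no floating-point rounding.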