Chapter 5: Multivariate Distributions
Professor Ron Fricker Naval Postgraduate School Monterey, California
3/15/15

Reading Assignment: Sections 5.1 – 5.12

Goals for this Chapter
• Bivariate and multivariate probability distributions
  – Bivariate normal distribution
  – Multinomial distribution
• Marginal and conditional distributions
• Independence, covariance, and correlation
• Expected value and variance of a function of r.v.s
  – Linear functions of r.v.s in particular
• Conditional expectation and variance
Section 5.1: Bivariate and Multivariate Distributions
• A bivariate distribution is a probability distribution on two random variables
  – I.e., it gives the probability of the simultaneous outcomes of the two random variables
• For example, when playing craps a gambler might want to know the probability that, in a simultaneous roll of two dice, each comes up 1
• Another example: in an air attack on a bunker, the probability that the bunker is not hardened and does not have SAM protection is of interest
• A multivariate distribution is a probability distribution for more than two r.v.s
Joint Probabilities
• Bivariate and multivariate distributions are joint probabilities – the probability that two or more events occur
  – It's the probability of the intersection of n ≥ 2 events: {Y1 = y1}, {Y2 = y2}, ..., {Yn = yn}
  – We'll denote the joint (discrete) probabilities as P(Y1 = y1, Y2 = y2, ..., Yn = yn)
• We'll sometimes use the shorthand notation p(y1, y2, ..., yn)
  – This is the probability that the event {Y1 = y1} and the event {Y2 = y2} and … and the event {Yn = yn} all occurred simultaneously
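As a minimal sketch of the idea above, a discrete joint probability function can be represented as a dictionary keyed by outcome tuples; looking up a key gives the probability that all the component events occur simultaneously. (This two-coin example is hypothetical, chosen only to illustrate the notation.)

```python
from fractions import Fraction
from itertools import product

# Joint pmf p(y1, y2) for two independent fair coins (1 = heads, 0 = tails),
# stored as a dict keyed by the outcome pair (y1, y2).
p = {(y1, y2): Fraction(1, 4) for y1, y2 in product((0, 1), repeat=2)}

# P(Y1 = 1, Y2 = 1): the probability that the events {Y1 = 1} and {Y2 = 1}
# both occur simultaneously -- the intersection of the two events.
print(p[(1, 1)])        # 1/4

# A valid joint pmf must sum to 1 over all outcome pairs.
print(sum(p.values()))  # 1
```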
Section 5.2: Bivariate and Multivariate Distributions
• A simple example of a bivariate distribution arises when throwing two dice
  – Let Y1 be the number of spots on die 1
  – Let Y2 be the number of spots on die 2
• Then there are 36 possible joint outcomes (remember the mn rule: 6 x 6 = 36)
  – The outcomes (y1, y2) are (1,1), (1,2), (1,3), …, (6,6)
• Assuming the dice are independent, all the outcomes are equally likely, so p(y1, y2) = P(Y1 = y1, Y2 = y2) = 1/36 for y1 = 1, 2, ..., 6 and y2 = 1, 2, ..., 6
Bivariate PMF for the Example
[Figure: plot of the bivariate PMF for the two-dice example. Source: Wackerly, D.D., W.M. Mendenhall III, and R.L. Scheaffer (2008). Mathematical Statistics with Applications, 7th edition, Thomson Brooks/Cole.]

Defining Joint Probability Functions (for Discrete Random Variables)
• Definition 5.1: Let Y1 and Y2 be discrete random variables. The joint (or bivariate) probability (mass) function for Y1 and Y2 is P(Y1 = y1, Y2 = y2),