Jamie R. Wieland, 5-Apr-18, IE 336
Quiz 0 Solutions
Problem 1: Apply the Multiplication Rule.

$$P(A \cap B) = P(B \mid A)\,P(A) = (0.2)(0.3) = 0.06$$
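As a quick numerical sanity check, here is a minimal Monte Carlo sketch of the multiplication rule, assuming $P(A) = 0.2$ and $P(B \mid A) = 0.3$ as in the problem; the value used for $P(B \mid A^c)$ is arbitrary, since only the intersection is estimated.

```python
import random

random.seed(0)
N = 1_000_000
count_both = 0
for _ in range(N):
    a = random.random() < 0.2                    # P(A) = 0.2
    b = random.random() < (0.3 if a else 0.5)    # P(B|A) = 0.3; P(B|not A) is arbitrary here
    if a and b:
        count_both += 1

print(count_both / N)  # should be close to 0.2 * 0.3 = 0.06
```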
Problem 2: Recognize that X is a continuous random variable (rv) and then apply the definition of $E(X)$ for continuous rv's.
$$E(X) = \int_{-\infty}^{\infty} x\, f_X(x)\, dx$$

We can change the limits of integration because X is defined on [0, 1]:

$$E(X) = \int_0^1 x\, f_X(x)\, dx = \int_0^1 x\,(2x)\, dx = 2\int_0^1 x^2\, dx = 2\left[\frac{x^3}{3}\right]_0^1 = 2\left[\frac{1}{3} - 0\right] = \frac{2}{3}$$
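A short symbolic check of this integral, assuming the density implied by the work above, $f_X(x) = 2x$ on [0, 1]:

```python
import sympy as sp

x = sp.symbols('x')
f_X = 2 * x                          # density implied by the integrand x * (2x)
E_X = sp.integrate(x * f_X, (x, 0, 1))
print(E_X)                           # 2/3
```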
Problem 3: Recognize that X is a discrete rv.
$$P(X < -0.3) = P\big((X = -1) \cup (X = -0.5)\big)$$

These events are disjoint, so

$$P(X < -0.3) = P(X = -1) + P(X = -0.5) = \frac{1}{8} + \frac{1}{4} = \frac{3}{8}$$
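The same sum with exact fractions, using only the two probabilities quoted in the solution (the rest of the pmf is not needed, since no other support point lies below -0.3):

```python
from fractions import Fraction

p = Fraction(1, 8) + Fraction(1, 4)   # P(X = -1) + P(X = -0.5)
print(p)                              # 3/8
```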
Problem 4. Given that Y = 2, there is only one possible value for X, which is 1 with probability 1.

$$E(X \mid Y = 2) = (1)(1) = 1$$
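A small sketch of computing a conditional expectation from a joint pmf. The table below is hypothetical (the quiz's actual pmf is not reproduced here); it only mirrors the key feature that X = 1 is the single value compatible with Y = 2.

```python
# Hypothetical joint pmf stored as {(x, y): probability}; the only pair with y = 2 has x = 1.
joint = {(1, 2): 0.25, (0, 1): 0.5, (2, 3): 0.25}

def cond_expectation_x(joint, y):
    """E(X | Y = y) computed from a discrete joint pmf."""
    p_y = sum(p for (x, yy), p in joint.items() if yy == y)
    return sum(x * p for (x, yy), p in joint.items() if yy == y) / p_y

print(cond_expectation_x(joint, 2))   # 1.0
```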
Problem 5. If two random variables are independent, then their joint distribution is just the product of the marginal distributions:

$$f_{X_1, X_2}(x_1, x_2) = f_{X_1}(x_1)\, f_{X_2}(x_2), \quad \text{for every } x_1, x_2$$

Problem 6. Because these two random variables are independent, conditioning on one of them does not provide us with any additional information. So the conditional distribution is the same as the unconditional distribution in this case – this is what it means (in more detail) for the joint distribution to be the product of the marginal distributions.
$$f_{X_2 \mid X_1}(x_2 \mid 3.77) = \frac{f_{X_1, X_2}(3.77, x_2)}{f_{X_1}(3.77)} = \frac{f_{X_1}(3.77)\, f_{X_2}(x_2)}{f_{X_1}(3.77)} = f_{X_2}(x_2)$$
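A numeric illustration of the cancellation above, using made-up marginal densities (an exponential for $X_1$ and a standard normal for $X_2$) purely to show that dividing the product-form joint by the $X_1$ marginal returns the $X_2$ marginal:

```python
import math

def f_x1(x1):          # hypothetical marginal for X1 (exponential, rate 1)
    return math.exp(-x1) if x1 >= 0 else 0.0

def f_x2(x2):          # hypothetical marginal for X2 (standard normal)
    return math.exp(-x2**2 / 2) / math.sqrt(2 * math.pi)

def f_joint(x1, x2):   # independence: joint = product of marginals
    return f_x1(x1) * f_x2(x2)

x2 = 0.8
conditional = f_joint(3.77, x2) / f_x1(3.77)   # f_{X2|X1}(x2 | 3.77)
print(conditional, f_x2(x2))                   # identical values
```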
Problem 7. We know that the expected value of the sample average is 6, because it is just the expectation of a linear combination of random variables, and we do not need independence to show this.
$$E(\bar{Y}) = E\left(\frac{1}{n}\sum_{i=1}^{n} Y_i\right) = \frac{1}{n}\sum_{i=1}^{n} E(Y_i) = \frac{1}{n}\sum_{i=1}^{n} 6 = \frac{1}{n}(6n) = 6$$
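A quick Monte Carlo check of this result, assuming (for simulation only) that each $Y_i$ has mean 6; the specific distribution is arbitrary since only the mean matters here.

```python
import numpy as np

rng = np.random.default_rng(0)
n, reps = 25, 200_000
# Draw many sample averages of n observations with mean 6 (exponential chosen arbitrarily).
samples = rng.exponential(scale=6.0, size=(reps, n))
ybar = samples.mean(axis=1)
print(ybar.mean())   # close to 6
```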
Problem 8: In this case the random variables are i.i.d., so, in addition to the expected value, we can also find the variance of the sample average.
$$V(\bar{Y}) = V\left(\frac{1}{n}\sum_{i=1}^{n} Y_i\right) = \frac{1}{n^2}\sum_{i=1}^{n} V(Y_i) = \frac{1}{n^2}\sum_{i=1}^{n} 36 = \frac{1}{n^2}(36n) = \frac{36}{n}$$
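Extending the same simulation sketch to the variance, assuming i.i.d. $Y_i$ with variance 36 (the exponential with scale 6 happens to have both mean 6 and variance 36):

```python
import numpy as np

rng = np.random.default_rng(1)
n, reps = 25, 200_000
samples = rng.exponential(scale=6.0, size=(reps, n))  # mean 6, variance 36
ybar = samples.mean(axis=1)
print(ybar.var(), 36 / n)   # both close to 36/25 = 1.44
```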
Further note that the distribution of a sum of i.i.d. random variables (and hence of the sample average) is asymptotically normal, by the central limit theorem:
$$\bar{Y} \sim N\!\left(\mu_{\bar{Y}} = 6,\ \sigma_{\bar{Y}} = \frac{6}{\sqrt{n}}\right) \text{ (approximately)}$$
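Finally, a sketch of the normal approximation itself: standardizing the simulated averages by $\mu = 6$ and $\sigma = 6/\sqrt{n}$ should give roughly standard-normal quantiles even though the underlying draws are skewed (again using the hypothetical exponential model above).

```python
import numpy as np

rng = np.random.default_rng(2)
n, reps = 100, 200_000
ybar = rng.exponential(scale=6.0, size=(reps, n)).mean(axis=1)
z = (ybar - 6.0) / (6.0 / np.sqrt(n))           # standardize by mu and sigma / sqrt(n)
print(np.quantile(z, [0.025, 0.5, 0.975]))      # roughly [-1.96, 0, 1.96]
```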