
Random Variables (Chapter 2)

Random variable = A real-valued function of the outcome of an experiment:

X = f(outcome)

Domain of X: the sample space of the experiment.

Ex: Consider an experiment consisting of 3 Bernoulli trials.

Bernoulli trial = Only two possible outcomes – success (S) or failure (F).

• “IF” statement: if … then “S” else “F”.
• Examine each component. S = “acceptable”, F = “defective”.
• Transmit binary digits through a communication channel. S = “digit received correctly”, F = “digit received incorrectly”.

Suppose the trials are independent and each trial has probability ½ of success.

X = # successes observed in the experiment.

Possible values:

Outcome   Value of X
(SSS)     3
(SSF)     2
(SFS)     2
…         …
(FFF)     0

Random variable:

• Assigns a real number to each outcome in S.
• Denoted by X, Y, Z, etc., and its values by x, y, z, etc.
• Its value depends on chance.
• Its value becomes available once the experiment is completed and the outcome is known.
• The probabilities of its values are determined by the probabilities of the outcomes in the sample space.

Probability distribution of X = A table, formula, or graph that summarizes how the total probability of 1 is distributed over all the possible values of X.

In the Bernoulli trials example, what is the distribution of X?
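As a quick check (not part of the notes themselves), the distribution can be tabulated by brute-force enumeration of the 8 equally likely outcomes; a minimal Python sketch:

```python
from itertools import product

# All 2**3 equally likely outcomes of 3 independent trials with P(S) = 1/2
outcomes = list(product("SF", repeat=3))

pmf = {}
for outcome in outcomes:
    x = outcome.count("S")              # X = number of successes
    pmf[x] = pmf.get(x, 0) + 1 / len(outcomes)

print(pmf)   # {3: 0.125, 2: 0.375, 1: 0.375, 0: 0.125}
```

This matches the binomial probabilities C(3, x)(½)³: 1/8, 3/8, 3/8, 1/8 for x = 0, 1, 2, 3.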

Two types of random variables:

Discrete rv = Takes finite or countable number of values

• Number of jobs in a queue • Number of errors • Number of successes, etc.

Continuous rv = Takes all values in an interval – i.e., it has an uncountable number of possible values.

• Execution time • Waiting time • Miles per gallon • Distance traveled, etc.

Discrete random variables

X = A discrete rv.

• Probability mass function (PMF) of X = the probability distribution of X.
• Notation: p(x) = P(X = x) = probability that the rv X takes the value x.
• Once we have the PMF, we can compute any probability of interest.
• 0 ≤ p(x) ≤ 1, and total probability = ∑x p(x) = 1.

Ex: “Pick Six” in Texas Lottery. In this Lottery, a player picks six numbers from the numbers 1 through 50 with no repetitions and pays $1.00. On Wednesday evenings, the Texas State Lottery Commission televises one of their employees randomly picking six balls without replacement, each with a number from 1 to 50 on it, from a large hopper. The player is paid if three or more of his/her numbers match the selected balls. Suppose X = # of matches a player has with the selected balls.

(a) What is the PMF of X?

Value of X       # of outcomes in sample space       p(x)
(# of matches)   that give this value of X
0                C(6,0)·C(44,6) =  7,059,052         0.44422536452
1                C(6,1)·C(44,5) =  6,516,048         0.41005418264
2                C(6,2)·C(44,4) =  2,036,265         0.12814193207
3                C(6,3)·C(44,3) =    264,880         0.01666886921
4                C(6,4)·C(44,2) =     14,190         0.00089297514
5                C(6,5)·C(44,1) =        264         0.00001661349
6                C(6,6)·C(44,0) =          1         0.00000006293
Total            C(50,6)        = 15,890,700         1.00000000000

In this example, in general, p(x) = P(X = x) = C(6, x)·C(44, 6 − x) / C(50, 6), for x = 0, 1, …, 6.

(b) What is the probability of winning nothing?
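A sketch of both parts, assuming the hypergeometric count C(6, x)·C(44, 6 − x) out of C(50, 6) implied by the setup, and reading “winning nothing” as fewer than 3 matches:

```python
from math import comb

N, K, n = 50, 6, 6            # 50 numbers, 6 winning, player picks 6
total = comb(N, n)            # C(50, 6) = 15,890,700 equally likely draws

def p(x):
    # x matches: choose x of the 6 winning numbers and 6 - x of the 44 others
    return comb(K, x) * comb(N - K, n - x) / total

# (a) tabulate the PMF
for x in range(7):
    print(x, p(x))

# (b) "winning nothing" = fewer than 3 matches
print(p(0) + p(1) + p(2))     # about 0.9824
```

So a player wins nothing roughly 98.2% of the time.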

Cumulative distribution function (CDF) of X is defined as

F(x) = P(X ≤ x) = ∑ y≤x p(y)

• Gives the total probability on the left of x, i.e., the probability that the observed value of X is at most x.
• F(x) is non-decreasing.
• F jumps by p(x) at the point x.
• F(−∞) = 0, and F(+∞) = 1.
• P(a < X ≤ b) = F(b) − F(a).

Ex: Consider a rv X whose CDF is the following.

        { 0     x < 1
        { 1/6   1 ≤ x < 2
        { 2/6   2 ≤ x < 3
F(x) =  { 3/6   3 ≤ x < 4
        { 4/6   4 ≤ x < 5
        { 5/6   5 ≤ x < 6
        { 1     6 ≤ x

(a) What is the PMF of X?

(b) Find P(X will be greater than 3, but no more than 5).
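One way to work both parts (a sketch, assuming the staircase CDF above with jumps of 1/6 at x = 1, …, 6):

```python
from fractions import Fraction

def F(x):
    # CDF from the notes: steps of height 1/6 at x = 1, 2, ..., 6
    if x < 1:
        return Fraction(0)
    return Fraction(min(int(x), 6), 6)

# (a) PMF = size of the jump at each point; since F is flat between
#     integers, F(x) - F(x - 1) is exactly the jump at x
pmf = {x: F(x) - F(x - 1) for x in range(1, 7)}

# (b) P(3 < X <= 5) = F(5) - F(3)
print(pmf, F(5) - F(3))   # every p(x) = 1/6, and the answer is 2/6 = 1/3
```

Note that X is uniform on {1, …, 6}, like the roll of a fair die.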

Random vectors and joint distributions

If X, Y = random variables then (X, Y) = random vector.

It has joint PMF p(x, y) = P{(X, Y) = (x, y)} = P(X = x and Y = y)

• 0 ≤ p(x, y) ≤ 1
• ∑(x,y) p(x, y) = total probability = 1
• P[(X, Y) ∈ A] = ∑(x,y)∈A p(x, y)

From the joint PMF, we can obtain the marginal PMF’s of X and Y: pX(x) = P(X = x) = ∑y p(x, y), and pY(y) = P(Y = y) = ∑x p(x, y).

From the marginal PMF’s, we may not be able to obtain the joint PMF.

Ex: Consider a program with two modules, having module execution times X and Y minutes, respectively. The following table describes their joint PMF:

p(x, y)   y = 1   y = 2   y = 3   y = 4
x = 1     1/4     1/16    1/16    1/8
x = 2     1/16    1/8     1/4     1/16

(a) Find P(Y ≥ X).

(b) Suppose that the two modules are run concurrently. Find the probability that the total execution time of the program is 3 minutes.


(c) Find the marginal PMF of X.
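A sketch of all three parts, reading “run concurrently” in (b) as total time = max(X, Y) (an assumption; if the modules ran one after the other the total would be X + Y):

```python
from fractions import Fraction

# Joint PMF from the table: X in {1, 2}, Y in {1, 2, 3, 4}
p = {
    (1, 1): Fraction(1, 4),  (1, 2): Fraction(1, 16),
    (1, 3): Fraction(1, 16), (1, 4): Fraction(1, 8),
    (2, 1): Fraction(1, 16), (2, 2): Fraction(1, 8),
    (2, 3): Fraction(1, 4),  (2, 4): Fraction(1, 16),
}
assert sum(p.values()) == 1          # sanity check: total probability is 1

# (a) P(Y >= X): add p(x, y) over all pairs with y >= x
a = sum(q for (x, y), q in p.items() if y >= x)

# (b) concurrent modules: total time = max(X, Y); want P(max(X, Y) = 3)
b = sum(q for (x, y), q in p.items() if max(x, y) == 3)

# (c) marginal PMF of X: sum the joint PMF over y
px = {x: sum(q for (xx, _), q in p.items() if xx == x) for x in (1, 2)}

print(a, b, px)   # 15/16 5/16 {1: Fraction(1, 2), 2: Fraction(1, 2)}
```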

Independent random variables

Random variables X and Y are independent if for every pair (x, y)

p(x, y) = pX(x) pY(y), i.e., {X = x} and {Y = y} are independent events for all (x, y).

• Two rv’s are independent if their joint PMF is the product of their marginals.
• To show independence, verify the above equality for all (x, y).
• To show dependence, find one pair (x, y) that violates it.

Ex cont’d: The marginal PMF’s of X and Y are:

x        1     2
P(X=x)   0.5   0.5

y        1      2      3      4
P(Y=y)   5/16   3/16   5/16   3/16

Are X and Y independent?
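A direct check (a sketch, using the joint PMF table and the marginals above): compare p(x, y) against the product of the marginals for every pair.

```python
from fractions import Fraction

# Joint PMF and marginals from the tables above
p = {
    (1, 1): Fraction(1, 4),  (1, 2): Fraction(1, 16),
    (1, 3): Fraction(1, 16), (1, 4): Fraction(1, 8),
    (2, 1): Fraction(1, 16), (2, 2): Fraction(1, 8),
    (2, 3): Fraction(1, 4),  (2, 4): Fraction(1, 16),
}
px = {1: Fraction(1, 2), 2: Fraction(1, 2)}
py = {1: Fraction(5, 16), 2: Fraction(3, 16),
      3: Fraction(5, 16), 4: Fraction(3, 16)}

# Independence requires p(x, y) == pX(x) * pY(y) for EVERY pair;
# one counterexample is enough to show dependence.
violations = [(x, y) for (x, y), q in p.items() if q != px[x] * py[y]]

print(p[(1, 1)], px[1] * py[1])   # 1/4 vs 5/32, so X and Y are NOT independent
```

Here the very first cell already fails: p(1, 1) = 1/4 while pX(1)·pY(1) = 5/32.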

Note: The concept of joint distribution and independence can be extended to the case of more than two rv’s. See sections 2.8-2.9 of the text.
