ECE511: Analysis of Random Signals                                   Fall 2016
Lecture 1 - 10/24/2016
Dr. Salim El Rouayheb                         Scribes: Peiwen Tian, Lu Liu

Chapter 2: Random Variables

Example 1. Tossing a fair coin twice: $\Omega = \{HH, HT, TH, TT\}$. Define, for any $\omega \in \Omega$, $X(\omega) =$ number of heads in $\omega$. Then $X(\omega)$ is a random variable.

Definition 1. A random variable (RV) is a function $X: \Omega \to \mathbb{R}$.

Definition 2 (Cumulative distribution function (CDF)).
\[ F_X(x) = P(X \le x). \tag{1} \]

[Figure 1: Cumulative distribution function of X.]

Example 2. The cumulative distribution function of X in Figure 1 is
\[ F_X(x) = \begin{cases} 0 & x < 0, \\ \frac{1}{4} & 0 \le x < 1, \\ \frac{3}{4} & 1 \le x < 2, \\ 1 & x \ge 2. \end{cases} \]

Lemma 1 (Properties of the CDF).
(1) Limits at infinity:
\[ \lim_{x \to -\infty} F_X(x) = 0, \tag{2} \]
\[ \lim_{x \to +\infty} F_X(x) = 1. \tag{3} \]
(2) $F_X(x)$ is non-decreasing:
\[ x_1 \le x_2 \implies F_X(x_1) \le F_X(x_2). \tag{4} \]
(3) $F_X(x)$ is continuous from the right:
\[ \lim_{\epsilon \to 0} F_X(x + \epsilon) = F_X(x), \quad \epsilon > 0. \tag{5} \]
(4) For $a \le b$,
\[ P(a \le X \le b) = P(X \le b) - P(X \le a) + P(X = a) \tag{6} \]
\[ = F_X(b) - F_X(a) + P(X = a). \tag{7} \]
(5)
\[ P(X = a) = \lim_{\varepsilon \to 0} \big[ F_X(a) - F_X(a - \varepsilon) \big], \quad \varepsilon > 0. \tag{8} \]

Definition 3. If a random variable X takes a finite or countable number of values, X is called discrete.

Example 3. A set S is countable if there exists a bijection $f: S \to \mathbb{N}$. The set of real numbers $\mathbb{R}$ is an example of a non-countable set.

Definition 4. X is continuous if $F_X(x)$ is continuous.

Definition 5 (Probability density function (pdf)).
\[ f_X(x) = \frac{dF_X(x)}{dx} \quad (X \text{ continuous}). \tag{9} \]

Example 4. Gaussian random variable (Normal/Gaussian distribution).

[Figure 2: Gaussian pdf $f_X(x)$, centered at $\mu$ with spread $\sigma$.]

By definition,
\[ f_X(x) = \frac{1}{\sqrt{2\pi\sigma^2}}\, e^{-\frac{(x-\mu)^2}{2\sigma^2}}. \]
Therefore,
\[ F_X(a) = P(X \le a) = \int_{-\infty}^{a} f_X(x)\, dx = \frac{1}{\sqrt{2\pi\sigma^2}} \int_{-\infty}^{a} e^{-\frac{(x-\mu)^2}{2\sigma^2}}\, dx. \]
We should always have
\[ \int_{-\infty}^{+\infty} f_X(x)\, dx = 1. \]

Definition 6 (Mean and variance of an RV X). For the continuous case:
\[ E(X) = \mu = \int_{-\infty}^{+\infty} x f_X(x)\, dx, \]
\[ V(X) = \sigma^2 = \int_{-\infty}^{+\infty} (x - \mu)^2 f_X(x)\, dx. \]
For the discrete case:
\[ E(X) = \mu = \sum_{i} x_i P(X = x_i), \]
\[ V(X) = \sigma^2 = \sum_{i} (x_i - \mu)^2 P(X = x_i). \]

Example 5. X is uniformly distributed on (0, 1]:
\[ F_X(x) = \begin{cases} 0 & x < 0, \\ \int_0^x 1\, dt = x & 0 \le x < 1, \\ 1 & x \ge 1. \end{cases} \]
\[ E(X) = \int_0^1 x \cdot 1\, dx = \frac{1}{2}, \]
\[ V(X) = \int_0^1 \Big(x - \frac{1}{2}\Big)^2 \cdot 1\, dx = \frac{1}{12}. \]

Lemma 2 (Probability density functions).
(1) Uniform: X uniform over [a, b]:
\[ f_X(x) = \begin{cases} \frac{1}{b-a} & \text{if } a \le x \le b, \\ 0 & \text{otherwise}. \end{cases} \tag{10} \]

[Figure 3: Uniform distribution. Figure from Wikipedia: https://en.wikipedia.org/wiki/Uniform_distribution_(continuous)]

(2) Gaussian distribution:
\[ f_X(x) = \frac{1}{\sqrt{2\pi\sigma^2}}\, e^{-\frac{(x-\mu)^2}{2\sigma^2}}. \tag{11} \]

[Figure 4: Gaussian pdfs for several values of $\mu$ and $\sigma^2$. Figure from Wikipedia: https://en.wikipedia.org/wiki/Normal_distribution]

(3) Exponential distribution:
\[ f_X(x) = \begin{cases} \frac{1}{\mu}\, e^{-\frac{x}{\mu}} & \text{if } x \ge 0, \\ 0 & \text{if } x < 0. \end{cases} \tag{12} \]

[Figure 5: Exponential distribution. Figure from Wikipedia: https://en.wikipedia.org/wiki/Exponential_distribution]

(4) Rayleigh distribution:
\[ f_X(x) = \frac{x}{\sigma^2}\, e^{-\frac{x^2}{2\sigma^2}}, \quad x \ge 0. \tag{13} \]

[Figure 6: Rayleigh pdfs for several values of $\sigma$. Figure from Wikipedia: https://en.wikipedia.org/wiki/Rayleigh_distribution]

(5) Laplacian distribution:
\[ f_X(x) = \frac{1}{\sqrt{2}\,\sigma}\, e^{-\frac{\sqrt{2}\,|x|}{\sigma}}. \tag{14} \]

[Figure 7: Laplacian pdfs for several parameter values. Figure from Wikipedia: https://en.wikipedia.org/wiki/Laplace_distribution]

1 Examples of Discrete Random Variables

1.1 Bernoulli RV

Flipping a coin with P(H) = p and P(T) = 1 - p: if a head occurs X = 1, and if a tail occurs X = 0, so
\[ P(X = 0) = 1 - p, \quad P(X = 1) = p. \]
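As a quick numerical check (my addition, not part of the original notes), the following minimal Python sketch simulates a Bernoulli RV and compares the empirical frequencies with P(X = 1) = p and P(X = 0) = 1 - p; the value of p and the number of trials are arbitrary illustrative choices.

    import random

    def bernoulli(p):
        # One coin flip: return 1 (head) with probability p, else 0 (tail).
        return 1 if random.random() < p else 0

    p, trials = 0.3, 100_000
    heads = sum(bernoulli(p) for _ in range(trials))
    print("empirical P(X = 1):", heads / trials)        # close to p = 0.3
    print("empirical P(X = 0):", 1 - heads / trials)    # close to 1 - p = 0.7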
The CDF of a Bernoulli RV is shown in Figure 8.

[Figure 8: Cumulative distribution function of a Bernoulli random variable: a jump of 1 - p at x = 0 and a jump of p at x = 1.]

1.2 Binomial distribution

Tossing a coin n times with P(H) = p and P(T) = 1 - p, let X be the number of heads, $x \in \{0, 1, \dots, n\}$. Then
\[ P(X = k) = \binom{n}{k} p^k (1-p)^{n-k}. \]

Remark 1. Let $Y_i \in \{0, 1\}$ denote the outcome of tossing the coin the $i$-th time. Then
\[ X = Y_1 + Y_2 + \dots + Y_n, \]
i.e., a Binomial RV can be thought of as the sum of n independent Bernoulli RVs.

Example 6 (Random graph). Each edge exists with probability p, and X is the number of neighbors of node 1 (Figure 9). Let
\[ Y_i = \begin{cases} 1 & \text{if node 1 is connected to node } i+1, \\ 0 & \text{otherwise}. \end{cases} \]
Then
\[ X = Y_1 + Y_2 + \dots + Y_{n-1}, \]
so X follows the Binomial distribution.

[Figure 9: Random graph on nodes 1, 2, ..., n.]

[Figure 10: Binary Symmetric Channel with probability of error $P_e = p$: each input bit is received correctly with probability 1 - p and flipped with probability p.]

Example 7 (BSC). Suppose we are transmitting a file of length n over a BSC where the probability of error is p and the probability of receiving the correct bit is 1 - p (Figure 10). What is the probability that we have k errors?
\[ P(k \text{ errors}) = \binom{n}{k} p^k (1-p)^{n-k}. \]
Let X represent the number of errors. What is E(X)?
\[ E(X) = \sum_{k=0}^{n} k\, P(X = k) \]
\[ = \sum_{k=0}^{n} k \binom{n}{k} p^k (1-p)^{n-k} \]
\[ = np \sum_{k=1}^{n} \binom{n-1}{k-1} p^{k-1} (1-p)^{n-k} \]
\[ = np \sum_{k=0}^{n-1} \binom{n-1}{k} p^k (1-p)^{(n-1)-k} \]
\[ = np, \]
where the third line uses the identity $k \binom{n}{k} = n \binom{n-1}{k-1}$. The last step follows from the binomial theorem,
\[ (x + y)^n = \sum_{k=0}^{n} \binom{n}{k} x^k y^{n-k}, \]
which gives
\[ (p + 1 - p)^{n-1} = \sum_{k=0}^{n-1} \binom{n-1}{k} p^k (1-p)^{(n-1)-k} = 1. \]

Theorem 1. For any two RVs $X_1$ and $X_2$, if $Y = X_1 + X_2$ then
\[ E(Y) = E(X_1) + E(X_2). \tag{15} \]
It does not matter whether $X_1$ and $X_2$ are independent or not.

1.3 Geometric distribution

You keep tossing a coin until you observe a head, and X is the number of times you have to toss the coin:
\[ X \in \{1, 2, \dots\}, \quad P(X = k) = (1-p)^{k-1} p. \]

Example 8 (Binary erasure channel). Suppose you have a BEC with feedback: when you get an erasure, you ask the sender to retransmit (Figure 11). Suppose you pay one dollar for each transmission, and let X be the amount of money you pay to get one bit through. With erasure probability p = 0.1, the number of transmissions is geometric with success probability 1 - p, so
\[ E(X) = \frac{1}{1-p} = \frac{1}{0.9} \approx \$1.11. \]

[Figure 11: Binary Erasure Channel with probability of erasure $P_e = 0.1$: each input bit is received correctly with probability 1 - p and erased (E) with probability p.]

For the geometric distribution,
\[ P(H) = p = \frac{1}{E(X)}, \]
where E(X) is the average number of coin flips.

Proof.
\[ E(X) = \sum_{k=1}^{\infty} k\, P(X = k) \tag{16} \]
\[ = \sum_{k=1}^{\infty} k (1-p)^{k-1} p \tag{17} \]
\[ = p \sum_{k=1}^{\infty} k (1-p)^{k-1}. \tag{18} \]
Recall that for $|x| < 1$,
\[ \sum_{k=0}^{\infty} x^k = \frac{1}{1-x}. \tag{19} \]
Differentiating both sides with respect to x gives
\[ \sum_{k=1}^{\infty} k x^{k-1} = \frac{1}{(1-x)^2}, \tag{20} \]
and substituting $x = 1 - p$,
\[ \sum_{k=1}^{\infty} k (1-p)^{k-1} = \frac{1}{p^2}. \tag{21} \]
So
\[ E(X) = p \cdot \frac{1}{p^2} \tag{22} \]
\[ = \frac{1}{p}. \tag{23} \]
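A short simulation (my addition, not in the original notes) can confirm that E(X) = 1/p for the geometric distribution; p = 0.25 and the number of trials are arbitrary illustrative choices.

    import random

    def geometric(p):
        # Number of coin tosses until the first head, where P(H) = p.
        k = 1
        while random.random() >= p:  # tail: toss again
            k += 1
        return k

    p, trials = 0.25, 100_000
    mean = sum(geometric(p) for _ in range(trials)) / trials
    print("empirical E(X):", mean)   # close to 1/p = 4.0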
1.4 Poisson distribution

Suppose a server receives λ searches per second on average. The probability that the server receives k searches in a given second is
\[ P(X = k) = \frac{\lambda^k}{k!}\, e^{-\lambda}, \quad k = 0, 1, 2, \dots \tag{24} \]
For the Poisson distribution,
\[ \sum_{k=0}^{\infty} P(X = k) = \sum_{k=0}^{\infty} \frac{\lambda^k}{k!}\, e^{-\lambda} \tag{25} \]
\[ = e^{-\lambda} \sum_{k=0}^{\infty} \frac{\lambda^k}{k!} \tag{26} \]
\[ = e^{-\lambda} e^{\lambda} = 1, \tag{27} \]
and
\[ E(X) = \lambda. \tag{28} \]

Example 9 (Interpretation of the Poisson distribution as an arrival experiment). Suppose customers arrive at a server at an average rate of λ per second, and suppose the server goes down if $X \ge 100$, where X is the number of arrivals in one second. We want to find $P(X = k)$ and
\[ P(\text{server going down}) = P(X \ge 100). \]
We divide the one second into n intervals, each of length $\frac{1}{n}$ second. The probability p of getting a request in a small interval is $\frac{\lambda}{n}$.

[Figure 12: One second divided into n intervals.]

Now we can treat each interval as a Bernoulli trial with success probability p, so
\[ P(X = k) = \binom{n}{k} p^k (1-p)^{n-k} \tag{29} \]
\[ = \binom{n}{k} \Big(\frac{\lambda}{n}\Big)^k \Big(1 - \frac{\lambda}{n}\Big)^{n-k} \tag{30} \]
\[ \approx \frac{n^k}{k!} \Big(\frac{p}{1-p}\Big)^k (1-p)^n \tag{31} \]
\[ \approx \frac{1}{k!}\, (np)^k e^{-np} \tag{32} \]
\[ = \frac{1}{k!}\, \lambda^k e^{-\lambda} \tag{33} \]
\[ = \frac{\lambda^k}{k!}\, e^{-\lambda}. \tag{34} \]
We get (31) because
\[ \binom{n}{k} = \frac{1}{k!}\, n(n-1) \cdots (n-k+1) \tag{35} \]
\[ \approx \frac{n^k}{k!} \quad (k \text{ constant and } n \to \infty), \tag{36} \]
and (32) uses $(1-p)^n \approx e^{-np}$ and $(1-p)^k \approx 1$ for small p. This means we can approximate Binomial(n, p) by a Poisson distribution with λ = np (if n is very large).
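To make the approximation concrete, here is a short Python sketch (my addition, not from the original notes) comparing the Binomial(n, p) pmf against the Poisson pmf with λ = np; n = 1000 and λ = 5 are arbitrary illustrative choices.

    from math import comb, exp, factorial

    n, lam = 1000, 5.0
    p = lam / n  # probability of a request in each small interval

    for k in range(10):
        binom = comb(n, k) * p**k * (1 - p)**(n - k)
        poisson = lam**k / factorial(k) * exp(-lam)
        print(k, round(binom, 6), round(poisson, 6))  # the two columns nearly agree

For n this large the two pmfs agree to several decimal places, which is exactly the limit argument behind (29)-(36).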