EE 178/278A Lecture Notes 7: Random Processes

• Definition
• IID Processes
• Bernoulli Process
  ◦ Binomial Counting Process
  ◦ Interarrival Time Process
• Markov Processes
• Markov Chains
  ◦ Classification of States
  ◦ Steady State Probabilities

Corresponding pages from B&T: 271–281, 313–340.

Random Processes

• A random process (also called a stochastic process) {X(t) : t ∈ T} is an infinite collection of random variables, one for each value of the time index t ∈ T (or, in some cases, distance)
• Random processes are used to model random experiments that evolve in time:
  ◦ Received sequence/waveform at the output of a communication channel
  ◦ Packet arrival times at a node in a communication network
  ◦ Thermal noise in a resistor
  ◦ Scores of an NBA team in consecutive games
  ◦ Daily price of a stock
  ◦ Winnings or losses of a gambler
  ◦ Earth movement around a fault line

Questions Involving Random Processes

• Dependencies of the random variables of the process:
  ◦ How do future received values depend on past received values?
  ◦ How do future prices of a stock depend on its past values?
  ◦ How well do past earth movements predict an earthquake?
• Long-term averages:
  ◦ What is the proportion of time a queue is empty?
  ◦ What is the average noise power generated by a resistor?
• Extreme or boundary events:
  ◦ What is the probability that a link in a communication network is congested?
  ◦ What is the probability that the maximum power in a power distribution line is exceeded?
  ◦ What is the probability that a gambler will lose all his capital?

Discrete vs. Continuous-Time Processes

• The random process {X(t) : t ∈ T} is said to be discrete-time if the index set T is countably infinite, e.g., {1, 2, ...} or {..., −2, −1, 0, +1, +2, ...}:
  ◦ The process is simply an infinite sequence of r.v.s X_1, X_2, ...
  ◦ An outcome of the process is simply a sequence of numbers
• The random process {X(t) : t ∈ T} is said to be continuous-time if the index set T is a continuous set, e.g., (0, ∞) or (−∞, ∞)
  ◦ The outcomes are random waveforms or random occurrences in continuous time
• We only discuss discrete-time random processes:
  ◦ IID processes
  ◦ Bernoulli process and associated processes
  ◦ Markov processes
  ◦ Markov chains

IID Processes

• A process X_1, X_2, ... is said to be independent and identically distributed (IID, or i.i.d.) if it consists of an infinite sequence of independent and identically distributed random variables
• Two important examples (both simulated in the short sketch following this list):
  ◦ Bernoulli process: X_1, X_2, ... are i.i.d. Bern(p) r.v.s, 0 < p < 1. Model for random phenomena with binary outcomes, such as:
    ∗ Sequence of coin flips
    ∗ Noise sequence in a binary symmetric channel
    ∗ The occurrence of random events such as packets (1 corresponding to an event and 0 to a non-event) in discrete time
    ∗ Binary expansion of a random number between 0 and 1
  ◦ Discrete-time white Gaussian noise (WGN) process: X_1, X_2, ... are i.i.d. N(0, N) r.v.s. Model for:
    ∗ Receiver noise in a communication system
    ∗ Fluctuations in a stock price
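As a quick illustration (my own sketch, not part of the notes), both example IID processes can be simulated directly; the values p = 0.3 and N = 1 below are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(seed=0)
num_samples = 50

# Bernoulli process: an i.i.d. sequence of Bern(p) random variables
p = 0.3                                            # arbitrary choice of p
bernoulli_path = rng.binomial(1, p, size=num_samples)

# Discrete-time white Gaussian noise: i.i.d. N(0, N) random variables
N = 1.0                                            # noise power (variance), arbitrary
wgn_path = rng.normal(loc=0.0, scale=np.sqrt(N), size=num_samples)

print("Bernoulli sample path:", bernoulli_path[:20])
print("WGN sample path:      ", np.round(wgn_path[:5], 3))
```

Each run of the script produces one outcome (sample path) of each process; rerunning with a different seed gives a different sample path, while the distribution of each X_n stays the same.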
• Useful properties of an IID process:
  ◦ Independence: Since the r.v.s in an IID process are independent, any two events defined on sets of random variables with non-overlapping indices are independent
  ◦ Memorylessness: The independence property implies that the IID process is memoryless, in the sense that for any time n the future X_{n+1}, X_{n+2}, ... is independent of the past X_1, X_2, ..., X_n
  ◦ Fresh start: Starting from any time n, the random process X_n, X_{n+1}, ... behaves identically to the process X_1, X_2, ..., i.e., it is also an IID process with the same distribution. This property follows from the fact that the r.v.s are identically distributed (in addition to being independent)

The Bernoulli Process

• The Bernoulli process is an infinite sequence X_1, X_2, ... of i.i.d. Bern(p) r.v.s
• The outcome from a Bernoulli process is an infinite sequence of 0s and 1s
• A Bernoulli process is often used to model occurrences of random events; X_n = 1 if an event occurs at time n, and 0 otherwise
• Three associated random processes of interest:
  ◦ Binomial counting process: the number of events in the interval [1, n]
  ◦ Arrival time process: the times of event arrivals
  ◦ Interarrival time process: the times between consecutive event arrivals
• We discuss these processes and their relationships

Binomial Counting Process

• Consider a Bernoulli process X_1, X_2, ... with parameter p
• We are often interested in the number of events occurring in some time interval
• For the time interval [1, n], i.e., i = 1, 2, ..., n, the number of occurrences is
  $$W_n = \sum_{i=1}^{n} X_i \sim \mathrm{B}(n, p)$$
• The sequence of r.v.s W_1, W_2, ... is referred to as a Binomial counting process
• The Bernoulli process can be obtained from the Binomial counting process by differencing (see the sketch at the end of this section):
  $$X_n = W_n - W_{n-1}, \quad n = 1, 2, \ldots, \qquad \text{where } W_0 = 0$$
• Outcomes of a Binomial counting process are integer-valued staircase functions

[Figure: a sample Bernoulli sequence X_n (1 0 0 1 1 0 1 1 0 1 1 0 0 0 0 1 0 1 0) and the corresponding staircase sample path of the counting process W_n]

• Note that the Binomial counting process is not IID
• By the fresh-start property of the Bernoulli process, for any n ≥ 1 and k ≥ 1, the distribution of the number of events in the interval [k + 1, k + n] is identical to that in [1, n], i.e., W_n and (W_{k+n} − W_k) are identically distributed
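A minimal numerical sketch (mine, not from the notes; p and n are arbitrary) of the relationship between a Bernoulli sample path, its running sum W_n, and the Binomial(n, p) distribution:

```python
import numpy as np

rng = np.random.default_rng(seed=1)
p, n = 0.3, 20                                  # arbitrary parameters

# One sample path of the Bernoulli process and its counting process
x = rng.binomial(1, p, size=n)                  # X_1, ..., X_n
w = np.cumsum(x)                                # W_1, ..., W_n (a staircase function)

# Recover the Bernoulli process by differencing, with W_0 = 0
x_recovered = np.diff(np.concatenate(([0], w)))
assert np.array_equal(x, x_recovered)

# W_n ~ B(n, p): the empirical mean of W_n over many paths should be close to n*p
many_wn = rng.binomial(1, p, size=(100_000, n)).sum(axis=1)
print(many_wn.mean(), n * p)                    # both close to 6.0
```

The assert verifies the differencing identity on the simulated path; the last two lines check the Binomial mean n·p empirically.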
• Example: Packet arrivals at a node in a communication network can be modeled by a Bernoulli process with p = 0.09.
  1. What is the probability that 3 packets arrive in the interval [1, 20], 6 packets arrive in [1, 40], and 12 packets arrive in [1, 80]?
  2. The input queue at the node has a capacity of 10^3 packets. A packet is dropped if the queue is full. What is the probability that one or more packets are dropped in a time interval of length n = 10^4?

  Solution: Let W_n be the number of packets arriving in the interval [1, n].

  1. We want to find the probability
     $$P\{W_{20} = 3,\, W_{40} = 6,\, W_{80} = 12\},$$
     which is equal to
     $$P\{W_{20} = 3,\; W_{40} - W_{20} = 3,\; W_{80} - W_{40} = 6\}.$$
     By the independence property of the Bernoulli process this is equal to
     $$P\{W_{20} = 3\}\, P\{W_{40} - W_{20} = 3\}\, P\{W_{80} - W_{40} = 6\}.$$
     Now, by the fresh-start property of the Bernoulli process,
     $$P\{W_{40} - W_{20} = 3\} = P\{W_{20} = 3\} \quad\text{and}\quad P\{W_{80} - W_{40} = 6\} = P\{W_{40} = 6\}.$$
     Thus
     $$P\{W_{20} = 3,\, W_{40} = 6,\, W_{80} = 12\} = \bigl(P\{W_{20} = 3\}\bigr)^2 \times P\{W_{40} = 6\}.$$
     Now, using the Poisson approximation of the Binomial, we have
     $$P\{W_{20} = 3\} = \binom{20}{3}(0.09)^3 (0.91)^{17} \approx \frac{(1.8)^3}{3!}\, e^{-1.8} = 0.1607,$$
     $$P\{W_{40} = 6\} = \binom{40}{6}(0.09)^6 (0.91)^{34} \approx \frac{(3.6)^6}{6!}\, e^{-3.6} = 0.0826.$$
     Thus
     $$P\{W_{20} = 3,\, W_{40} = 6,\, W_{80} = 12\} \approx (0.1607)^2 \times 0.0826 = 0.0021.$$

  2. The probability that one or more packets are dropped in a time interval of length n = 10^4 is
     $$P\{W_{10^4} > 10^3\} = \sum_{n=1001}^{10^4} \binom{10^4}{n} (0.09)^n (0.91)^{10^4 - n}.$$
     This is difficult to compute directly, but we can use the CLT! Since W_{10^4} = X_1 + ··· + X_{10^4} with E(X) = 0.09 and σ_X^2 = 0.09 × 0.91 = 0.0819, we have
     $$P\Bigl\{\sum_{i=1}^{10^4} X_i > 10^3\Bigr\}
       = P\Biggl\{\frac{1}{100}\sum_{i=1}^{10^4}\frac{X_i - 0.09}{\sqrt{0.0819}} > \frac{10^3 - 900}{100\sqrt{0.0819}}\Biggr\}
       = P\Biggl\{\frac{1}{100}\sum_{i=1}^{10^4}\frac{X_i - 0.09}{0.286} > 3.5\Biggr\}
       \approx Q(3.5) = 2 \times 10^{-4}.$$

Arrival and Interarrival Time Processes

• Again consider a Bernoulli process X_1, X_2, ... as a model for random arrivals of events
• Let Y_k be the time index of the kth arrival (the kth arrival time), i.e., the smallest n such that W_n = k
• Define the interarrival time process associated with the Bernoulli process by
  $$T_1 = Y_1 \quad\text{and}\quad T_k = Y_k - Y_{k-1}, \quad k = 2, 3, \ldots$$
  Thus the kth arrival time is given by Y_k = T_1 + T_2 + ··· + T_k

[Figure: a sample Bernoulli sequence (0 0 1 0 0 0 1 0 0 1 0 1 0 0 0 0 1) with the arrival times Y_1 = T_1, Y_2, Y_3, ... and the interarrival times T_1, ..., T_5 marked on the time axis]

• Let's find the pmf of T_k (a small Monte Carlo check follows this section):
  ◦ First, the pmf of T_1 is the same as the pmf of the number of coin flips until a head (i.e., a 1) appears. We know that this is Geom(p). Thus T_1 ∼ Geom(p)
  ◦ Now, having an event at time T_1, the future is a fresh-starting Bernoulli process. Thus the number of trials T_2 until the next event has the same pmf as T_1
  ◦ Moreover, T_1 and T_2 are independent, since the trials from 1 to T_1 are independent of the trials from T_1 + 1 onward, and T_2 is determined exclusively by what happens in these future trials
  ◦ Continuing similarly, we conclude that T_1, T_2, ... are i.i.d. Geom(p) r.v.s
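To check the Geom(p) conclusion empirically, here is a small Monte Carlo sketch of mine (p and the other constants are arbitrary choices) comparing the empirical pmf of T_1 with p(1 − p)^{t−1}:

```python
import numpy as np

rng = np.random.default_rng(seed=2)
p, num_paths, horizon = 0.25, 50_000, 200       # arbitrary; horizon is long enough that
                                                # every path almost surely contains an arrival

# Simulate many Bernoulli sample paths and record the first arrival time T_1 on each
paths = rng.binomial(1, p, size=(num_paths, horizon))
t1 = paths.argmax(axis=1) + 1                   # index of the first 1, as a 1-based time

# Compare the empirical pmf of T_1 with the Geom(p) pmf  p * (1 - p)**(t - 1)
for t in range(1, 6):
    print(t, round(np.mean(t1 == t), 4), round(p * (1 - p) ** (t - 1), 4))
```

By the fresh-start argument above, repeating the same comparison for T_2, T_3, ... (the gaps between later arrivals) gives matching numbers as well.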

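Finally, returning to the packet-arrival example above, the exact binomial probabilities and the two approximations used there (Poisson and CLT) can be checked numerically; this is a sketch of mine using scipy, not part of the notes:

```python
from scipy.stats import binom, norm, poisson

p = 0.09

# Part 1: exact binomial probabilities vs. the Poisson approximation
print(binom.pmf(3, 20, p), poisson.pmf(3, 20 * p))     # about 0.167 vs. 0.161
print(binom.pmf(6, 40, p), poisson.pmf(6, 40 * p))     # about 0.083 vs. 0.083
print(binom.pmf(3, 20, p) ** 2 * binom.pmf(6, 40, p))  # about 0.0023 (notes: 0.0021)

# Part 2: P{W_10000 > 1000}, exact binomial tail vs. the CLT (Q-function) estimate
exact = binom.sf(1000, 10_000, p)                      # P{W > 1000}
clt = norm.sf((1000 - 900) / (10_000 * p * (1 - p)) ** 0.5)   # roughly Q(3.5), about 2e-4
print(exact, clt)                                      # both on the order of 1e-4
```

The Part 1 numbers confirm that the Poisson approximation used in the notes is accurate to within a few percent at these parameter values.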