Lecture Notes 6: Random Processes

• Definition and Simple Examples
• Important Classes of Random Processes
  ◦ IID
  ◦ Random Walk Process
  ◦ Markov Processes
  ◦ Independent Increment Processes
  ◦ Counting Processes and Poisson Process
• Mean and Autocorrelation Function
• Gaussian Random Processes
  ◦ Gauss–Markov Process

EE 278: Random Processes

Random Process

• A random process (RP) (or stochastic process) is an infinite indexed collection of random variables {X(t) : t ∈ T}, defined over a common probability space
• The index parameter t is typically time, but can also be a spatial dimension
• Random processes are used to model random experiments that evolve in time:
  ◦ Received sequence/waveform at the output of a communication channel
  ◦ Packet arrival times at a node in a communication network
  ◦ Thermal noise in a resistor
  ◦ Scores of an NBA team in consecutive games
  ◦ Daily price of a stock
  ◦ Winnings or losses of a gambler

Questions Involving Random Processes

• Dependencies of the random variables of the process
  ◦ How do future received values depend on past received values?
  ◦ How do future prices of a stock depend on its past values?
• Long term averages
  ◦ What is the proportion of time a queue is empty?
  ◦ What is the average noise power at the output of a circuit?
• Extreme or boundary events
  ◦ What is the probability that a link in a communication network is congested?
  ◦ What is the probability that the maximum power in a power distribution line is exceeded?
  ◦ What is the probability that a gambler will lose all his capital?
• Estimation/detection of a signal from a noisy waveform

Two Ways to View a Random Process

• A random process can be viewed as a function X(t, ω) of two variables: time t ∈ T and the outcome of the underlying random experiment ω ∈ Ω
  ◦ For fixed t, X(t, ω) is a random variable over Ω
  ◦ For fixed ω, X(t, ω) is a deterministic function of t, called a sample function

[Figure: sample functions X(t, ω1), X(t, ω2), X(t, ω3) plotted versus t; the vertical slices at times t1 and t2 are the random variables X(t1, ω) and X(t2, ω)]

Discrete Time Random Process

• A random process is said to be discrete time if T is a countably infinite set, e.g.,
  ◦ N = {0, 1, 2, ...}
  ◦ Z = {..., −2, −1, 0, +1, +2, ...}
• In this case the process is denoted by Xn, for n ∈ N, a countably infinite set, and is simply an infinite sequence of random variables
• A sample function for a discrete time process is called a sample sequence or sample path
• A discrete-time process can comprise discrete, continuous, or mixed r.v.s

Example

• Let Z ∼ U[0, 1], and define the discrete time process Xn = Z^n for n ≥ 1
• Sample paths:

[Figure: sample paths xn versus n for Z = 1/2 (values 1/2, 1/4, 1/8, 1/16, ...), for Z = 1/4 (values 1/4, 1/16, 1/64, ...), and for Z = 0 (identically zero)]

• First-order pdf of the process: For each n, Xn = Z^n is a r.v.; the sequence of pdfs of Xn is called the first-order pdf of the process

  Since Xn is a differentiable, monotone function of the continuous r.v. Z, we can find its pdf as

    f_Xn(x) = 1/(n x^((n−1)/n)) = (1/n) x^(1/n − 1), 0 < x ≤ 1

Continuous Time Random Process

• A random process is continuous time if T is a continuous set
• Example: Sinusoidal Signal with Random Phase

    X(t) = α cos(ωt + Θ), t ≥ 0,

  where Θ ∼ U[0, 2π] and α and ω are constants
• Sample functions:

[Figure: sample functions x(t) for Θ = 0, Θ = π/4, and Θ = π/2; each is a cosine of amplitude α, shifted in phase, with t-axis ticks at π/2ω, π/ω, 3π/2ω, 2π/ω]

• The first-order pdf of the process is the pdf of X(t) = α cos(ωt + Θ).
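Before the analytical derivation, this first-order pdf can be estimated by simulation. The sketch below (the parameter values α = 1, ω = 2π, and t = 0.3 are illustrative assumptions, not from the notes) samples X(t) at a fixed t and compares the empirical CDF with F(x) = 1 − arccos(x/α)/π, which is the CDF corresponding to the arcsine-type pdf derived next:

```python
import numpy as np

# Illustrative parameters (assumed for this sketch, not from the notes)
alpha, omega, t = 1.0, 2 * np.pi, 0.3

rng = np.random.default_rng(0)
theta = rng.uniform(0, 2 * np.pi, 200_000)   # Theta ~ U[0, 2*pi]
x = alpha * np.cos(omega * t + theta)        # samples of X(t) for this fixed t

def ecdf(u):
    """Empirical CDF of the samples at the point u."""
    return np.mean(x <= u)

def F(u):
    """Closed-form CDF of X(t): F(u) = 1 - arccos(u/alpha)/pi."""
    return 1 - np.arccos(u / alpha) / np.pi

print(ecdf(0.0), F(0.0))   # both close to 0.5
```

Because ωt + Θ reduced mod 2π is still uniform on [0, 2π], the empirical CDF matches F for any choice of t, consistent with the stationarity remark that follows.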
In an earlier homework exercise, we found it to be

    f_X(t)(x) = 1/(απ √(1 − (x/α)^2)), −α < x < +α

[Figure: graph of the pdf, U-shaped over (−α, +α) with minimum value 1/(απ) at x = 0 and vertical asymptotes at x = ±α]

Note that the pdf is independent of t (the process is stationary)

Specifying a Random Process

• In the above examples we specified the random process by describing the set of sample functions (sequences, paths) and explicitly providing a probability measure over the set of events (subsets of sample functions)
• This way of specifying a random process has very limited applicability, and is suited only for very simple processes
• A random process is typically specified (directly or indirectly) by specifying all its n-th order cdfs (pdfs, pmfs), i.e., the joint cdf (pdf, pmf) of the samples X(t1), X(t2), ..., X(tn) for every order n and for every set of n points t1, t2, ..., tn ∈ T
• The following examples of important random processes will be specified (directly or indirectly) in this manner

Important Classes of Random Processes

• IID process: {Xn : n ∈ N} is an IID process if the r.v.s Xn are i.i.d. Examples:
  ◦ Bernoulli process: X1, X2, ..., Xn, ... i.i.d. ∼ Bern(p)
  ◦ Discrete-time white Gaussian noise (WGN): X1, ..., Xn, ... i.i.d. ∼ N(0, N)
• Here we specified the n-th order pmfs (pdfs) of the processes by specifying the first-order pmf (pdf) and stating that the r.v.s are independent
• It would be quite difficult to provide the specification for an IID process by specifying the probability measure over the subsets of the sample space

The Random Walk Process

• Let Z1, Z2, ..., Zn, ... be i.i.d., where

    Zn = +1 with probability 1/2
         −1 with probability 1/2

• The random walk process is defined by

    X0 = 0 and Xn = Z1 + Z2 + ··· + Zn, n ≥ 1

• Again this process is specified by (indirectly) specifying all n-th order pmfs
• Sample path: The sample path for a random walk is a sequence of integers, e.g.,

    0, +1, 0, −1, −2, −3, −4, ...
  or

    0, +1, +2, +3, +4, +3, +4, +3, +4, ...

• Example:

[Figure: sample path xn versus n for n = 1, ..., 10, generated by the steps zn: +1, +1, +1, −1, +1, −1, −1, −1, −1, −1]

• First-order pmf: The first-order pmf is P{Xn = k} as a function of n. Note that

    k ∈ {−n, −(n − 2), ..., −2, 0, +2, ..., +(n − 2), +n} for n even
    k ∈ {−n, −(n − 2), ..., −1, +1, +3, ..., +(n − 2), +n} for n odd

  Hence, P{Xn = k} = 0 if n + k is odd, or if k < −n or k > n

  Now for n + k even, let a be the number of +1's in n steps; then the number of −1's is n − a, and we find that

    k = a − (n − a) = 2a − n  ⇒  a = (n + k)/2

  Thus

    P{Xn = k} = P{(n + k)/2 heads in n independent coin tosses}
              = C(n, (n + k)/2) · 2^(−n) for n + k even and −n ≤ k ≤ n

[Figure: plot of P{Xn = k} versus k for n even; the pmf is nonzero only at k = ..., −6, −4, −2, 0, +2, +4, +6, ...]

Markov Processes

• A discrete-time random process Xn is said to be a Markov process if the process future and past are conditionally independent given its present value
• Mathematically this can be rephrased in several ways. For example, if the r.v.s {Xn : n ≥ 1} are discrete, then the process is Markov iff

    P{Xn+1 = xn+1 | Xn = xn, ..., X1 = x1} = P{Xn+1 = xn+1 | Xn = xn} for every n

• IID processes are Markov
• The random walk process is Markov.
  To see this consider

    P{Xn+1 = xn+1 | X1 = x1, ..., Xn = xn}
      = P{Xn + Zn+1 = xn+1 | X1 = x1, ..., Xn = xn}
      = P{Xn + Zn+1 = xn+1 | Xn = xn}   (since Zn+1 is independent of X1, ..., Xn)
      = P{Xn+1 = xn+1 | Xn = xn}

Independent Increment Processes

• A discrete-time random process {Xn : n ≥ 0} is said to be independent increment if the increment random variables

    Xn1, Xn2 − Xn1, ..., Xnk − Xnk−1

  are independent for all sequences of indices such that n1 < n2 < ··· < nk
• Example: The random walk is an independent increment process because

    Xn1 = Z1 + ··· + Zn1, Xn2 − Xn1 = Zn1+1 + ··· + Zn2, ..., Xnk − Xnk−1 = Znk−1+1 + ··· + Znk

  are independent because they are functions of disjoint sets of independent random variables
• The independent increment property makes it easy to find the n-th order pmfs of a random walk process from knowledge only of the first-order pmf
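As a sanity check on the first-order pmf P{Xn = k} = C(n, (n+k)/2) · 2^(−n), the closed form can be compared against brute-force enumeration of all 2^n equally likely step sequences for a small n (a sketch; the helper name `pmf` is mine, not from the notes):

```python
from itertools import product
from math import comb

def pmf(n, k):
    """First-order pmf of the random walk: P{X_n = k}."""
    if (n + k) % 2 != 0 or abs(k) > n:
        return 0.0
    return comb(n, (n + k) // 2) / 2**n

# Brute force: enumerate all 2^n step sequences, each with probability 2^-n
n = 8
counts = {}
for steps in product((+1, -1), repeat=n):
    s = sum(steps)
    counts[s] = counts.get(s, 0) + 1

for k in range(-n, n + 1):
    assert abs(pmf(n, k) - counts.get(k, 0) / 2**n) < 1e-12
```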
• Example: Find P{X5 = 3, X10 = 6, X20 = 10} for the random walk process {Xn}

  Solution: We use the independent increment property as follows

    P{X5 = 3, X10 = 6, X20 = 10}
      = P{X5 = 3, X10 − X5 = 3, X20 − X10 = 4}
      = P{X5 = 3} P{X5 = 3} P{X10 = 4}
      = C(5, 4) 2^(−5) · C(5, 4) 2^(−5) · C(10, 7) 2^(−10)
      = 3000 · 2^(−20)

• In general if a process is independent increment, then it is also Markov.
  To see this let Xn be an independent increment process and define the increment vector

    ∆X^n = [X1, X2 − X1, ..., Xn − Xn−1]^T

  Then

    P{Xn+1 = xn+1 | X1 = x1, ..., Xn = xn}
      = P{(Xn+1 − Xn) + Xn = xn+1 | ∆X^n = ∆x^n, Xn = xn}
      = P{Xn+1 = xn+1 | Xn = xn},

  since the increment Xn+1 − Xn is independent of the previous increments
• The converse is not necessarily true, e.g., IID processes are Markov but not independent increment
• The independent increment property can be extended to continuous-time processes: A process X(t), t ≥ 0, is said to be independent increment if

    X(t1), X(t2) − X(t1), ..., X(tk) − X(tk−1)

  are independent for every 0 ≤ t1 < t2 < ... < tk and every k ≥ 2
• Markovity can also be extended to continuous-time processes: A process X(t) is said to be Markov if X(tk+1) and (X(t1), ..., X(tk−1)) are conditionally independent given X(tk) for every 0 ≤ t1 < t2 < ... < tk < tk+1 and every k ≥ 2

Counting Processes and Poisson Process

• A continuous-time random process N(t), t ≥ 0, is said to be a counting process if N(0) = 0 and N(t) = n, n ∈ {0, 1, 2, ...}, is the number of events from 0 to t (hence N(t2) ≥ N(t1) for every t2 > t1 ≥ 0)
• Sample path of a counting process:

[Figure: staircase sample path n(t) rising from 0 to 5, with unit jumps at the event times t1, t2, t3, t4, t5]

t1, t2, ...
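The independent-increment calculation P{X5 = 3, X10 = 6, X20 = 10} = 3000 · 2^(−20) worked out earlier can be reproduced exactly in a few lines (a sketch reusing the binomial formula for the first-order pmf; the helper name `pmf` is mine):

```python
from math import comb

def pmf(n, k):
    """P{X_n = k} for the simple random walk."""
    if (n + k) % 2 != 0 or abs(k) > n:
        return 0.0
    return comb(n, (n + k) // 2) / 2**n

# Increments over disjoint windows are independent and distributed like a
# fresh walk of the same length, so the joint probability factors:
# P{X5=3} * P{X10-X5=3} * P{X20-X10=4} = P{X5=3} * P{X5=3} * P{X10=4}
p = pmf(5, 3) * pmf(5, 3) * pmf(10, 4)
print(p, 3000 / 2**20)   # both equal 0.00286102294921875
```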
