MATH 171 LECTURE NOTE 3: POISSON PROCESSES

HANBAEK LYU

1. EXPONENTIAL AND POISSON RANDOM VARIABLES

In this section, we study two important properties of exponential and Poisson random variables, which will be crucial when we study Poisson processes, beginning in the following section.

Example 1.1 (Exponential RV). $X$ is an exponential RV with rate $\lambda$ (denoted by $X \sim \mathrm{Exp}(\lambda)$) if it has PDF
\[ f_X(x) = \lambda e^{-\lambda x}\, \mathbf{1}(x \ge 0). \tag{1} \]
Integrating the PDF gives its CDF
\[ P(X \le x) = (1 - e^{-\lambda x})\, \mathbf{1}(x \ge 0). \tag{2} \]
The following complementary CDF for exponential RVs will be useful:
\[ P(X \ge x) = e^{-\lambda x} \quad \text{for } x \ge 0. \tag{3} \]

Exercise 1.2. Let $X \sim \mathrm{Exp}(\lambda)$. Show that $E(X) = 1/\lambda$ and $\mathrm{Var}(X) = 1/\lambda^2$.

The minimum of independent exponential RVs is again an exponential RV.

Exercise 1.3. Let $X_1 \sim \mathrm{Exp}(\lambda_1)$ and $X_2 \sim \mathrm{Exp}(\lambda_2)$, and suppose they are independent. Define $Y = \min(X_1, X_2)$. Show that $Y \sim \mathrm{Exp}(\lambda_1 + \lambda_2)$. (Hint: compute $P(Y \ge y)$.)

The following example is sometimes called 'competing exponentials'.

Example 1.4. Let $X_1 \sim \mathrm{Exp}(\lambda_1)$ and $X_2 \sim \mathrm{Exp}(\lambda_2)$ be independent exponential RVs. We will show that
\[ P(X_1 < X_2) = \frac{\lambda_1}{\lambda_1 + \lambda_2} \tag{4} \]
using iterated expectation. Using iterated expectation for probability,
\[ P(X_1 < X_2) = \int_0^\infty P(X_1 < X_2 \mid X_1 = x_1)\, \lambda_1 e^{-\lambda_1 x_1}\, dx_1 \tag{5} \]
\[ = \int_0^\infty P(X_2 > x_1)\, \lambda_1 e^{-\lambda_1 x_1}\, dx_1 \tag{6} \]
\[ = \lambda_1 \int_0^\infty e^{-\lambda_2 x_1} e^{-\lambda_1 x_1}\, dx_1 \tag{7} \]
\[ = \lambda_1 \int_0^\infty e^{-(\lambda_1 + \lambda_2) x_1}\, dx_1 = \frac{\lambda_1}{\lambda_1 + \lambda_2}. \tag{8} \]

When we add up independent continuous RVs, we can compute the PDF of the resulting RV by using the convolution formula or moment generating functions. In the following exercise, we compute the PDF of the sum of independent exponentials.

Exercise 1.5 (Sum of i.i.d. Exp is Erlang). Let $X_1, X_2, \dots, X_n \sim \mathrm{Exp}(\lambda)$ be independent exponential RVs.
(i) Show that $f_{X_1 + X_2}(z) = \lambda^2 z e^{-\lambda z}\, \mathbf{1}(z \ge 0)$.
(ii) Show that $f_{X_1 + X_2 + X_3}(z) = 2^{-1} \lambda^3 z^2 e^{-\lambda z}\, \mathbf{1}(z \ge 0)$.
(iii) Let $S_n = X_1 + X_2 + \cdots + X_n$. Use induction to show that $S_n \sim \mathrm{Erlang}(n, \lambda)$, that is,
\[ f_{S_n}(z) = \frac{\lambda^n z^{n-1} e^{-\lambda z}}{(n-1)!}. \tag{9} \]

Exponential RVs will be the building block of Poisson processes, because of their 'memoryless property'.

Exercise 1.6 (Memoryless property of exponential RV). A continuous positive RV $X$ is said to have the memoryless property if
\[ P(X \ge t_1 + t_2) = P(X \ge t_1)\, P(X \ge t_2) \quad \forall\, t_1, t_2 \ge 0. \tag{10} \]
(i) Show that (10) is equivalent to
\[ P(X \ge t_1 + t_2 \mid X \ge t_2) = P(X \ge t_1) \quad \forall\, t_1, t_2 \ge 0. \tag{11} \]
(ii) Show that exponential RVs have the memoryless property.
(iii) Suppose $X$ is continuous, positive, and memoryless. Let $g(t) = \log P(X \ge t)$. Show that $g$ is continuous at 0 and
\[ g(x + y) = g(x) + g(y) \quad \text{for all } x, y \ge 0. \tag{12} \]
Using Exercise 1.7, conclude that $X$ must be an exponential RV.

Exercise 1.7. Let $g : \mathbb{R}_{\ge 0} \to \mathbb{R}$ be a function with the property that $g(x + y) = g(x) + g(y)$ for all $x, y \ge 0$. Further assume that $g$ is continuous at 0. In this exercise, we will show that $g(x) = cx$ for some constant $c$.
(i) Show that $g(0) = g(0 + 0) = g(0) + g(0)$. Deduce that $g(0) = 0$.
(ii) Show that for all integers $n \ge 1$, $g(n) = n\, g(1)$.
(iii) Show that for all integers $n, m \ge 1$,
\[ n\, g(1) = g(n \cdot 1) = g(m(n/m)) = m\, g(n/m). \tag{13} \]
Deduce that for all nonnegative rational numbers $r$, we have $g(r) = r\, g(1)$.
(iv) Show that $g$ is continuous.
(v) Let $x$ be a nonnegative real number. Let $(r_k)$ be a sequence of rational numbers such that $r_k \to x$ as $k \to \infty$. By using (iii) and (iv), show that
\[ g(x) = g\Big(\lim_{k \to \infty} r_k\Big) = \lim_{k \to \infty} g(r_k) = g(1) \lim_{k \to \infty} r_k = x\, g(1). \tag{14} \]
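Both the competing-exponentials identity (4) and the memoryless property (10)-(11) are easy to check numerically. Here is a minimal Monte Carlo sketch in Python/NumPy; the rates $\lambda_1 = 2$, $\lambda_2 = 3$, the times $t_1, t_2$, and the sample size are arbitrary illustrative choices, not values from the notes.

    import numpy as np

    rng = np.random.default_rng(seed=0)
    n = 1_000_000
    lam1, lam2 = 2.0, 3.0

    # Competing exponentials: P(X1 < X2) = lam1 / (lam1 + lam2) = 0.4 by (4).
    # NumPy parametrizes the exponential by its mean, so scale = 1/lambda.
    x1 = rng.exponential(scale=1 / lam1, size=n)
    x2 = rng.exponential(scale=1 / lam2, size=n)
    print(np.mean(x1 < x2))                 # ~ 0.4

    # Memoryless property (11): P(X >= t1 + t2 | X >= t2) = P(X >= t1).
    t1, t2 = 0.3, 0.5
    x = rng.exponential(scale=1 / lam1, size=n)
    lhs = np.mean(x[x >= t2] >= t1 + t2)    # conditional relative frequency
    print(lhs, np.exp(-lam1 * t1))          # both ~ e^{-lam1 * t1} ~ 0.549

Both printed comparisons should agree up to Monte Carlo error of order $n^{-1/2}$.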
Lastly, we introduce Poisson RVs and record some of their nice properties.

Example 1.8 (Poisson RV). A RV $X$ is a Poisson RV with rate $\lambda > 0$ if
\[ P(X = k) = \frac{\lambda^k e^{-\lambda}}{k!} \tag{15} \]
for all nonnegative integers $k \ge 0$. We write $X \sim \mathrm{Poisson}(\lambda)$.

Exercise 1.9. Let $X \sim \mathrm{Poisson}(\lambda)$. Show that $E(X) = \mathrm{Var}(X) = \lambda$.

Exercise 1.10 (Sum of ind. Poisson RVs is Poisson). Let $X \sim \mathrm{Poisson}(\lambda_1)$ and $Y \sim \mathrm{Poisson}(\lambda_2)$ be independent Poisson RVs. Show that $X + Y \sim \mathrm{Poisson}(\lambda_1 + \lambda_2)$.

2. POISSON PROCESSES AS AN ARRIVAL PROCESS

An arrival process is a sequence of strictly increasing RVs $0 < T_1 < T_2 < \cdots$. For each integer $k \ge 1$, its $k$th inter-arrival time is defined by $\tau_k = T_k - T_{k-1} \mathbf{1}(k \ge 2)$. For a given arrival process $(T_k)_{k \ge 1}$, the associated counting process $(N(t))_{t \ge 0}$ is defined by
\[ N(t) = \sum_{k=1}^{\infty} \mathbf{1}(T_k \le t) = \#(\text{arrivals up to time } t). \tag{16} \]
Note that these three processes (arrival times, inter-arrival times, and counting) determine each other:
\[ (T_k)_{k \ge 1} \Longleftrightarrow (\tau_k)_{k \ge 1} \Longleftrightarrow (N(t))_{t \ge 0}. \tag{17} \]

Exercise 2.1. Let $(T_k)_{k \ge 1}$ be any arrival process and let $(N(t))_{t \ge 0}$ be its associated counting process. Show that these two processes determine each other by the following relation:
\[ \{T_n \le t\} = \{N(t) \ge n\}. \tag{18} \]
In words, the $n$th customer arrives by time $t$ if and only if at least $n$ customers arrive up to time $t$.

FIGURE 1. Illustration of a continuous-time arrival process $(T_k)_{k \ge 1}$ and its associated counting process $(N(t))_{t \ge 0}$; the $\tau_k$ denote inter-arrival times, and $N(t) = 3$ for $T_3 \le t < T_4$.

Now we define the Poisson process.

Definition 2.2 (Poisson process). An arrival process $(T_k)_{k \ge 1}$ is a Poisson process of rate $\lambda$ if its inter-arrival times are i.i.d. $\mathrm{Exp}(\lambda)$ RVs. In this case, we write $(T_k)_{k \ge 1} \sim \mathrm{PP}(\lambda)$.

The choice of exponential inter-arrival times is special due to the memoryless property of exponential RVs (Exercise 1.6).

Exercise 2.3. Let $(T_k)_{k \ge 1}$ be a Poisson process with rate $\lambda$. Show that $E[T_k] = k/\lambda$ and $\mathrm{Var}(T_k) = k/\lambda^2$. Furthermore, show that $T_k \sim \mathrm{Erlang}(k, \lambda)$, that is,
\[ f_{T_k}(z) = \frac{\lambda^k z^{k-1} e^{-\lambda z}}{(k-1)!}. \tag{19} \]

The following exercise explains what is 'Poisson' about the Poisson process.

Exercise 2.4. Let $(T_k)_{k \ge 1}$ be a Poisson process with rate $\lambda$ and let $(N(t))_{t \ge 0}$ be the associated counting process. We will show that $N(t) \sim \mathrm{Poisson}(\lambda t)$.
(i) Using the relation $\{T_n \le t\} = \{N(t) \ge n\}$ and Exercise 2.3, show that
\[ P(N(t) \ge n) = P(T_n \le t) = \int_0^t \frac{\lambda^n z^{n-1} e^{-\lambda z}}{(n-1)!}\, dz. \tag{20} \]
(ii) Let $G(t) = \sum_{m=n}^{\infty} (\lambda t)^m e^{-\lambda t}/m! = P(\mathrm{Poisson}(\lambda t) \ge n)$. Show that
\[ \frac{d}{dt} G(t) = \frac{\lambda^n t^{n-1} e^{-\lambda t}}{(n-1)!} = \frac{d}{dt} P(T_n \le t). \tag{21} \]
Conclude that $G(t) = P(T_n \le t)$.
(iii) From (i) and (ii), conclude that $N(t) \sim \mathrm{Poisson}(\lambda t)$.
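Definition 2.2 gives a direct recipe for simulating a Poisson process: take cumulative sums of i.i.d. $\mathrm{Exp}(\lambda)$ inter-arrival times. The following sketch in Python/NumPy (the rate, horizon, and sample size are illustrative choices, not from the notes) checks the conclusion of Exercise 2.4 by comparing the empirical mean and variance of $N(t)$ against $\lambda t$:

    import numpy as np

    rng = np.random.default_rng(seed=1)
    lam, t, trials = 1.5, 4.0, 10_000
    n_max = int(4 * lam * t) + 30    # generous: P(N(t) >= n_max) is negligible

    # Each row is one realization of T_1 < T_2 < ... (cumulative Exp(lam) sums).
    taus = rng.exponential(scale=1 / lam, size=(trials, n_max))
    T = np.cumsum(taus, axis=1)

    # N(t) = #{k : T_k <= t} for each realization; cf. equation (16).
    N_t = (T <= t).sum(axis=1)
    print(N_t.mean(), N_t.var())     # both should be close to lam * t = 6.0

For a $\mathrm{Poisson}(\lambda t)$ RV the mean and variance coincide (Exercise 1.9), so agreement of both statistics with $\lambda t = 6$ is consistent with $N(t) \sim \mathrm{Poisson}(\lambda t)$.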
3. MEMORYLESS PROPERTY AND STOPPING TIMES

Let $(T_k)_{k \ge 1} \sim \mathrm{PP}(\lambda)$ and let $(N(t))_{t \ge 0}$ be its counting process. For any fixed $u \ge 0$, $(N(t + u) - N(u))_{t \ge 0}$ is itself a counting process, which counts the number of arrivals during the interval $[u, u + t]$. We call it the counting process $(N(t))_{t \ge 0}$ restarted at time $u$. One of the remarkable properties of a Poisson process is that its counting process restarted at any given time $u$ is again the counting process of a Poisson process, independent of what has happened up to time $u$. This is called the memoryless property of Poisson processes.

Proposition 3.1 (Memoryless property of PP). Let $(T_k)_{k \ge 1}$ be a Poisson process of rate $\lambda$ and let $(N(t))_{t \ge 0}$ be the associated counting process.
(i) For any $t \ge 0$, let $Z(t) = \inf\{s > 0 : N(t + s) > N(t)\}$ be the waiting time for the first arrival after time $t$.
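Part (i) introduces the waiting time $Z(t)$ for the first arrival after time $t$; the memoryless discussion above suggests the standard conclusion that $Z(t) \sim \mathrm{Exp}(\lambda)$ for every fixed $t \ge 0$, regardless of the history up to time $t$. A quick empirical check in the same style as before (Python/NumPy; the parameter choices are again illustrative):

    import numpy as np

    rng = np.random.default_rng(seed=2)
    lam, t, trials = 1.5, 4.0, 10_000
    n_max = int(4 * lam * t) + 30    # generous enough that some T_k > t in every row

    taus = rng.exponential(scale=1 / lam, size=(trials, n_max))
    T = np.cumsum(taus, axis=1)

    # Z(t) = inf{s > 0 : N(t + s) > N(t)} = (first arrival time after t) - t
    first_after = np.argmax(T > t, axis=1)      # index of first T_k exceeding t
    Z = T[np.arange(trials), first_after] - t
    print(Z.mean(), Z.var())   # ~ 1/lam = 0.667 and 1/lam^2 = 0.444 if Z(t) ~ Exp(lam)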