Markov Chains, Renewal, Branching and Coalescent Processes: Four Topics in Probability Theory

Andreas Nordvall Lagerås
Mathematical Statistics, Department of Mathematics, Stockholm University, 2007

Doctoral Dissertation 2007
Mathematical Statistics
Stockholm University
SE-106 91 Stockholm

Typeset by LaTeX
© Andreas Nordvall Lagerås
ISBN 91-7155-375-4, pp. 1-14
Printed by US AB

Abstract

This thesis consists of four papers. In paper 1, we prove central limit theorems for Markov chains under (local) contraction conditions. As a corollary we obtain a central limit theorem for Markov chains associated with iterated function systems with contractive maps and place-dependent Dini-continuous probabilities. In paper 2, properties of inverse subordinators are investigated, in particular similarities with renewal processes. The main tool is a theorem on processes that are both renewal and Cox processes. In paper 3, distributional properties of supercritical and especially immortal branching processes are derived. The marginal distributions of immortal branching processes are found to be compound geometric. In paper 4, a description of a dynamic population model is presented, such that samples from the population have genealogies as given by a Λ-coalescent with mutations. Depending on whether the sample is grouped according to litters or families, the sampling distribution is either regenerative or non-regenerative.

Acknowledgements

I would like to thank
- my supervisor Thomas Höglund, who was my first lecturer in probability theory and immediately made me take a liking to the subject.
- Anders Martin-Löf, who has always been available to bounce ideas off.
- Örjan Stenflo, who was an excellent first co-author, properly encouraging and suitably demanding.
- my other colleagues at Mathematical Statistics, for how well we have coffee, eat, teach and do research together. Especially the coffee.
- my family and friends, for everything that goes on outside my little workshop.

Stockholm, 25 January 2007
Andreas Nordvall Lagerås

Contents

Introduction and summary of the four papers
1 Paper I
1.1 Markov chains as iterated function systems
1.2 Limit theorems
1.3 Main result
2 Paper II
2.1 Renewal processes and beyond
2.2 Cox processes
2.3 Main result
3 Paper III
3.1 Compound distributions
3.2 Branching processes in continuous time
3.3 Binary branching: the Yule process
3.4 Main result
4 Paper IV
4.1 Population models
4.2 Main result

List of papers

I   Lagerås, A. N. and Stenflo, Ö. (2005) Central limit theorems for contractive Markov chains.* Nonlinearity, 18(5), 1955-1965.
II  Lagerås, A. N. (2005) A renewal-process-type expression for the moments of inverse subordinators.† Journal of Applied Probability, 42(4), 1134-1144.
III Lagerås, A. N. and Martin-Löf, A. (2006) Genealogy for supercritical branching processes.‡ Journal of Applied Probability, 43(4), 1066-1076.
IV  Lagerås, A. N. (2006) A population model for Λ-coalescents with neutral mutations. Submitted.

* © 2005 IOP Publishing Ltd.
† © 2005 The Applied Probability Trust
‡ © 2006 The Applied Probability Trust

Introduction and summary of the four papers

This thesis consists of four papers that concern different areas in probability theory. The following pages give short summaries of each article for the non-specialist probabilist.
1 Paper I: Central limit theorems for contractive Markov chains

The first article of this thesis is a joint work with Örjan Stenflo, and was first published in Nonlinearity (2005), vol 18 no 5. It was also a part of my licentiate thesis.

1.1 Markov chains as iterated function systems

Consider the following way of generating a Markov chain $(Z_n)_{n\in\mathbb{N}}$ on some state space $S$. Let $\{w_i\}_{i\in I}$ be a collection of functions defined on $S$. Given that $Z_n = z_n$, draw a random variable $X_n$ on $I$, whose distribution may depend on $z_n$, and let $Z_{n+1} = w_{X_n}(Z_n)$. In fact, any Markov chain can be described in this way,* with $I = [0,1]$ and $X_n$ being uniform on $I$ and independent of $Z_n$, but sometimes it is more natural to take $X_n$ dependent on $Z_n$. (A small simulation sketch of the example below is given at the end of this section.)

Example. Let $(Z_n)_{n\in\mathbb{N}}$ have state space $\{0, 1, 2, 3\}$ and transition matrix
$$(p_{ij}) = \begin{pmatrix} 0 & 1/2 & 0 & 1/2 \\ 1/2 & 0 & 1/2 & 0 \\ 0 & 1/4 & 0 & 3/4 \\ 3/4 & 0 & 1/4 & 0 \end{pmatrix},$$
which we can also describe with a picture: [transition diagram with the states $0, 1, 2, 3$ arranged on a cycle] where full (dashed) arrows indicate a step up (down) modulo 4, and the double (simple, half) arrowheads belong to jumps occurring with probability $3/4$ ($1/2$, $1/4$).

One way of generating this Markov chain is to let
$$w_\downarrow(z) \equiv z - 1 \bmod 4,$$
$$w_\uparrow(z) \equiv z + 1 \bmod 4,$$
$$w_\updownarrow(z) \equiv \begin{cases} z - 1 \bmod 4 & \text{for } z = 0, 1, \\ z + 1 \bmod 4 & \text{for } z = 2, 3, \end{cases}$$
and let $X_1, X_2, \ldots$ be an i.i.d. sequence of random elements in $\{\downarrow, \uparrow, \updownarrow\}$ such that $P(X_n = \downarrow) = 1/4$, $P(X_n = \uparrow) = 1/2$ and $P(X_n = \updownarrow) = 1/4$ for all $n$, and then set
$$Z_{n+1} = w_{X_n}(Z_n) = w_{X_n} \circ w_{X_{n-1}}(Z_{n-1}) = \cdots = w_{X_n} \circ \cdots \circ w_{X_1}(Z_0).$$
If we allow $X_1, X_2, \ldots$ to be dependent on the values of $Z_1, Z_2, \ldots$ we can describe the dynamics with only $w_\downarrow$ and $w_\uparrow$, namely by letting
$$P(X_n = \downarrow \mid Z_n = 0, 1) = 1/2, \quad P(X_n = \uparrow \mid Z_n = 0, 1) = 1/2,$$
$$P(X_n = \downarrow \mid Z_n = 2, 3) = 1/4, \quad P(X_n = \uparrow \mid Z_n = 2, 3) = 3/4,$$
which is arguably more natural.

In the article, we investigate properties of Markov chains with a compact state space, typically a closed and bounded subset of $\mathbb{R}^n$, that are generated by a collection of contractive maps $\{w_i\}_{i\in I}$, with $I$ being countable, and where the probability of $X_n = i$ given $Z_n = z$ is given by $p_i(z)$ for some functions $\{p_i(z)\}_{i\in I}$. Such collections $(\{w_i\}, \{p_i\})$ are called iterated function systems with place-dependent probabilities.

* At least if $S$ is Borel, see Proposition 7.6 in Kallenberg, O. (1997) Foundations of Modern Probability. Springer.

1.2 Limit theorems

When $Z_n \xrightarrow{d} Z$, with $Z$ having a stationary distribution for $(Z_n)_{n\in\mathbb{N}}$, one typically has a law of large numbers for the Markov chain:
$$\frac{1}{n}\sum_{i=1}^{n} f(Z_i) \xrightarrow{a.s.} E[f(Z)],$$
or even a central limit theorem:
$$\frac{1}{\sqrt{n}}\sum_{k=1}^{n}\bigl(f(Z_k) - E[f(Z)]\bigr) \xrightarrow{d} N(0, \sigma^2), \tag{1}$$
where $f$ is a function from the state space $S$ to $\mathbb{R}$. One could also want to center the summands in (1) by $E[f(Z_k)]$ instead of $E[f(Z)]$. A so-called functional central limit theorem is a stronger version of a central limit theorem, in which
$$\frac{1}{\sqrt{n}}\sum_{k=1}^{[nt]}\bigl(f(Z_k) - E[f(Z)]\bigr) \xrightarrow{d} \sigma B_t,$$
where $(B_t)_{0\le t\le 1}$ is a standard Brownian motion.

1.3 Main result

Our main result concerns Markov chains that have contractive maps $\{w_i\}_{i\in I}$ when they are viewed as iterated function systems; hence they are called contractive Markov chains. With conditions on the smoothness of $f$ and $\{p_i\}_{i\in I}$ we obtain a functional central limit theorem. The conditions are such that a highly regular $f$ allows for more "wild" $\{p_i\}_{i\in I}$ and vice versa.
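To make the iterated-function-system viewpoint of the example in Section 1.1 concrete, here is a minimal Python sketch (not part of the original thesis; the function names and the trajectory length are illustrative choices of mine). It simulates the four-state chain in both ways described above, with the three maps $w_\downarrow, w_\uparrow, w_\updownarrow$ driven by i.i.d. choices, and with only $w_\downarrow, w_\uparrow$ driven by place-dependent probabilities, and estimates the transition matrix from each trajectory.

```python
import numpy as np

rng = np.random.default_rng(0)

# The three maps of the iterated function system on {0, 1, 2, 3}.
def w_down(z):
    return (z - 1) % 4

def w_up(z):
    return (z + 1) % 4

def w_both(z):
    return (z - 1) % 4 if z in (0, 1) else (z + 1) % 4

def step_iid(z):
    """One step driven by an i.i.d. choice among the three maps."""
    u = rng.random()
    if u < 0.25:          # P(X_n = down) = 1/4
        return w_down(z)
    if u < 0.75:          # P(X_n = up) = 1/2
        return w_up(z)
    return w_both(z)      # P(X_n = both) = 1/4

def step_place_dependent(z):
    """One step using only w_down/w_up, with place-dependent probabilities."""
    p_up = 0.5 if z in (0, 1) else 0.75
    return w_up(z) if rng.random() < p_up else w_down(z)

def empirical_transition_matrix(step, n_steps=200_000):
    """Estimate (p_ij) from one long trajectory of the chain."""
    counts = np.zeros((4, 4))
    z = 0
    for _ in range(n_steps):
        z_next = step(z)
        counts[z, z_next] += 1
        z = z_next
    return counts / counts.sum(axis=1, keepdims=True)

print(np.round(empirical_transition_matrix(step_iid), 2))
print(np.round(empirical_transition_matrix(step_place_dependent), 2))
# Both estimates should be close to the exact matrix
# [[0, 1/2, 0, 1/2], [1/2, 0, 1/2, 0], [0, 1/4, 0, 3/4], [3/4, 0, 1/4, 0]].
```

Up to Monte Carlo error, both estimates agree with $(p_{ij})$, which is the point of the example: the i.i.d. triple of maps and the place-dependent pair describe one and the same Markov chain.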
We also state the results with conditions on the rate of convergence towards the stationary distribution for the Markov chain. Often one considers chains such that the rate is, in a sense, "exponential", but our results also work with even slower convergence.

2 Paper II: A renewal-process-type expression for the moments of inverse subordinators

This article was first published in Journal of Applied Probability (2005), vol 42 no 4, and was a part of my licentiate thesis.

2.1 Renewal processes and beyond

One of the first stochastic processes that one is introduced to in a beginner's course in stochastic processes is the renewal process. It is simply a collection of points in time, events of some sort, such that the times between consecutive events are independent and identically distributed. One quantity of interest is $N_t$ = "the number of events in $[0, t]$". The simplest case of a renewal process is the Poisson process, which has an exponential distribution for the times between events. The name comes from the fact that $N_t$ is Poisson distributed for all $t$. Calculations for the Poisson process are greatly simplified by the fact that it is a Markov process in continuous time, and the numbers of events in disjoint time intervals are independent. For other renewal processes, one can hardly give any exact results about the distribution of $N_t$. An exception to this is that if one has an explicit expression for $E[N_t]$, it can be used to calculate joint moments of arbitrary integer order for the increments of $(N_t)$ over disjoint intervals. It is easiest to state the result with factorial moments instead of ordinary moments: $E[N_t^{[k]}] = E[N_t(N_t - 1)\cdots(N_t - k + 1)]$.
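As a concrete illustration of factorial moments (not taken from the paper; the function names, the rate $\lambda = 2$, the horizon $t = 3$ and the Gamma interarrival example are my own illustrative choices), the sketch below estimates $E[N_t^{[k]}]$ by simulation. For the Poisson process it uses only the standard fact that a Poisson variable with mean $\lambda t$ has $k$-th factorial moment $(\lambda t)^k$; the same estimator applies to any renewal process by swapping the interarrival distribution.

```python
import numpy as np

rng = np.random.default_rng(1)

def count_events(t, draw_interarrival, n_paths=50_000):
    """Simulate N_t for a renewal process with the given interarrival sampler."""
    counts = np.zeros(n_paths, dtype=int)
    for i in range(n_paths):
        s, n = 0.0, 0
        while True:
            s += draw_interarrival()
            if s > t:
                break
            n += 1
        counts[i] = n
    return counts

def factorial_moment(samples, k):
    """Empirical k-th factorial moment E[N(N-1)...(N-k+1)]."""
    x = samples.astype(float)
    prod = np.ones_like(x)
    for j in range(k):
        prod *= x - j
    return prod.mean()

lam, t = 2.0, 3.0
poisson_counts = count_events(t, lambda: rng.exponential(1 / lam))
for k in (1, 2, 3):
    # For the Poisson process, the empirical value should match (lam*t)**k
    # up to Monte Carlo error.
    print(k, factorial_moment(poisson_counts, k), (lam * t) ** k)

# A non-Poisson renewal process, e.g. with Gamma(2) interarrival times of the
# same mean 1/lam, can be plugged in the same way; its factorial moments no
# longer have this simple closed form.
gamma_counts = count_events(t, lambda: rng.gamma(2.0, 1 / (2 * lam)))
print(factorial_moment(gamma_counts, 2))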
