Stochastic Calculus Basics
Heterogeneity in Quantitative Macroeconomics @ TSE, October 17, 2016
Sang Yoon (Tim) Lee

Very simple notes (need to add references). This is NOT meant to be a substitute for a real course in stochastic calculus, just a list of heuristic derivations of the stuff most often used in economics. Itô calculus is a lot more than dealing with Poisson jumps and Wiener processes. Some abuses of notation are included without clarification.

1. Stochastic Process

A stochastic process is a collection of random variables (measurable functions) $\{X_t : t \in T\}$, $X_t : \Omega \to S$, ordered by $t$ (time), along with a measurable space $(S, \mathcal{S})$. The probability space $(\Omega, \mathcal{F}, P)$ denotes, respectively, the "state space" (the set of all possible histories), the $\sigma$-algebra that contains all possible sets (Borel sets) of histories induced by $\Omega$, and the probability measure over $\mathcal{F}$. The space $(S, \mathcal{S})$ contains the range of the function $X_t : \Omega \mapsto S$ and its corresponding $\sigma$-algebra. For example, for most of our applications $X_t \in \mathbb{R}$ or $\mathbb{R}_+$.

If $X_t$ is measurable, any process induces a measure $P_t$ that we can construct from the original probability space. This calls for the notion of a filtration: a weakly increasing collection of sub-$\sigma$-algebras of $\mathcal{F}$, $\{\mathcal{F}_t, t \in T\}$, s.t. for all $s < t \in T$, $\mathcal{F}_s \subset \mathcal{F}_t \subset \mathcal{F}$. The process $X$ is adapted to the filtration $\{\mathcal{F}_t\}_{t \in T}$ if $X_t$ is $\mathcal{F}_t$-measurable. This just means that for any $X_t$, I can compute its probability using only $\mathcal{F}_t$ and not all of $\mathcal{F}$. Hence, a well-defined stochastic process is always adapted to its natural filtration
$$\mathcal{F}_t = \sigma\left( X_s^{-1}(A) : s \leq t,\ A \in \mathcal{S} \right).$$
This just means that for any history of $X_t$ up to time $t$, all possibly realizable trajectories can be mapped back into a subset of $\mathcal{F}_t$, so that I can compute its probability for all points up to time $t$. This generates an induced probability measure over $X$.

EXAMPLE 1 Let $\Omega = [0, 1]^{\infty}$. Then any $\omega \in \Omega$ is just a coordinate on the infinite-dimensional unit cube. If we let $X_t : \Omega \mapsto S$ denote the $t$-th coordinate, $S$ is just the unit interval $[0, 1]$. If we construct, say, the probability measure so that $P = P_1 \times \cdots \times P_{\infty}$, where each $P_t$ is the uniform distribution, then $X_t$ is i.i.d. uniform.

2. Poisson (Jump) Process

Let $N_t$ be the random variable equal to the number of "hits" up to time $t$. The (adapted) state space is $\mathbb{R}^{[0,t]}$ and the range is the set of all right-continuous paths that increase by 1. Now define the probability measure over $\omega$ as Poisson:
$$P\{N_t - N_s = n\} = \frac{(\lambda(t-s))^n}{n!} \cdot \exp(-\lambda(t-s)),$$
where $\lambda$ is the rate of arrival. This is what is usually called the Poisson process. (Not to be confused with what we use more often in economics: $X_t$ is a compound Poisson process (CPP) if it changes to some value at rate $\lambda_t$, studied below. In fact this is a new random variable in which $X_t$ changes to some value if $N_t > N_s$ for all $s < t$, and you could redefine the probability space to the histories of $N_t$ rather than $\mathbb{R}^{[0,t]}$; this is the set of all right-continuous paths that increase by 1.)

More typically, the Poisson process is defined as a counting process:

DEFINITION 1 A continuous-time stochastic process $N_t$ is Poisson if

1. $N_t$ is a counting process:
   (a) $N_t$ lives in $(\mathbb{Z}_+, 2^{\mathbb{Z}_+})$ for all $t \geq 0$,
   (b) $N_s \leq N_t$ for all $s \leq t$,
   (c) $\lim_{s \downarrow t} N_s \leq \lim_{s \uparrow t} N_s + 1$ for all $t \geq 0$; that is, no two hits can happen simultaneously;
2. $N_0 = 0$ a.s.;
3. $N$ is a stochastic process with stationary, independent increments.

The two definitions are equivalent; there are many other definitions as well, but I refer you to the internet.
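To see the equivalence in action, here is a minimal simulation sketch (mine, not part of the original notes; all names and parameter values are illustrative): approximate the increments $dN_t$ by i.i.d. Bernoulli($\lambda\,dt$) draws on a fine grid, so that stationary independent increments hold by construction, and check that the distribution of $N_T$ matches the Poisson probabilities above.

```python
import math
import numpy as np

# Sketch: approximate dN_t by i.i.d. Bernoulli(lam*dt) increments on a fine
# grid and compare the distribution of N_T with the Poisson(lam*T) pmf.
rng = np.random.default_rng(0)
lam, T, n_steps, n_paths = 2.0, 1.0, 1000, 10_000
dt = T / n_steps

# Each row is one path of increments dN in {0, 1}; two hits in the same
# subinterval have probability o(dt) and are ignored at this step size.
dN = rng.random((n_paths, n_steps)) < lam * dt
N_T = dN.sum(axis=1)

for n in range(5):
    emp = (N_T == n).mean()
    pmf = (lam * T) ** n * math.exp(-lam * T) / math.factorial(n)
    print(f"P(N_T = {n}): simulated {emp:.4f} vs Poisson {pmf:.4f}")
```

The binomial-to-Poisson limit behind this check is exactly the $o(dt)$ expansion derived next.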
It is easier to show that the earlier definition implies the counting process; by definition, increments are independent. The probabilities of getting 0, 1, or 2 or more hits in a time interval $dt > 0$ are
$$P(N_{t+dt} - N_t = 0) = \exp(-\lambda dt) \approx 1 - \lambda dt + o(dt)$$
$$P(N_{t+dt} - N_t = 1) = \lambda dt \cdot \exp(-\lambda dt) \approx \lambda dt - \lambda^2 dt^2 + o(dt) \approx \lambda dt$$
$$P(N_{t+dt} - N_t \geq 2) = (\lambda dt)^2 \cdot e^{-\lambda dt}/2 + o(dt) = o(dt).$$
Clearly, the actual probability that something happens in any instant $dt$ (and in $(t, t + dt]$ as $dt \to 0$, since the increments are independent) is 0.

Conversely, one way to make sense of the counting process is to realize that stationarity implies
$$E[N(T)/T] = \lim_{T \to \infty} N(T)/T = \lambda,$$
and instead of sending $T$ to infinity, send the number of intervals $dt$ in $(0, T]$ to infinity to get that the expected number of hits in any given time interval of length $dt$ is $\lambda dt$:
$$E[dN_t] = E[N(dt)] = \lambda dt = 0 \cdot P(N(dt) = 0) + 1 \cdot P(N(dt) = 1) + \sum_{n=2}^{\infty} n \cdot P(N(dt) = n) = P(N(dt) = 1),$$
since two hits cannot occur at the same time. This is important later when we derive the stochastic HJB equation.

2.1 Compound Poisson Process

Now define a jump process over the underlying Poisson process: let $X_t$ be a r.v. that equals $\gamma_a$ if $N_t$ is even and $\gamma_b$ if $N_t$ is odd. Heuristically, starting from $X_t = \gamma_a$,
$$E[dX_t] = 0 \cdot \exp(-\lambda dt) + (\gamma_b - \gamma_a) \cdot \lambda dt \cdot \exp(-\lambda dt) + 0 \cdot (\lambda dt)^2 e^{-\lambda dt}/2 + o(dt)$$
$$E[\dot{X}_t] = \lim_{dt \to 0} (\gamma_b - \gamma_a) \cdot \lambda \cdot \exp(-\lambda dt) = \lambda(\gamma_b - \gamma_a).$$

More generally, let $\{Z_k\}_{k \geq 1}$ be an i.i.d. ordered sequence of random variables with measure $G_z(z)$, independent of the Poisson process $N_t$. Let $X_t$ be a continuous-time stochastic process that is a function of $(N_t, Z_k)$, and define
$$X_t = \sum_{k=1}^{N_t} Z_k.$$
Then, with $\mu_Z \equiv E[Z_k]$,
$$E X_t = \sum_{n=1}^{\infty} E\left[ \sum_{k=1}^{n} Z_k \,\middle|\, N_t = n \right] P(N_t = n) = \sum_{n=1}^{\infty} \frac{e^{-\lambda t}(\lambda t)^n}{n!}\, E\left[ \sum_{k=1}^{n} Z_k \right] = \mu_Z \lambda t \sum_{n=1}^{\infty} \frac{e^{-\lambda t}(\lambda t)^{n-1}}{(n-1)!} = \mu_Z \cdot \lambda t,$$
and
$$dX_t = Z_{N_t}\, dN_t, \quad \text{i.e.} \quad X_t = \int_0^t Z_{N_s}\, dN_s,$$
assuming $X_0 = 0$.

2.2 Stochastic Integral with Poisson

First consider a function $f(N_t)$. The integral is easy to write as
$$f(N_t) - f(0) = \sum_{k=1}^{N_t} [f(k) - f(k-1)] = \int_0^t [f(1 + N_{s^-}) - f(N_{s^-})]\, dN_s = \int_0^t [f(N_s) - f(N_s - 1)]\, dN_s = \int_0^t [f(N_s) - f(N_{s^-})]\, dN_s,$$
where $N_{s^-}$ is the left limit of the Poisson process at $s$, and only one jump occurs in any $dt$ by definition (or construction) of the Poisson process.

For the compound process, recall that the waiting time for the $k$th hit of the Poisson process, $T_k$, is also a random variable, s.t. the event $\{T_k > t\} \Leftrightarrow \{N_t \leq k - 1\}$; in particular this means that $T_k - T_{k-1}$ is an i.i.d. process by definition. For $k = 1$, the waiting time follows an exponential distribution. For $k > 1$,
$$P(T_k > t) = \int_t^{\infty} \lambda \frac{e^{-\lambda s}(\lambda s)^{k-1}}{(k-1)!}\, ds, \tag{1}$$
since
$$P(T_k > t) = P(T_k > t \geq T_{k-1}) + P(T_{k-1} > t) = P(N_t = k - 1) + \int_t^{\infty} \lambda \frac{e^{-\lambda s}(\lambda s)^{k-2}}{(k-2)!}\, ds = \frac{e^{-\lambda t}(\lambda t)^{k-1}}{(k-1)!} + \int_t^{\infty} \lambda \frac{e^{-\lambda s}(\lambda s)^{k-2}}{(k-2)!}\, ds,$$
and integration by parts leads to (1).

Using waiting times, the stochastic integral of a function of a compound Poisson process $Y_t$ can be written
$$f(Y_t) - f(0) = \sum_{k=1}^{N_t} \left[ f(Y_{T_k^-} + Z_k) - f(Y_{T_k^-}) \right] = \int_0^t \left[ f(Y_{s^-} + Z_{N_s}) - f(Y_{s^-}) \right] dN_s = \int_0^t [f(Y_s) - f(Y_{s^-})]\, dN_s$$
$$= \int_0^t [f(Y_s) - f(Y_{s^-})]\,(dN_s - \lambda ds) + \lambda \int_0^t [f(Y_s) - f(Y_{s^-})]\, ds.$$
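Tying the last two subsections together, here is a simulation sketch (again mine, with illustrative parameters): build a compound Poisson path from i.i.d. exponential($\lambda$) waiting times, draw the jump sizes $Z_k$ from an example distribution, and check that the simulated mean matches $E X_t = \mu_Z \cdot \lambda t$.

```python
import numpy as np

# Sketch: simulate a compound Poisson process from exponential interarrival
# times and check E[X_t] = mu_Z * lam * t, with Z_k ~ N(mu_Z, 1) as an example.
rng = np.random.default_rng(1)
lam, t, mu_Z, n_paths = 3.0, 2.0, 0.5, 20_000

X_t = np.empty(n_paths)
for i in range(n_paths):
    waits = rng.exponential(1 / lam, size=60)  # T_k - T_{k-1}, i.i.d. exp(lam);
    arrivals = np.cumsum(waits)                # 60 draws >> lam*t hits w.h.p.
    N_t = np.searchsorted(arrivals, t)         # number of hits up to time t
    X_t[i] = rng.normal(mu_Z, 1.0, size=N_t).sum()

print("simulated E[X_t]:", X_t.mean(), "vs mu_Z * lam * t =", mu_Z * lam * t)
```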
3. Wiener Process (Brownian Motion)

DEFINITION 2 A Wiener process is defined by four properties:

1. $W_0 = 0$ a.s.;
2. Independent increments: $W_t - W_s$ is independent of $\mathcal{F}_s$ for all $s \leq t$;
3. Normality: $W_t - W_s \sim \mathcal{N}(0, t - s)$;
4. $W_t$ is continuous a.s.

We could spend the whole semester just talking about this, which we won't. Basically, think of Brownian motion as a random walk in continuous time: the best predictor of $dX_t$ is 0, with Gaussian errors. So clearly, $W_t$ is a particular type of martingale ($E[W_t \mid \mathcal{F}_s] = W_s$ a.s., for all $0 \leq s < t < \infty$).

Most commonly you will encounter a Brownian motion with drift, a geometric Brownian motion, or a generic (Itô) diffusion process:
$$dX_t = \mu\, dt + \sigma\, dW_t, \qquad dX_t = \mu X_t\, dt + \sigma X_t\, dW_t, \qquad dX_t = \mu(X_t)\, dt + \sigma(X_t)\, dW_t.$$
The geometric Brownian motion simply gives $dX_t / X_t = \mu\, dt + \sigma\, dW_t$, so it is just a Brownian motion with drift in percentage points (or log-points, to be exact; strictly speaking, the version of Itô's lemma below gives $d \log X_t = (\mu - \sigma^2/2)\, dt + \sigma\, dW_t$, and the sketch at the end of these notes illustrates this correction). In the Itô process, the instantaneous drift and variance depend on the current value of $X_t$; this is related to the version of Itô's lemma that we will look at below.

Before we move along, note that both the Poisson process and Brownian motion are Markov processes, but while Brownian motion has a continuous time path a.s., the Poisson process has a discontinuous time path a.s. Also, the Poisson process was not a martingale, but the compensated increment $dN_t - \lambda dt$ was.

It will be useful to know the quadratic variation of the Brownian motion; we will use a particular formulation that exploits the CLT in discrete time:
$$\langle W \rangle_t \equiv E[W_t^2] = \lim_{n \to \infty} \sum_{i=1}^{2^n - 1} [\Delta_i W_t]^2, \qquad \Delta_i W_t \equiv W_{t_{i+1}^n} - W_{t_i^n},$$
where $t_i^n \equiv i t / 2^n$.
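A quick numerical illustration (mine, not from the notes): draw the increments over the dyadic partition $t_i^n = it/2^n$ and watch the sum of squared increments settle at $t$ as $n$ grows.

```python
import numpy as np

# Sketch: sum of squared Brownian increments over dyadic partitions of [0, t].
# By the CLT argument above it converges to t (here t = 1.5); a fresh path is
# drawn at each n, which is enough to see the limit in probability.
rng = np.random.default_rng(2)
t = 1.5

for n in (4, 8, 12, 16):
    k = 2 ** n                                    # number of subintervals
    dW = rng.normal(0.0, np.sqrt(t / k), size=k)  # increments ~ N(0, t/k)
    print(f"n = {n:2d}: sum of squared increments = {(dW ** 2).sum():.4f}")
```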
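Finally, returning to the three diffusion processes above, here is a sketch (mine; all parameters illustrative) of a geometric Brownian motion simulated by a simple Euler discretization, $X_{t+dt} = X_t + \mu X_t\, dt + \sigma X_t\, dW_t$. It illustrates the Itô correction flagged earlier: the simulated mean of $\log X_T$ is close to $(\mu - \sigma^2/2)T$, not $\mu T$.

```python
import numpy as np

# Sketch: Euler discretization of dX = mu*X*dt + sigma*X*dW, checking that
# E[log X_T] is approximately (mu - sigma^2/2)*T rather than mu*T.
rng = np.random.default_rng(3)
mu, sigma, T, n_steps, n_paths = 0.1, 0.4, 1.0, 1000, 20_000
dt = T / n_steps

X = np.ones(n_paths)
for _ in range(n_steps):
    dW = rng.normal(0.0, np.sqrt(dt), size=n_paths)
    X += mu * X * dt + sigma * X * dW

print("simulated E[log X_T]:", np.log(X).mean())  # ~ (mu - sigma^2/2)*T = 0.02
print("mu*T for comparison: ", mu * T)            # = 0.10
```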