Heat Kernels and Exit Times for Isotropic Lévy Processes


Michał Ryznar, Wrocław University of Technology

Third FCPNLO Workshop: Fractional Calculus, Probability and Non-Local Operators, BCAM, Bilbao, November 18-20, 2015.

The presentation is based mostly on a joint project with K. Bogdan and T. Grzywny.

Definitions

A (Borel) measure on $\mathbb{R}^d$ is called isotropic unimodal if on $\mathbb{R}^d \setminus \{0\}$ it is absolutely continuous with respect to the Lebesgue measure and has a finite, radial, nonincreasing density.

A Lévy process $X = (X_t,\ t \ge 0)$ is a stochastic process with values in $\mathbb{R}^d$ which has stationary and independent increments and càdlàg paths: the increments $X_{t_1} - X_{t_0}, \ldots, X_{t_n} - X_{t_{n-1}}$ are independent for any $0 \le t_0 \le \ldots \le t_n$, and $X_{t+h} - X_t$ has the same distribution as $X_h$ for all $t, h > 0$.

$X$ is called isotropic unimodal if all its one-dimensional distributions $p_t(dx)$ are isotropic unimodal.

We call the transition density $p_t(x, y) = p_t(y - x) = p(t, x, y)$ the heat kernel. The operators
$$T_t f(x) = \int_{\mathbb{R}^d} f(x + y)\, p_t(y)\, dy$$
form a semigroup on various function spaces.

Characteristic function

$$\mathbb{E}\, e^{i\langle \xi, X_t \rangle} = \int_{\mathbb{R}^d} e^{i\langle \xi, x \rangle}\, p_t(dx) = e^{-t\psi(\xi)}, \qquad \xi \in \mathbb{R}^d.$$

Isotropic unimodal Lévy processes are characterized by Lévy–Khintchine exponents of the form ([Watanabe 1983])
$$\psi(\xi) = \sigma^2 |\xi|^2 + \int_{\mathbb{R}^d} \big(1 - \cos\langle \xi, x \rangle\big)\, \nu(|x|)\, dx, \qquad (1)$$
with nonincreasing $\nu$ and $\sigma \ge 0$. Here $N(dx) = \nu(|x|)\, dx$ is called the Lévy measure of the process, and
$$N(dx) = \lim_{t \to 0} \frac{p_t(dx)}{t}.$$

If $e^{-t\psi} \in L^1$, then
$$p_t(x) = \frac{1}{(2\pi)^d} \int_{\mathbb{R}^d} e^{-i\langle \xi, x \rangle}\, e^{-t\psi(\xi)}\, d\xi \le p_t(0).$$
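To see how a concrete symbol comes out of (1), here is a short computation (a sketch, not taken from the slides) for the pure-jump case $\sigma = 0$ with the stable Lévy density $\nu(r) = c\, r^{-d-\alpha}$, $0 < \alpha < 2$. Substituting $x = y/|\xi|$ and using rotational invariance,
$$\psi(\xi) = \int_{\mathbb{R}^d} \big(1 - \cos\langle \xi, x \rangle\big)\, c\, |x|^{-d-\alpha}\, dx = |\xi|^\alpha \int_{\mathbb{R}^d} \big(1 - \cos\langle \xi/|\xi|, y \rangle\big)\, c\, |y|^{-d-\alpha}\, dy = C_{d,\alpha}\, |\xi|^\alpha,$$
because the last integral is finite (the integrand behaves like $|y|^{2-d-\alpha}$ near the origin and like $|y|^{-d-\alpha}$ at infinity) and does not depend on the unit vector $\xi/|\xi|$. This is the isotropic $\alpha$-stable case used repeatedly below.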
Examples

Isotropic $\alpha$-stable processes: $\nu(x) = c\, |x|^{-d-\alpha}$.

Truncated $\alpha$-stable processes: $\nu(r) = c\, r^{-d-\alpha}\, \mathbf{1}_{(0,1)}(r)$.

Subordinate Brownian motions: let $T_t$ be a subordinator (a positive, increasing one-dimensional Lévy process) without drift, $\mathbb{E}\, e^{-\lambda T_t} = e^{-t\varphi(\lambda)}$, where the Bernstein function $\varphi$ is
$$\varphi(\lambda) = \int_0^\infty \big(1 - e^{-\lambda s}\big)\, \mu(ds).$$
Let $B(t)$ be an independent Brownian motion with $\psi(\xi) = |\xi|^2$. Then $B(T_t)$ has an absolutely continuous Lévy measure $N(dx) = \nu(x)\, dx$, where
$$\nu(x) = \int_0^\infty g_s(x)\, \mu(ds),$$
and $g_s(x) = (4\pi s)^{-d/2} e^{-|x|^2/(4s)}$ is the Gaussian kernel.

Generators

$$\mathcal{A} f(x) = \sigma^2 \Delta f(x) + \mathrm{p.v.} \int_{\mathbb{R}^d} \big(f(x + y) - f(x)\big)\, \nu(|y|)\, dy, \qquad f \in C^2(\mathbb{R}^d).$$

Fractional Laplacian, $\mathcal{A} = \Delta^{\alpha/2}$: isotropic $\alpha$-stable process,
$$\Delta^{\alpha/2} f(x) = \mathrm{p.v.}\ c_{d,\alpha} \int_{\mathbb{R}^d} \big(f(x + y) - f(x)\big)\, \frac{dy}{|y|^{d+\alpha}}.$$

$\mathcal{A} = I - (I - \Delta)^{\alpha/2}$: relativistic $\alpha$-stable process,
$$\mathcal{A} f(x) = \mathrm{p.v.}\ c_{d,\alpha} \int_{\mathbb{R}^d} \big(f(x + y) - f(x)\big)\, \frac{K_{(d+\alpha)/2}(|y|)}{|y|^{(d+\alpha)/2}}\, dy.$$
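To connect these generators back to the Lévy–Khintchine exponent (1), here is a short sketch (not from the slides) applying $\mathcal{A}$ to the complex exponential $e_\xi(x) = e^{i\langle \xi, x \rangle}$:
$$\mathcal{A} e_\xi(x) = e^{i\langle \xi, x \rangle}\Big( -\sigma^2 |\xi|^2 + \mathrm{p.v.} \int_{\mathbb{R}^d} \big(e^{i\langle \xi, y \rangle} - 1\big)\, \nu(|y|)\, dy \Big) = -\psi(\xi)\, e^{i\langle \xi, x \rangle},$$
since the symmetry of $\nu(|y|)\, dy$ cancels the $i\sin\langle \xi, y \rangle$ part in the principal value and leaves $-\int_{\mathbb{R}^d} (1 - \cos\langle \xi, y \rangle)\, \nu(|y|)\, dy$. In particular, $\nu(y) = c_{d,\alpha} |y|^{-d-\alpha}$ gives the symbol $|\xi|^\alpha$ of the fractional Laplacian, matching the computation after (1).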
Topics

Heat kernel estimates and weak scaling.

Exit times: estimates of the mean exit time and the survival probability for smooth sets.

Heat kernels for the process killed on exiting a set; estimates.

Hitting times.

Our goal: estimates for $p_t(x)$ in terms of the symbol $\psi$ of the process (or of $\psi^*$).

Example (Cauchy process, i.e. the isotropic 1-stable process, $\psi(\xi) = |\xi|$):
$$p_t(x) = C_d\, \frac{t}{(t^2 + |x|^2)^{(d+1)/2}}, \qquad x \in \mathbb{R}^d.$$

Below $f \approx g$ means that there is a constant $c > 0$ such that $c^{-1} g(x) \le f(x) \le c\, g(x)$; this $c$ is called the comparability constant.

Example (isotropic $\alpha$-stable process, $\psi(\xi) = |\xi|^\alpha$, $0 < \alpha < 2$):
$$p_t(x) \approx \min\Big\{ t^{-d/\alpha},\ \frac{t}{|x|^{d+\alpha}} \Big\}, \qquad x \in \mathbb{R}^d.$$
Here
$$t^{-d/\alpha} = \big(\psi^{-1}(1/t)\big)^d, \qquad \frac{t}{|x|^{d+\alpha}} = \frac{t\, \psi(1/|x|)}{|x|^d} = c\, t\, \nu(x),$$
so the estimate can be rewritten as
$$p_t(x) \approx \min\Big\{ \big(\psi^{-1}(1/t)\big)^d,\ \frac{t\, \psi(1/|x|)}{|x|^d} \Big\}.$$

Recall that $p_t(x)$ is radially decreasing.
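As a small numerical sanity check (not part of the slides; the one-dimensional setting, the grids, and the normalization $C_1 = 1/\pi$ are chosen only for illustration), the sketch below compares the exact Cauchy kernel with the bound $\min\{1/t,\ t/|x|^2\}$ that the general estimate predicts for $d = 1$, $\alpha = 1$. The ratio stays between $1/(2\pi)$ and $1/\pi$, illustrating the comparability stated above.

```python
import numpy as np

# Exact 1-D Cauchy (isotropic 1-stable) heat kernel: p_t(x) = (1/pi) * t / (t^2 + x^2).
def cauchy_kernel(t, x):
    return t / (np.pi * (t ** 2 + x ** 2))

# Bound from the general estimate with psi(xi) = |xi| and d = 1:
# min{ (psi^{-1}(1/t))^d , t * psi(1/|x|) / |x|^d } = min{ 1/t , t / |x|^2 }.
def stable_bound(t, x):
    return np.minimum(1.0 / t, t / np.abs(x) ** 2)

t = np.logspace(-3, 3, 61)[:, None]   # time grid
x = np.logspace(-3, 3, 121)[None, :]  # space grid, x > 0 (the kernel is radial)
ratio = cauchy_kernel(t, x) / stable_bound(t, x)

# The ratio is bounded away from 0 and infinity, i.e. the two expressions are comparable.
print(f"ratio in [{ratio.min():.4f}, {ratio.max():.4f}]")  # approximately [0.159, 0.318]
```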