Poisson Random Measures and Their Numerical Approximation

SPDEs driven by Poisson Random Measures and their Numerical Approximation
Erika Hausenblas, joint work with Thomas Dunst and Andreas Prohl
University of Salzburg, Austria

Outline

- Lévy processes - Poisson random measures
- Stochastic integration in Banach spaces
- SPDEs driven by Poisson random measures
- Their numerical approximation

A typical Example

Let $\mathcal{O}$ be a bounded domain in $\mathbb{R}^d$ with smooth boundary. The equation:

\[
\begin{cases}
\dfrac{d u(t,\xi)}{dt} = \dfrac{\partial^2}{\partial \xi^2}\, u(t,\xi) + \alpha\, \nabla u(t,\xi) + g(u(t,\xi))\,\dot L(t) + f(u(t,\xi)), & \xi \in \mathcal{O},\ t > 0; \qquad (\star)\\[4pt]
u(0,\xi) = u_0(\xi), & \xi \in \mathcal{O};\\
u(t,\xi) = 0, & t \ge 0,\ \xi \in \partial\mathcal{O};
\end{cases}
\]

where $u_0 \in L^p(\mathcal{O})$, $p \ge 1$, $g$ is a certain mapping and $L(t)$ is a space-time Lévy noise specified later.

Problem: find a function $u : [0,\infty) \times \mathcal{O} \to \mathbb{R}$ solving Equation $(\star)$.

The Lévy Process L

Definition 1. Let $E$ be a Banach space. A stochastic process $L = \{L(t),\ 0 \le t < \infty\}$ is an $E$-valued Lévy process over $(\Omega, \mathcal{F}, \mathbb{P})$ if the following conditions are satisfied:

- $L(0) = 0$;
- $L$ has independent and stationary increments;
- $L$ is stochastically continuous, i.e. for $\phi$ bounded and measurable the function $t \mapsto \mathbb{E}\,\phi(L(t))$ is continuous on $\mathbb{R}^+$;
- $L$ has a.s. càdlàg paths.

Let $E$ denote a separable Banach space and $E'$ its dual. The Fourier transform of $L$ is given by the Lévy-Khinchine formula (see Linde (1986)):

\[
\mathbb{E}\, e^{i\lambda \langle L(1), a\rangle} = \exp\Big( i \langle y_0, a\rangle\, \lambda + \int_E \big( e^{i\lambda \langle y, a\rangle} - 1 - i\lambda \langle y, a\rangle\, \mathbf{1}_{\{|y|\le 1\}}(y) \big)\, \nu(dy) \Big),
\]

where $a \in E'$, $\lambda \in \mathbb{R}$, $y_0 \in E$ and $\nu : \mathcal{B}(E) \to \mathbb{R}^+$ is a Lévy measure.

If $E = \mathbb{R}$, then the set of Lévy measures $\mathcal{L}(\mathbb{R})$ is given by

\[
\mathcal{L}(\mathbb{R}) = \Big\{ \nu : \mathcal{B}(\mathbb{R}) \to [0,\infty);\ \nu(\{0\}) = 0;\ \int_{\mathbb{R}} |z|^2\, \nu(dz) < \infty \Big\}.
\]

Poisson Random Measure

Remark 1. Let $L$ be a real valued Lévy process and let $A \in \mathcal{B}(\mathbb{R})$. Defining the counting measure

\[
N(t, A) = \#\{ s \in (0,t] : \Delta L(s) = L(s) - L(s-) \in A \} \in \mathbb{N} \cup \{\infty\},
\]

one can show that:

- $N(t, A)$ is a random variable over $(\Omega, \mathcal{F}, \mathbb{P})$;
- $N(t, A) \sim \mathrm{Poisson}(t\,\nu(A))$ and $N(t, \emptyset) = 0$;
- for any disjoint sets $A_1, \dots, A_n$, the random variables $N(t, A_1), \dots, N(t, A_n)$ are pairwise independent.
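Remark 1 can be checked empirically in the simplest case of a compound Poisson process, where the Lévy measure is finite and the jumps can be simulated directly. The following sketch is not part of the original slides; the rate `lam`, the Gaussian jump law and the set `A` are arbitrary illustrative choices. It counts the jumps of a simulated path that fall into $A$ and compares the empirical mean and variance of $N(t,A)$ with the Poisson parameter $t\,\nu(A)$.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Compound Poisson illustration: jumps arrive at rate lam, jump sizes are N(0,1),
# so the Levy measure is nu = lam * N(0,1) and nu(A) = lam * P(Z in A).
lam, t = 3.0, 10.0            # illustrative rate and time horizon (assumptions)
A = (0.5, 1.5)                # the Borel set A = (0.5, 1.5]

def count_jumps_in_A(rng):
    """Simulate the jumps of one compound Poisson path on (0, t] and return
    N(t, A) = #{ s in (0, t] : Delta L(s) in A }."""
    n_jumps = rng.poisson(lam * t)            # total number of jumps on (0, t]
    sizes = rng.standard_normal(n_jumps)      # their sizes Delta L(s)
    return np.sum((sizes > A[0]) & (sizes <= A[1]))

samples = np.array([count_jumps_in_A(rng) for _ in range(5000)])

nu_A = lam * (norm.cdf(A[1]) - norm.cdf(A[0]))
print("empirical mean of N(t, A):", samples.mean())
print("empirical var  of N(t, A):", samples.var())
print("Poisson parameter t*nu(A):", t * nu_A)   # mean and variance should both match
```

For a Lévy process of infinite activity the same check applies to any set $A$ bounded away from $0$, since $\nu(A)$ is then finite.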
Definition 2. Let $(S, \mathcal{S})$ be a measurable space and $(\Omega, \mathcal{A}, \mathbb{P})$ a probability space. A random measure on $(S, \mathcal{S})$ is a family

\[ \eta = \{ \eta(\omega, \cdot),\ \omega \in \Omega \} \]

of non-negative measures $\eta(\omega, \cdot) : \mathcal{S} \to \mathbb{R}^+$ such that the mapping

\[ \eta : (\Omega, \mathcal{A}) \to \big( M_+(S, \mathcal{S}),\ \mathcal{M}_+(S, \mathcal{S}) \big) \]

is measurable. Here $M_+(S, \mathcal{S})$ denotes the set of all non-negative measures on $(S, \mathcal{S})$ equipped with the weak topology, and $\mathcal{M}_+(S, \mathcal{S})$ is the $\sigma$-field on $M_+(S, \mathcal{S})$ generated by the evaluation maps $i_B : M_+(S, \mathcal{S}) \ni \nu \mapsto \nu(B) \in [0, \infty)$, $B \in \mathcal{S}$.

Definition 3. Let $(S, \mathcal{S})$ be a measurable space and $(\Omega, \mathcal{A}, \mathbb{P})$ a probability space. An integer-valued Poisson random measure on $(S, \mathcal{S})$ is a family

\[ \eta = \{ \eta(\omega, \cdot),\ \omega \in \Omega \} \]

of non-negative measures $\eta(\omega, \cdot) : \mathcal{S} \to \mathbb{N}$ such that

- $\eta(\cdot, \emptyset) = 0$ a.s.;
- $\eta$ is a.s. $\sigma$-additive;
- $\eta$ is a.s. independently scattered;
- for each $A \in \mathcal{S}$ such that $\mathbb{E}\,\eta(\cdot, A)$ is finite, $\eta(\cdot, A)$ is a Poisson random variable with parameter $\mathbb{E}\,\eta(\cdot, A)$.

Remark 2. The mapping

\[ \mathcal{S} \ni A \mapsto \nu(A) := \mathbb{E}\,\eta(\cdot, A) \in \mathbb{R} \]

is a measure on $(S, \mathcal{S})$. Assume $S$ is a separable Banach space. Then the integral

\[ X := \int_S s\, \tilde\eta(ds) \]

is well defined iff $\nu$ is a Lévy measure. Moreover, the law of $X$ is infinitely divisible.

Definition 4. A law $\rho$ on $S$ is infinitely divisible iff for any $n \in \mathbb{N}$ there exist a law $\rho_n$ on $S$ and a constant $b_n \in S$ such that

\[ \rho = \rho_n^{*(n)} * \delta_{b_n}. \]

Definition 5. A random variable $X$ in $S$ has an infinitely divisible law iff for any $n \in \mathbb{N}$ there exist a family $\{X_i^n : i = 1, \dots, n\}$ of independent and identically distributed random vectors and a constant $b_n \in S$ such that

\[ \mathcal{L}(X) = \mathcal{L}(X_1^n + X_2^n + \dots + X_n^n + b_n). \]

Assume $S = \mathbb{R}$ and let $X$ be given by $X := \int_S s\, \tilde\eta(ds)$. Then the law of $X$ is infinitely divisible and given by

\[ \lim_{\epsilon \to 0} \sum_{k=0}^{\infty} e^{-\nu([\epsilon, \infty))}\, \frac{\nu([\epsilon, \infty))^k}{k!}\, \nu_\epsilon^{*(k)}, \]

where for $\epsilon > 0$ the probability measure $\nu_\epsilon$ is defined by

\[ \nu_\epsilon : \mathcal{B}(\mathbb{R}) \ni A \mapsto \frac{\nu([\epsilon, \infty) \cap A)}{\nu([\epsilon, \infty))} \in [0, 1]. \]

Let $(E, \mathcal{E})$ be a measurable space. If $S = E \times \mathbb{R}^+$ and $\mathcal{S} = \mathcal{E}\,\hat\otimes\,\mathcal{B}(\mathbb{R}^+)$, then a Poisson random measure on $(S, \mathcal{S})$ is called a Poisson point process.

Remark 3. Let $\nu$ be a Lévy measure on a Banach space $E$ and

- $S = E \times \mathbb{R}^+$,
- $\mathcal{S} = \mathcal{B}(E)\,\hat\otimes\,\mathcal{B}(\mathbb{R}^+)$,
- $\nu' = \nu \times \lambda$ ($\lambda$ is the Lebesgue measure).

Then there exists a time homogeneous Poisson random measure

\[ \eta : \Omega \times \mathcal{B}(E) \times \mathcal{B}(\mathbb{R}^+) \to \mathbb{N} \]

such that

\[ \mathbb{E}\,\eta(A \times I) = \nu(A)\,\lambda(I), \qquad A \in \mathcal{B}(E),\ I \in \mathcal{B}(\mathbb{R}^+). \]

The measure $\nu$ is called the intensity of $\eta$.

Definition 6. Let

\[ \eta : \Omega \times \mathcal{B}(E) \times \mathcal{B}(\mathbb{R}^+) \to \mathbb{R}^+ \]

be a Poisson random measure over $(\Omega, \mathcal{F}, \mathbb{P})$ and $\{\mathcal{F}_t,\ 0 \le t < \infty\}$ the filtration induced by $\eta$. Then the predictable measure

\[ \gamma : \Omega \times \mathcal{B}(E) \times \mathcal{B}(\mathbb{R}^+) \to \mathbb{R}^+ \]

is called the compensator of $\eta$ if for any $A \in \mathcal{B}(E)$ the process

\[ \tilde\eta(A \times (0, t]) := \eta(A \times (0, t]) - \gamma(A \times [0, t]) \]

is a local martingale over $(\Omega, \mathcal{F}, \mathbb{P})$.

Remark 4. The compensator is unique up to a $\mathbb{P}$-zero set and, in the case of a time homogeneous Poisson random measure, is given by

\[ \gamma(A \times [0, t]) = t\,\nu(A), \qquad A \in \mathcal{B}(E). \]

Example 1. Let $\eta$ be a time homogeneous Poisson random measure over a Banach space $E$ with intensity $\nu$, where $\nu$ is a Lévy measure. Then the stochastic process

\[ t \mapsto \int_0^t \int_E z\, \tilde\eta(dz, ds) \]

is a Lévy process with Lévy measure $\nu$.

Definition 7. We call an $\mathbb{R}$-valued Lévy process of type $(\alpha, \beta)$, for $\alpha \in [0, 2]$ and $\beta \ge 0$, iff

\[ \alpha = \inf\Big\{ \tilde\alpha \ge 0 : \lim_{x \to 0} \nu((x, \infty))\, x^{\tilde\alpha} < \infty \ \text{ and } \ \lim_{x \to 0} \nu((-\infty, -x))\, x^{\tilde\alpha} < \infty \Big\} \]

and

\[ \beta = \inf\Big\{ \tilde\beta \ge 0 : \int_{\{|x| > 1\}} |x|^{\tilde\beta}\, \nu(dx) < \infty \Big\}. \]
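Remark 4 suggests a direct numerical sanity check of the compensator: for a time homogeneous Poisson random measure with finite intensity, the compensated quantity $\tilde\eta(A \times (0,t]) = \eta(A \times (0,t]) - t\,\nu(A)$ should average to zero for every $t$. The sketch below is again an illustration rather than material from the slides; the exponential jump-size law and the constant `lam` are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Time homogeneous Poisson random measure on E x R+ with E = R:
# points arrive in time at rate lam, their E-coordinates are Exp(1) distributed,
# i.e. the intensity is nu(dz) = lam * e^{-z} dz on (0, infinity).  (Assumptions.)
lam = 2.0
a = 1.0                                  # we take A = (a, infinity)
nu_A = lam * np.exp(-a)                  # nu(A) = lam * P(Exp(1) > a)

def eta_A(t, rng):
    """One realisation of eta(A x (0, t])."""
    n = rng.poisson(lam * t)             # number of points of eta in E x (0, t]
    z = rng.exponential(1.0, n)          # their E-coordinates
    return np.sum(z > a)

for t in (1.0, 5.0, 10.0):
    compensated = np.array([eta_A(t, rng) - t * nu_A for _ in range(20000)])
    print(f"t = {t:4.1f}:  mean of eta(A x (0,t]) - t*nu(A) = {compensated.mean():+.4f}")
```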
The Numerical Approximation of a Lévy walk

Let $\eta$ be a Poisson random measure on $\mathbb{R}$ with intensity $\nu$, i.e. let $L = \{L(t) : 0 \le t < \infty\}$ be the corresponding Lévy process with characteristic measure $\nu$, defined by

\[ [0, \infty) \ni t \mapsto L(t) := \int_0^t \int_{\mathbb{R}} z\, \tilde\eta(dz, ds) \in \mathbb{R}. \]

Question: given the characteristic measure of a Lévy process, how does one simulate the Lévy walk? That is, how does one simulate

\[ \Delta_\tau^0 L,\ \Delta_\tau^1 L,\ \Delta_\tau^2 L,\ \dots,\ \Delta_\tau^k L,\ \dots, \]

where $\Delta_\tau^k L := L((k+1)\tau) - L(k\tau)$, $k \in \mathbb{N}$.
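A common answer, sketched below under simplifying assumptions rather than taken from the slides, is to truncate the small jumps: fix a cut-off $\epsilon > 0$, keep only the jumps of size at least $\epsilon$, which arrive at the finite rate $\nu([\epsilon,\infty))$, simulate each increment $\Delta_\tau^k L$ as a compound Poisson sum over one step of length $\tau$, and subtract the compensator $\tau \int_{[\epsilon,1]} z\,\nu(dz)$ of the retained jumps. The concrete characteristic measure $\nu(dz) = z^{-1-\alpha}\,dz$ on $(0,1]$ is only an example chosen so that the tail mass and the restricted law can be sampled in closed form.

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative characteristic measure (an assumption, not from the slides):
# nu(dz) = z^{-1-alpha} dz on (0, 1]: infinitely many small jumps, no jumps > 1.
alpha, eps, tau = 0.5, 1e-3, 0.01        # index, jump cut-off, step size of the walk
n_steps = 5000

# Jumps of size >= eps arrive at a finite rate; both the rate and the mean of the
# retained jumps are available in closed form for this nu.
rate_eps = (eps**(-alpha) - 1.0) / alpha                  # nu([eps, 1])
mean_eps = (1.0 - eps**(1.0 - alpha)) / (1.0 - alpha)     # int_[eps,1] z nu(dz)

def sample_retained_jumps(n, rng):
    """Inverse-CDF sampling from nu restricted to [eps, 1] and normalised."""
    u = rng.uniform(size=n)
    return (eps**(-alpha) - u * (eps**(-alpha) - 1.0)) ** (-1.0 / alpha)

def increment(rng):
    """One compensated increment Delta_tau^k L, with jumps smaller than eps discarded."""
    n = rng.poisson(rate_eps * tau)       # number of retained jumps in one step
    jumps = sample_retained_jumps(n, rng)
    return jumps.sum() - tau * mean_eps   # subtract the compensator over one step

increments = np.array([increment(rng) for _ in range(n_steps)])
levy_walk = np.concatenate(([0.0], np.cumsum(increments)))    # L(0), L(tau), L(2 tau), ...
print("mean increment:", increments.mean(), "(should be close to 0 after compensation)")
print("L(n_steps * tau) ~", levy_walk[-1])
```

A standard refinement replaces the discarded small jumps by a Brownian term with matching variance $\int_{\{|z|<\epsilon\}} z^2\,\nu(dz)$ (the Asmussen-Rosinski approximation); the plain truncation above already reproduces the walk up to an error controlled by that quantity.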