Stochastics II
Lecture notes
Prof. Dr. Evgeny Spodarev
Ulm, 2015

Contents

1 General theory of random functions
  1.1 Random functions
  1.2 Elementary examples
  1.3 Regularity properties of trajectories
  1.4 Differentiability of trajectories
  1.5 Moments and covariance
  1.6 Stationarity and independence
  1.7 Processes with independent increments
  1.8 Additional exercises
2 Counting processes
  2.1 Renewal processes
  2.2 Poisson processes
    2.2.1 Poisson processes
    2.2.2 Compound Poisson process
    2.2.3 Cox process
  2.3 Additional exercises
3 Wiener process
  3.1 Elementary properties
  3.2 Explicit construction of the Wiener process
    3.2.1 Haar and Schauder functions
    3.2.2 Wiener process with a.s. continuous paths
  3.3 Distribution and path properties of Wiener processes
    3.3.1 Distribution of the maximum
    3.3.2 Invariance properties
  3.4 Additional exercises
4 Lévy processes
  4.1 Lévy processes
    4.1.1 Infinite divisibility
    4.1.2 Lévy-Khintchine representation
    4.1.3 Examples
    4.1.4 Subordinators
  4.2 Additional exercises
5 Martingales
  5.1 Basic ideas
  5.2 (Sub-, super-)martingales
  5.3 Uniform integrability
  5.4 Stopped martingales
  5.5 Lévy processes and martingales
  5.6 Martingales and the Wiener process
  5.7 Additional exercises
6 Stationary sequences of random variables
  6.1 Sequences of independent random variables
  6.2 Stationarity in the narrow sense and ergodic theory
    6.2.1 Basic ideas
    6.2.2 Mixing properties and ergodicity
    6.2.3 Ergodic theorem
  6.3 Stationarity in the wide sense
    6.3.1 Correlation theory
    6.3.2 Orthogonal random measures
    6.3.3 Integral with respect to an orthogonal random measure
    6.3.4 Spectral representation
  6.4 Additional exercises
Bibliography

1 General theory of random functions

1.1 Random functions

Let (Ω, A, P) be a probability space and (S, B) a measurable space, Ω, S ≠ ∅.

Definition 1.1.1
A random element X : Ω → S is an A|B-measurable mapping (notation: X ∈ A|B), i.e.,

  X^{-1}(B) = {ω ∈ Ω : X(ω) ∈ B} ∈ A   for all B ∈ B.
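The simplest non-constant example of a random element is the indicator of an event: for A ∈ A, the mapping 1_A : Ω → S = {0, 1}, with S equipped with the σ-algebra B = 2^S, is A|B-measurable, since the only possible preimages are 1_A^{-1}({1}) = A, 1_A^{-1}({0}) = A^c, 1_A^{-1}(S) = Ω and 1_A^{-1}(∅) = ∅, all of which lie in A.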
If X is a random element, then X(ω) is called a realization of X for ω ∈ Ω. We say that the σ-algebra B of subsets of S is generated by the set system M (the elements of M are also subsets of S) if

  B = ⋂ {F : F ⊇ M, F a σ-algebra on S}   (notation: B = σ(M)).

If S is a topological or metric space, then M is often chosen as the class of all open sets of S, and σ(M) is called the Borel σ-algebra (notation: B = B(S)).

Example 1.1.1
1. If S = R, B = B(R), then a random element X is called a random variable.
2. If S = R^m, B = B(R^m), m > 1, then X is called a random vector. Random variables and random vectors are considered in the lectures „Elementare Wahrscheinlichkeitsrechnung und Statistik“ and „Stochastik I“.
3. Let S be the class of all closed sets of R^m, and let M be the system of all sets {A ∈ S : A ∩ B ≠ ∅}, where B is an arbitrary compact subset of R^m. Then X : Ω → S (measurable with respect to σ(M)) is called a random closed set.
As an example, consider n independent points Y_1, ..., Y_n, uniformly distributed in [0, 1]^m, and R_1, ..., R_n > 0 (almost surely) independent random variables defined on the same probability space (Ω, A, P) as Y_1, ..., Y_n. Consider X = ⋃_{i=1}^n B_{R_i}(Y_i), where B_r(x) = {y ∈ R^m : ||y − x|| ≤ r}. Obviously, this is a random closed set. An example of a realization is provided in Figure 1.1.

[Figure 1.1: Example of a random set X = ⋃_{i} B_{R_i}(Y_i)]

Exercise 1.1.1
Let (Ω, A) and (S, B) be measurable spaces, B = σ(M), where M is a class of subsets of S. Prove that X : Ω → S is A|B-measurable if and only if X^{-1}(C) ∈ A for all C ∈ M.

Definition 1.1.2
Let T be an arbitrary index set and {(S_t, B_t)}_{t∈T} a family of measurable spaces. A family X = {X(t), t ∈ T} of random elements X(t) : Ω → S_t, defined on (Ω, A, P) and A|B_t-measurable for all t ∈ T, is called a random function (associated with {(S_t, B_t)}_{t∈T}).

Hence X : Ω × T → ⋃_{t∈T} S_t with X(ω, t) ∈ S_t for all ω ∈ Ω, t ∈ T, and X(·, t) ∈ A|B_t for all t ∈ T. We often omit ω in the notation and write X(t) instead of X(ω, t). Sometimes (S_t, B_t) does not depend on t ∈ T: (S_t, B_t) = (S, B) for all t ∈ T.

Special cases of random functions:
1. T ⊆ Z: X is called a random sequence or a stochastic process in discrete time. Example: T = Z, N.
2. T ⊆ R: X is called a stochastic process in continuous time. Example: T = R_+, [a, b] with −∞ < a < b < ∞, R.
3. T ⊆ R^d, d ≥ 2: X is called a random field. Example: T = Z^d, R^d, R_+^d, [a, b]^d.
4. T ⊆ B(R^d): X is called a set-indexed process. If X is almost surely non-negative and σ-additive on the σ-algebra T, then X is called a random measure.

The tradition of denoting the index set by T comes from interpreting t ∈ T in cases 1 and 2 as a time parameter. For every ω ∈ Ω, {X(ω, t), t ∈ T} is called a trajectory or path of the random function X.

We would like to show that the random function X = {X(t), t ∈ T} is a random element in the corresponding function space, equipped with a σ-algebra which is now specified. Let S_T = ∏_{t∈T} S_t be the Cartesian product of the S_t, t ∈ T, i.e., x ∈ S_T if x(t) ∈ S_t for all t ∈ T. An elementary cylinder set in S_T is defined as

  C_T(B, t) = {x ∈ S_T : x(t) ∈ B},

where t ∈ T is a selected point of T and B ∈ B_t a subset of S_t. C_T(B, t) therefore contains all trajectories x which pass through the „gate“ B, see Figure 1.2.

[Figure 1.2: Trajectories which pass through a „gate“ B at time t]

Definition 1.1.3
The cylindric σ-algebra B_T is defined as the σ-algebra generated in S_T by the family of all elementary cylinders. It is denoted by B_T = ⊗_{t∈T} B_t. If B_t = B for all t ∈ T, then B^T is written instead of B_T.
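For instance, the set of all trajectories which pass through finitely many gates B_1 ∈ B_{t_1}, ..., B_n ∈ B_{t_n} at points t_1, ..., t_n ∈ T,

  {x ∈ S_T : x(t_1) ∈ B_1, ..., x(t_n) ∈ B_n} = ⋂_{i=1}^n C_T(B_i, t_i),

is a finite intersection of elementary cylinders and therefore belongs to B_T. The probabilities of events of exactly this type will reappear as the finite-dimensional distributions of X in Definition 1.1.5 below.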
Lemma 1.1.1
The family X = {X(t), t ∈ T} is a random function on (Ω, A, P) with phase spaces {(S_t, B_t)}_{t∈T} if and only if the mapping ω ↦ X(ω, ·), which assigns to every ω ∈ Ω its trajectory, is A|B_T-measurable.

Exercise 1.1.2
Prove Lemma 1.1.1.

Definition 1.1.4
Let X be a random element X : Ω → S, i.e., let X be A|B-measurable. The distribution of X is the probability measure P_X on (S, B) defined by P_X(B) = P(X^{-1}(B)) = P(X ∈ B), B ∈ B.

Lemma 1.1.2
An arbitrary probability measure µ on (S, B) can be considered as the distribution of a random element X.

Proof Take Ω = S, A = B, P = µ and X(ω) = ω, ω ∈ Ω.

When does a random function with given properties exist? A random function consisting of independent random elements always exists. This assertion is known as

Theorem 1.1.1 (Lomnicki, Ulam)
Let {(S_t, B_t, µ_t)}_{t∈T} be a family of probability spaces. There exists a random function X = {X(t), t ∈ T} on a probability space (Ω, A, P) (associated with {(S_t, B_t)}_{t∈T}) such that
1. X(t), t ∈ T, are independent random elements;
2. P_{X(t)} = µ_t on (S_t, B_t), t ∈ T.

Many important classes of random processes are built from independent random elements; cf. the examples in Section 1.2.

Definition 1.1.5
Let X = {X(t), t ∈ T} be a random function on (Ω, A, P) with phase spaces {(S_t, B_t)}_{t∈T}. The finite-dimensional distributions of X are defined as the distributions P_{t_1,...,t_n} of the vectors (X(t_1), ..., X(t_n))^⊤ on (S_{t_1,...,t_n}, B_{t_1,...,t_n}), for arbitrary n ∈ N and t_1, ..., t_n ∈ T, where S_{t_1,...,t_n} = S_{t_1} × ... × S_{t_n} and B_{t_1,...,t_n} = B_{t_1} ⊗ ... ⊗ B_{t_n} is the σ-algebra in S_{t_1,...,t_n} generated by all sets B_{t_1} × ... × B_{t_n}, B_{t_i} ∈ B_{t_i}, i = 1, ..., n, i.e.,

  P_{t_1,...,t_n}(C) = P((X(t_1), ..., X(t_n))^⊤ ∈ C),   C ∈ B_{t_1,...,t_n}.

In particular, for C = B_1 × ... × B_n with B_k ∈ B_{t_k}, one has

  P_{t_1,...,t_n}(B_1 × ... × B_n) = P(X(t_1) ∈ B_1, ..., X(t_n) ∈ B_n).

Exercise 1.1.3
Prove that X_{t_1,...,t_n} = (X(t_1), ..., X(t_n))^⊤ is an A|B_{t_1,...,t_n}-measurable random element.

Definition 1.1.6
Let S_t = R for all t ∈ T. The random function X = {X(t), t ∈ T} is called symmetric if all of its finite-dimensional distributions are symmetric probability measures, i.e., P_{t_1,...,t_n}(A) = P⁻_{t_1,...,t_n}(A) for all A ∈ B_{t_1,...,t_n} and all n ∈ N, t_1, ..., t_n ∈ T, where

  P⁻_{t_1,...,t_n}(A) = P((−X(t_1), ..., −X(t_n))^⊤ ∈ A).

Exercise 1.1.4
Prove that the finite-dimensional distributions of a random function X have the following properties: for arbitrary n ∈ N, n ≥ 2, t_1, ...