A Stochastic Processes and Martingales
A.1 Stochastic Processes

Let $I$ be either $\mathbb{N}$ or $\mathbb{R}_+$. A stochastic process on $I$ with state space $E$ is a family of $E$-valued random variables $X = \{X_t : t \in I\}$. We only consider examples where $E$ is a Polish space.

Suppose for the moment that $I = \mathbb{R}_+$. A stochastic process is called cadlag if its paths $t \mapsto X_t$ are right-continuous (a.s.) and its left limits exist at all points. In this book we assume that every stochastic process is cadlag. We say a process is continuous if its paths are continuous. These conditions are meant to hold with probability 1, not pathwise.

A.2 Filtration and Stopping Times

The information available at time $t$ is expressed by a $\sigma$-subalgebra $\mathcal{F}_t \subset \mathcal{F}$. An increasing family of $\sigma$-algebras $\{\mathcal{F}_t : t \in I\}$ is called a filtration. If $I = \mathbb{R}_+$, we call a filtration right-continuous if $\mathcal{F}_{t+} := \bigcap_{s > t} \mathcal{F}_s = \mathcal{F}_t$. If not stated otherwise, we assume that all filtrations in this book are right-continuous. In many books it is also assumed that the filtration is complete, i.e., that $\mathcal{F}_0$ contains all $\mathbb{P}$-null sets. We do not assume this here because we want to be able to change the measure in Chapter 4. Because the changed measure and $\mathbb{P}$ will be singular, it would not be possible to extend the new measure to the whole $\sigma$-algebra $\mathcal{F}$.

A stochastic process $X$ is called $\mathcal{F}_t$-adapted if $X_t$ is $\mathcal{F}_t$-measurable for all $t$. If it is clear which filtration is used, we just call the process adapted. The natural filtration $\{\mathcal{F}_t^X\}$ is the smallest right-continuous filtration such that $X$ is adapted. If we consider a process $X$, we always work with the natural filtration unless stated otherwise.

Suppose that $I = \mathbb{N}$. A stochastic process $X$ is called predictable if $\{X_{t+1}\}$ is $\mathcal{F}_t$-adapted and $X_0$ is not stochastic. Suppose now that $I = \mathbb{R}_+$. Let $\mathcal{F}_{t-}$ be the smallest $\sigma$-algebra containing all $\mathcal{F}_s$ for $s < t$. We say that a stochastic process $X$ is predictable if $X$ is adapted to $\{\mathcal{F}_{t-}\}$. Note that if $X$ is adapted, then $\{X_{t-}\}$ (with $X_{0-} = X_0$ deterministic) is predictable. Indeed,

$$\{X_{t-} \ge a\} = \bigcap_{n \in \mathbb{N}^*} \; \bigcup_{s \in \mathbb{Q} \cap [0,t)} \; \bigcap_{v \in \mathbb{Q} \cap (s,t)} \{X_v > a - n^{-1}\}\,.$$

Because $\{X_v > a - n^{-1}\} \in \mathcal{F}_v \subset \mathcal{F}_{t-}$, we have $\{X_{t-} \ge a\} \in \mathcal{F}_{t-}$.

A random variable $T \in I \cup \{\infty\}$ is called an $\mathcal{F}_t$-stopping time if $\{T \le t\} \in \mathcal{F}_t$ for all $t \in I$. Note that deterministic times are stopping times because $\{s \le t\} \in \{\emptyset, \Omega\}$. If $X$ is an adapted stochastic process and $T$ is a stopping time, then the stopped process $\{X_{T \wedge t}\}$ is also adapted.

Let $A \subset E$ be a Borel set. Define the first entrance times

$$T_A = \inf\{t \in I : X_t \in A\}\,, \qquad T_A^* = \inf\{t \in I : X_t \in A \text{ or } X_{t-} \in A\}\,.$$

The first entrance times are important examples of stopping times. If $A$ is open, then $T_A$ is a stopping time. If $A$ is closed, then $T_A^*$ is a stopping time. If $A$ is closed and, in addition, $A$ is compact or $X$ is (pathwise) continuous, then $T_A$ is a stopping time.

In many examples one needs the information up to a stopping time. For a stopping time $T$ we define the $\sigma$-algebra

$$\mathcal{F}_T = \{A \in \mathcal{F} : A \cap \{T \le t\} \in \mathcal{F}_t \text{ for all } t \in I\}\,.$$

The $\sigma$-algebra $\mathcal{F}_T$ contains all events observable up to the stopping time $T$.
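As a numerical aside (not part of the book), the following sketch illustrates a first entrance time and the stopped process for a simple random walk; Python with numpy, the threshold set $A = [10, \infty)$, and the seed are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(seed=1)

# A simple random walk on I = N: an adapted, real-valued process.
steps = rng.choice([-1, 1], size=2000)
X = np.concatenate(([0], np.cumsum(steps)))

# First entrance time T_A = inf{t in I : X_t in A} for A = [10, infinity).
# Whether {T_A <= t} has occurred is decided by X_0, ..., X_t alone,
# which is exactly the stopping-time property.
hits = np.flatnonzero(X >= 10)
T_A = int(hits[0]) if hits.size else np.inf  # inf of the empty set is +infinity

# The stopped process X_{T_A ^ t} follows X and then freezes at time T_A;
# it is again adapted to the natural filtration of X.
if np.isfinite(T_A):
    X_stopped = X.copy()
    X_stopped[T_A:] = X[T_A]
    print("T_A =", T_A, " X at T_A =", X[T_A])
    print("stopped path constant after T_A:", np.all(X_stopped[T_A:] == X[T_A]))
else:
    print("the walk never entered A within the simulated horizon")
```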
A.3 Martingales

An important concept for stochastic processes is the notion of a fair game, called a martingale. Now let $E = \mathbb{R}$. An adapted stochastic process $M$ is called an $\mathcal{F}_t$-submartingale if $\mathbb{E}[|M_t|] < \infty$ for all $t \in I$ and

$$\mathbb{E}[M_{t+s} \mid \mathcal{F}_t] \ge M_t \quad \text{for all } t, s \in I\,. \tag{A.1}$$

If the inequality (A.1) is reversed, the process $M$ is an $\mathcal{F}_t$-supermartingale. A stochastic process that is both a sub- and a supermartingale is called a martingale.

If diffusion processes are involved, it often turns out that the notion of a martingale is too strong. For example, stochastic integrals with respect to Brownian motion are martingales if one stops them when reaching a finite level; such a stopped process is thus very close to a martingale. We therefore say that a stochastic process $M$ is an $\mathcal{F}_t$-local martingale if there exists a sequence of $\mathcal{F}_t$-stopping times $\{T_n : n \in \mathbb{N}\}$ with $\lim_{n\to\infty} T_n = \infty$ such that the stopped processes $\{M_{T_n \wedge t}\}$ are $\mathcal{F}_t$-martingales. Such a sequence of stopping times is called a localisation sequence. A localisation sequence can always be chosen to be increasing and such that $\{M_{T_n \wedge t}\}$ is a uniformly integrable martingale for each $n$.

The following results make martingales an important tool. The proofs can be found in [57] or [152].

Proposition A.1 (Convergence theorem). Let $M$ be a submartingale, and suppose that

$$\sup_{t \ge 0} \mathbb{E}[(M_t)^+] < \infty\,.$$

Then there exists a random variable $M_\infty$ such that $\lim_{t\to\infty} M_t = M_\infty$ and $\mathbb{E}[|M_\infty|] < \infty$.

It should be noted that one cannot necessarily interchange limit and integration. There are many examples where $M_0 = \lim_{t\to\infty} \mathbb{E}[M_t] \ne \mathbb{E}[M_\infty]$.

Proposition A.2 (Stopping theorem). Let $M$ be a submartingale, $t \ge 0$, and let $T_1, T_2$ be stopping times. Then

$$\mathbb{E}[M_{T_1 \wedge t} \mid \mathcal{F}_{T_2}] \ge M_{T_1 \wedge T_2 \wedge t}\,.$$

In particular, the stopped process $\{M_{T_1 \wedge t}\}$ is a submartingale.

When dealing with local martingales, the following result is often helpful.

Lemma A.3. Let $M$ be a positive local martingale. Then $M$ is a supermartingale.

Proof. Let $\{T_n\}$ be a localisation sequence. By Fatou's lemma,

$$\mathbb{E}[M_{t+s} \mid \mathcal{F}_t] = \mathbb{E}\Bigl[\lim_{n\to\infty} M_{T_n \wedge (t+s)} \Bigm| \mathcal{F}_t\Bigr] \le \lim_{n\to\infty} \mathbb{E}[M_{T_n \wedge (t+s)} \mid \mathcal{F}_t] = \lim_{n\to\infty} M_{T_n \wedge t} = M_t\,.$$

Thus, $M$ is a supermartingale. □

A.4 Poisson Processes

An increasing stochastic process $N$ with values in $\mathbb{N}$ and $N_0 = 0$ is called a point process. If the jumps are of size one only, i.e., $N_t - N_{t-} \in \{0, 1\}$ for all $t$, then we call the point process simple.

A special case of a simple point process is the homogeneous Poisson process. This process has some special properties that make it easy to handle. It can be defined via one of the following equivalent conditions; a proof can be found in [152].

Proposition A.4. Let $N$ be a point process with occurrence times $0 = T_0 < T_1 < \cdots$, and let $\lambda > 0$. The following conditions are equivalent:

i) $N$ has stationary and independent increments such that
$$\mathbb{P}[N_h = 0] = 1 - \lambda h + o(h)\,, \qquad \mathbb{P}[N_h = 1] = \lambda h + o(h)\,,$$
as $h \downarrow 0$.

ii) $N$ has independent increments, and $N_t$ is Poisson-distributed with parameter $\lambda t$ for each $t > 0$.

iii) The interarrival times $\{T_k - T_{k-1} : k \ge 1\}$ are independent and exponentially distributed with parameter $\lambda$.

iv) For each $t > 0$, $N_t$ is Poisson-distributed with parameter $\lambda t$, and given $\{N_t = n\}$ the occurrence points $T_1, \ldots, T_n$ have the same distribution as the order statistics of $n$ independent random variables uniformly distributed on $(0, t)$.

v) $N$ has independent increments, $\mathbb{E}[N_1] = \lambda$, and given $\{N_t = n\}$ the occurrence points $T_1, \ldots, T_n$ have the same distribution as the order statistics of $n$ independent random variables uniformly distributed on $(0, t)$.

vi) $N$ has stationary and independent increments such that $\mathbb{E}[N_1] = \lambda$ and $\mathbb{P}[N_h \ge 2] = o(h)$ as $h \downarrow 0$.

We call $\lambda$ the rate of the Poisson process.
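As a quick numerical illustration of the equivalence of conditions iii) and ii) in Proposition A.4 (an aside, not part of the book), the following sketch builds $N_t$ from iid exponential interarrival times and compares the empirical law of $N_t$ with the Poisson($\lambda t$) distribution; Python with numpy and the parameter values are illustrative assumptions.

```python
import math
import numpy as np

rng = np.random.default_rng(seed=2)
lam, t, n_paths = 2.0, 5.0, 50_000

# Condition iii): the interarrival times T_k - T_{k-1} are iid Exp(lambda).
# Draw enough interarrivals per path and count the arrivals up to time t.
n_max = 60                               # P[N_t >= 60] is negligible for lam*t = 10
gaps = rng.exponential(1.0 / lam, size=(n_paths, n_max))
arrivals = np.cumsum(gaps, axis=1)       # occurrence times T_1, ..., T_{n_max}
N_t = (arrivals <= t).sum(axis=1)        # N_t = #{k : T_k <= t}

# Condition ii) then says N_t ~ Poisson(lam * t): mean and variance both lam*t.
print("lambda * t     :", lam * t)
print("empirical mean :", N_t.mean())
print("empirical var  :", N_t.var())
k = 10
print(f"P[N_t = {k}], empirical vs Poisson:",
      (N_t == k).mean(), "vs",
      math.exp(-lam * t) * (lam * t) ** k / math.factorial(k))
```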
Let $\{Y_i\}$ be a sequence of iid random variables independent of $N$. The process $\{\sum_{i=1}^{N_t} Y_i\}$ is called a compound Poisson process. The Poisson process is just the special case with $Y_i = 1$. Like the Poisson process, the compound Poisson process has independent and stationary increments. This leads to the following martingales. If $\mathbb{E}[|Y_i|] < \infty$, then

$$\sum_{i=1}^{N_t} Y_i - \lambda \mathbb{E}[Y_i]\, t$$

is a martingale. If $\mathbb{E}[Y_i^2] < \infty$, then

$$\Bigl( \sum_{i=1}^{N_t} Y_i - \lambda \mathbb{E}[Y_i]\, t \Bigr)^2 - \lambda \mathbb{E}[Y_i^2]\, t$$

is a martingale. If the moment-generating function $\mathbb{E}[e^{r Y_i}]$ exists, then the process

$$\exp\Bigl\{ r \sum_{i=1}^{N_t} Y_i - \lambda \mathbb{E}[e^{r Y_i} - 1]\, t \Bigr\}$$

is a martingale. In each case the martingale property can easily be verified from the independent and stationary increments property.

A.5 Brownian Motion

A (cadlag) stochastic process $W$ is called a standard Brownian motion if it has independent increments, $W_0 = 0$, and $W_t$ is normally distributed with mean $0$ and variance $t$. Wiener [186] proved that Brownian motion exists. A process $\{X_t = m t + \sigma W_t\}$ is called an $(m, \sigma^2)$ Brownian motion. It turns out that a Brownian motion also has stationary increments and that its paths are continuous (a.s.). It is easy to verify that a standard Brownian motion is a martingale.

A Brownian motion fluctuates rapidly. If we let $T = \inf\{t : X_t < 0\}$, then $T = 0$ almost surely. The Brownian motion therefore reaches strictly positive and strictly negative values immediately. The reason is that Brownian motion has unbounded variation; see below. One can also show that the sample path of a Brownian motion is nowhere differentiable.

Brownian motion does, however, have finite quadratic variation. Namely, for $t > 0$,

$$\lim_{n\to\infty} \sum_{i=1}^n (W_{t_i} - W_{t_{i-1}})^2 = t\,,$$

where $0 = t_0 < t_1 < \cdots < t_n = t$ is a partition whose mesh $\sup_k (t_k - t_{k-1})$ converges to zero as $n \to \infty$, and the limit is taken in $L^2$. Indeed,

$$\mathbb{E}\Bigl[ \sum_{i=1}^n (W_{t_i} - W_{t_{i-1}})^2 \Bigr] = t$$

and

$$\mathrm{Var}\Bigl[ \sum_{i=1}^n (W_{t_i} - W_{t_{i-1}})^2 \Bigr] = \sum_{i=1}^n \mathrm{Var}\bigl[(W_{t_i} - W_{t_{i-1}})^2\bigr] = \sum_{i=1}^n 2 (t_i - t_{i-1})^2\,.$$

Thus, the variance converges to zero as $n \to \infty$.
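To see this convergence numerically, one can exploit that the increments $W_{t_i} - W_{t_{i-1}}$ over an equidistant partition are independent $N(0, t/n)$ variables. A minimal sketch (a numerical aside, not part of the book; Python with numpy and the chosen partition sizes are assumptions):

```python
import numpy as np

rng = np.random.default_rng(seed=3)
t = 1.0

# Sum of squared Brownian increments over equidistant partitions of [0, t].
# Each increment W_{t_i} - W_{t_{i-1}} is drawn as an independent N(0, t/n) variable.
for n in (10, 100, 1_000, 10_000, 100_000):
    dW = rng.normal(loc=0.0, scale=np.sqrt(t / n), size=n)
    qv = np.sum(dW ** 2)
    # E[qv] = t and Var[qv] = 2 t^2 / n, so qv -> t in L^2 as the mesh shrinks.
    print(f"n = {n:7d}   sum of squared increments = {qv:.5f}")
```

The printed sums concentrate around $t = 1$ at rate $\sqrt{2 t^2 / n}$, matching the variance computation above.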