Fundamental Inequalities, Convergence and the Optional Stopping Theorem for Continuous-Time Martingales

Prakash Balachandran
Department of Mathematics, Duke University

April 2, 2008

1 Review of Discrete-Time Martingales

We assume the reader is familiar with the basic results of measure-theoretic probability, particularly those concerning discrete-time martingales. Here, we briefly review the major theorems and definitions. For more details, see Patrick Billingsley’s Probability and Measure or Richard Durrett’s Probability: Theory and Examples.

Throughout, (Ω, F,P ) is a probability space, and B(X) denotes the Borel σ-algebra generated by a topological space X.

Definition 1 A filtration is a collection of σ-algebras {F_n}_{n=1}^∞ such that F_n ⊆ F_{n+1}.

Definition 2 Let {F_n}_{n=1}^∞ be a filtration. A stopping time T is a random variable T : (Ω, F) → (N ∪ {∞}, B(N ∪ {∞})) such that {ω ∈ Ω : T(ω) = n} ∈ F_n for all n < ∞.

Definition 3 Let I be an index set, and {Xi}i∈I be a collection of random variables. The collection

{X_i}_{i∈I} is said to be uniformly integrable if

lim_{M→∞} sup_{i∈I} E[|X_i| ; |X_i| > M] = 0.

Definition 4 If f, g are random variables, then f ∧ g is the random variable defined by (f ∧ g)(ω) = min{f(ω), g(ω)}.

Definition 5 If T is a stopping time and X is a random variable, then XT is the random variable defined by XT (ω) = XT (ω)(ω).

Definition 6 A collection of random variables {X_n}_{n=0}^∞ is called a discrete-time martingale (resp. discrete-time submartingale) if

1. There exists a filtration {F_n}_{n=0}^∞.

2. X_n is F_n-adapted; i.e. X_n is F_n-measurable.

3. X_n ∈ L^1(Ω, F, P) for all n ≥ 0.

4. E[X_{n+1} | F_n] = X_n (resp. E[X_{n+1} | F_n] ≥ X_n) for all n ≥ 0.

Now we need to review the notion of an upcrossing. Let [α, β] be an interval (α < β), and let X1,...,Xn be random variables. Inductively define τ1, τ2,... by:

τ1(ω) is the smallest j such that 1 ≤ j ≤ n and Xj(ω) ≤ α, and is n if there is no such j.

τk(ω) for even k is the smallest j such that τk−1(ω) < j ≤ n and Xj(ω) ≥ β and is n if there is no such j.

τk(ω) for odd k exceeding 1 is the smallest j such that τk−1(ω) < j ≤ n and Xj(ω) ≤ α and is n if there is no such j.

The number U_n(ω) of upcrossings of [α, β] by X_1, ..., X_n is the largest j such that X_{τ_{2j−1}}(ω) ≤ α < β ≤ X_{τ_{2j}}(ω).
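The τ_k recursion above is effectively an algorithm, and it may help to see it run. The following Python sketch is our own illustration (the function name and example path are hypothetical, not from the text); it counts completed upcrossings of [α, β] by a finite path:

```python
# Illustrative sketch: count the upcrossings of [alpha, beta] by a
# finite path X_1, ..., X_n, following the tau_k recursion above.

def upcrossings(path, alpha, beta):
    """Largest j with X_{tau_{2j-1}} <= alpha < beta <= X_{tau_{2j}}."""
    count = 0
    below = False            # True after an odd tau_k (path has hit <= alpha)
    for x in path:
        if not below and x <= alpha:
            below = True     # odd tau_k: first later time at or below alpha
        elif below and x >= beta:
            count += 1       # even tau_k: first later time at or above beta
            below = False
    return count

print(upcrossings([3, 0, 2, 5, 1, 6], alpha=1, beta=4))  # prints 2
```

The path drops to 0, rises to 5, drops to 1, and rises to 6, completing two upcrossings of [1, 4].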

Now, recall the basic results from discrete time martingales:

Theorem 1 (Doob’s Inequality): If {X_n}_{n=0}^∞ is a submartingale and λ > 0, then

P( ω ∈ Ω : max_{0≤m≤n} X_m(ω) ≥ λ ) ≤ E[X_n^+] / λ.

Theorem 2 (The Upcrossing Inequality): If {X_m}_{m=0}^∞ is a submartingale, then the number of upcrossings U_n of [a, b] satisfies

E[U_n] ≤ ( E[(X_n − a)^+] − E[(X_0 − a)^+] ) / (b − a).

Theorem 3 (Doob’s Maximal Inequality): If {X_n}_{n=0}^∞ is a submartingale, then for 1 < p < ∞,

E[ ( max_{0≤m≤n} X_m^+ )^p ] ≤ ( p/(p−1) )^p E[(X_n^+)^p].

Theorem 4 (The Martingale Convergence Theorem): If {X_n}_{n=0}^∞ is a submartingale such that sup_n E[X_n^+] < ∞, then as n → ∞, X_n converges a.s. to a limit X with E[|X|] < ∞.

Theorem 5 (The Optional Stopping Theorem): If L ≤ M are stopping times and {Y_{M∧n}}_{n=0}^∞ is a uniformly integrable submartingale, then E[Y_L] ≤ E[Y_M] and

Y_L ≤ E[Y_M | F_L],

where

F_L = {A ∈ F : A ∩ {ω ∈ Ω : L(ω) ≤ k} ∈ F_k, 1 ≤ k < ∞}.
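Theorem 1 is easy to test numerically. The sketch below is a Monte Carlo illustration of ours (not from the text; the walk length, threshold, and sample count are arbitrary choices): it checks Doob’s inequality for the submartingale |S_m|, where S_m is a simple symmetric random walk.

```python
# Monte Carlo check of Doob's inequality for the submartingale |S_m|,
# where S_m is a simple symmetric random walk.  All parameters are
# illustrative choices.
import random

random.seed(0)
n, lam, trials = 100, 25, 2000

exceed = 0          # counts paths with max_{0<=m<=n} |S_m| >= lam
abs_end = 0.0       # accumulates |S_n| to estimate E[|S_n|]
for _ in range(trials):
    s, running_max = 0, 0
    for _ in range(n):
        s += random.choice((-1, 1))
        running_max = max(running_max, abs(s))
    exceed += (running_max >= lam)
    abs_end += abs(s)

empirical = exceed / trials       # estimate of P(max |S_m| >= lam)
bound = abs_end / trials / lam    # estimate of E[|S_n|] / lam
print(empirical <= bound)         # the bound comfortably holds here
```

With these parameters the bound E[|S_n|]/λ ≈ 0.3 is far above the tiny exceedance probability, so the inequality is slack but clearly visible.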

2 Continuous-Time Martingales

We would like to extend the definitions and theorems of Section 1 to analogous definitions and theorems for continuous-time martingales.

Definition 7 A filtration is a collection of σ-algebras, {Ft}t≥0 such that Fs ⊆ Ft for s < t.

We recall the interpretation of σ-algebras as containing information: F_t is the information available to the adapted process {X_t}_{t≥0} up to (and including) time t. Thus, it is reasonable to interpret

F_{t−} = σ( ∪_{s<t} F_s ), t > 0,

as the σ-field of events strictly prior to t, and

F_{t+} = ∩_{ε>0} F_{t+ε}

as the σ-field of events immediately after t ≥ 0. We take the convention F_{0−} = F_0 and say that the filtration {F_t}_{t≥0} is right- (resp. left-) continuous if F_t = F_{t+} (resp. F_t = F_{t−}) holds for every t ≥ 0.

Definition 8 Let {F_t}_{t≥0} be a filtration. A stopping time T is a random variable T : (Ω, F) → ([0, ∞], B([0, ∞])) such that {ω ∈ Ω : T(ω) ≤ t} ∈ F_t for every t ≥ 0. A random variable T : (Ω, F) → ([0, ∞], B([0, ∞])) is called an optional time of the filtration if {ω ∈ Ω : T(ω) < t} ∈ F_t for every t ≥ 0.

We have the following immediate result:

Proposition 1 Every stopping time is optional, and the two concepts coincide if the filtration is right-continuous.

Proof:

If T is a stopping time, then [T ≤ t − 1/n] ∈ F_{t−1/n} ⊆ F_t for every integer n ≥ 1. So:

[T < t] = ∪_{n=1}^∞ [T ≤ t − 1/n] ∈ F_t.

Now, suppose that T is an optional time, and {Ft}t≥0 is a right continuous filtration.

Fix ε > 0, and notice that for rational η < ε, F_{t+η} ⊆ F_{t+ε}.

Since T is an optional time, [T < t + η] ∈ F_{t+η} ⊆ F_{t+ε}, so that

[T ≤ t] = ∩_{0<η<ε, η rational} [T < t + η] ∈ F_{t+ε}.

Since ε > 0 was arbitrary, this implies that [T ≤ t] ∈ F_{t+ε} for every ε > 0, so that

[T ≤ t] ∈ ∩_{ε>0} F_{t+ε} = F_{t+} = F_t

since the filtration is right continuous. □

Definition 9 Let I be an index set, and {Xi}i∈I be a collection of random variables. The collection

{X_i}_{i∈I} is said to be uniformly integrable if

lim_{M→∞} sup_{i∈I} E[|X_i| ; |X_i| > M] = 0.

Definition 10 If f, g are random variables, then f ∧ g is the random variable defined by (f ∧ g)(ω) = min{f(ω), g(ω)}.

Definition 11 If T is a stopping time and {X_t}_{t≥0} is a stochastic process, then X_T is the random variable defined by X_T(ω) = X_{T(ω)}(ω).

Definition 12 Let T be a stopping time of the filtration {F_t}_{t≥0}. The σ-algebra F_T of events determined prior to the stopping time T consists of those events A ∈ F for which

A ∩ {ω ∈ Ω : T(ω) ≤ t} ∈ F_t for every t ≥ 0.

Clearly, if {X_t}_{t≥0} is adapted to {F_t}_{t≥0}, then X_T is F_T-measurable.

Definition 13 Let T be an optional time of the filtration {Ft}t≥0. The σ-field FT + of events determined immediately after the optional time T consists of those events A ∈ F for which A ∩ {T ≤ t} ∈ Ft+ for every t ≥ 0.

Definition 14 A collection of random variables {X_t}_{t≥0} is called a continuous-time martingale (resp. continuous-time submartingale) if

1. There exists a filtration {F_t}_{t≥0}.

2. X_t is F_t-adapted; i.e. X_t is F_t-measurable.

3. X_t ∈ L^1(Ω, F, P) for each t ≥ 0.

4. E[X_s | F_t] = X_t (resp. E[X_s | F_t] ≥ X_t) for 0 ≤ t < s < ∞.

Now we need to define the notion of an upcrossing.

Let {Xt}t≥0 be a real-valued stochastic process. Consider two numbers α < β, and a finite subset F of [0, ∞). We define the number of upcrossings UF (α, β; X(ω)) of the interval [α, β] by the restricted sample path {Xt(ω); t ∈ F } as follows. Set

τ1(ω) = min{t ∈ F ; Xt(ω) ≤ α}

and define recursively for j = 1, 2, ...

σj(ω) = min{t ∈ F ; t ≥ τj(ω),Xt(ω) > β}

τj+1(ω) = min{t ∈ F ; t ≥ σj(ω),Xt(ω) < α}.

We define the minimum of the empty set to be ∞ and let U_F(α, β; X(ω)) be the largest integer j such that σ_j(ω) < ∞. If I ⊆ [0, ∞) is not necessarily finite, we define

UI (α, β; X(ω)) = sup{UF (α, β; X(ω)); F ⊆ I, F finite}.
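Since U_I is a supremum over finite subsets F ⊆ I, it can be approximated by refining F. The sketch below is our own illustration (the function name and the test path x(t) = sin(t) on I = [0, 14] are hypothetical); it implements the σ_j/τ_j recursion for a finite F and shows that the counts are monotone under refinement of the grid:

```python
# Illustrative sketch: U_F(alpha, beta; X) for a finite time set F,
# following the sigma_j / tau_j recursion above, then U_I approximated
# by refining grids F.  The path x(t) = sin(t) is a hypothetical example.
import math

def upcrossings_on(x, F, alpha, beta):
    """Number of indices j with sigma_j < infinity for the restricted path."""
    count, seeking_beta, first = 0, False, True
    for t in sorted(F):
        v = x(t)
        if not seeking_beta:
            # tau_1 requires x(t) <= alpha; later tau_{j+1} require x(t) < alpha
            if (v <= alpha) if first else (v < alpha):
                seeking_beta, first = True, False
        elif v > beta:       # sigma_j: the upcrossing completes
            count += 1
            seeking_beta = False
    return count

counts = [upcrossings_on(math.sin, [14 * j / N for j in range(N + 1)],
                         -0.5, 0.5)
          for N in (3, 30, 300)]
print(counts)   # nondecreasing: refining F can only reveal more upcrossings
```

The coarse 4-point grid misses one excursion of the sine path, while the finer grids find both upcrossings of [−0.5, 0.5] on [0, 14].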

Theorem 6 Let {X_t}_{t≥0} be a submartingale w.r.t. {F_t}_{t≥0}, every path of which is right continuous. Let [σ, τ] be a subinterval of [0, ∞), and let α < β and λ > 0 be real numbers. We have the following results:

1. (Doob’s Inequality):

P( ω ∈ Ω : sup_{σ≤t≤τ} X_t(ω) ≥ λ ) ≤ E[X_τ^+] / λ.

2. (The Upcrossing Inequality):

E[U_{[σ,τ]}(α, β; X(ω))] ≤ ( E[X_τ^+] + |α| ) / (β − α).

3. (Doob’s Maximal Inequality):

E[ ( sup_{σ≤t≤τ} X_t )^p ] ≤ ( p/(p−1) )^p E[X_τ^p]

for p > 1, provided X_t ≥ 0 a.s. for every t ≥ 0 and X_τ ∈ L^p(Ω, F, P).

Proof:

1) Consider an enumeration t_0 = σ, t_1, t_2, ... of the countable set {σ, τ} ∪ ([σ, τ] ∩ Q), and define F_j = {t_0, ..., t_j} ∪ {τ}. Clearly, {F_j}_{j=0}^∞ is an increasing sequence of finite sets such that

F̃ = ∪_{j=0}^∞ F_j = {σ, τ} ∪ ([σ, τ] ∩ Q).

Let A_j = { ω ∈ Ω : sup_{t∈F_j} X_t(ω) ≥ λ }. Then,

F_j ⊆ F_{j+1} ⇒ sup_{t∈F_j} X_t(ω) ≤ sup_{t∈F_{j+1}} X_t(ω) ⇒ A_j ⊆ A_{j+1}.

Using continuity from below on

A_∞ = { ω ∈ Ω : sup_{t∈F̃} X_t(ω) ≥ λ } = ∪_{j=0}^∞ A_j

(note that {Aj} are measurable) we have

P(A_∞) = lim_{j→∞} P(A_j).

By the discrete-time Doob inequality, P(A_j) ≤ E[X^+_{max F_j}] / λ = E[X_τ^+] / λ, where max F_j denotes the largest element of F_j, which is τ by construction of {F_j}_{j=0}^∞. So,

P(A_∞) = lim_{j→∞} P(A_j) ≤ E[X_τ^+] / λ.

Claim 1 sup_{t∈F̃} X_t(ω) = sup_{t∈[σ,τ]} X_t(ω).

Proof of Claim 1:

Since F̃ ⊆ [σ, τ], sup_{t∈F̃} X_t(ω) ≤ sup_{t∈[σ,τ]} X_t(ω).

On the other hand, since {X_t}_{t≥0} is right continuous, for each t ∈ [σ, τ] we have X_t(ω) = lim_{j→∞} X_{q_j}(ω), where {q_j}_{j=1}^∞ is a sequence in F̃ such that q_j ≥ t and lim_{j→∞} q_j = t. So, since X_{q_j}(ω) ≤ sup_{t∈F̃} X_t(ω),

X_t(ω) ≤ sup_{t∈F̃} X_t(ω) ⇒ sup_{t∈[σ,τ]} X_t(ω) ≤ sup_{t∈F̃} X_t(ω).

Thus, sup_{t∈F̃} X_t(ω) = sup_{t∈[σ,τ]} X_t(ω). □

By Claim 1, we have that

A_∞ = { ω ∈ Ω : sup_{t∈F̃} X_t(ω) ≥ λ } = { ω ∈ Ω : sup_{t∈[σ,τ]} X_t(ω) ≥ λ },

so that

P( ω ∈ Ω : sup_{σ≤t≤τ} X_t(ω) ≥ λ ) = P(A_∞) ≤ E[X_τ^+] / λ.

2) Let {F_j}_{j=0}^∞ be as in (1).

Claim 2 U_{[σ,τ]}(α, β; X(ω)) = sup_{j≥0} U_{F_j}(α, β; X(ω)).

Proof of Claim 2: Since each member of {F_j}_{j=0}^∞ is a finite subset of [σ, τ], we have by the definition of U_{[σ,τ]},

U_{[σ,τ]}(α, β; X(ω)) ≥ sup_{j≥0} U_{F_j}(α, β; X(ω)).

On the other hand, if F ⊆ [σ, τ] is finite and x ∈ F is its largest element, there exists F_k such that x ≤ t, where t is the largest element of F_k. Since U_{[σ,t]} is increasing in t, we have U_F ≤ U_{F_k}, so that

U_{[σ,τ]}(α, β; X(ω)) ≤ sup_{j≥0} U_{F_j}(α, β; X(ω)).

Thus, U_{[σ,τ]}(α, β; X(ω)) = sup_{j≥0} U_{F_j}(α, β; X(ω)). □

The reason we needed to prove this was that in general U[σ,τ] is not the supremum over a countable collection of functions, since there are uncountably many F ⊂ [σ, τ] such that F is finite; thus, U[σ,τ] need not in general be measurable.

But, by Claim 2, U_{F_j} ↑ U_{[σ,τ]} since F_j ⊆ F_{j+1}. Thus, U_{[σ,τ]} is measurable, and since

E[U_{F_j}(α, β; X(ω))] ≤ ( E[X^+_{max F_j}] + |α| ) / (β − α) = ( E[X_τ^+] + |α| ) / (β − α)

for the finite sets F_j by the discrete-time upcrossing inequality, we must have

E[U_{[σ,τ]}(α, β; X(ω))] ≤ ( E[X_τ^+] + |α| ) / (β − α)

by the monotone convergence theorem. In particular, this shows that U_{[σ,τ]} ∈ L^1(Ω, F, P), since {X_t}_{t≥0} a submartingale implies E[X_τ^+] ≤ E[|X_τ|] < ∞, and U_{[σ,τ]} ≥ 0 implies U_{[σ,τ]} = |U_{[σ,τ]}|. Thus, U_{[σ,τ]} is finite a.e.

3) Let {F_j}_{j=0}^∞ and F̃ be as in (1).

Claim 3 sup_{σ≤t≤τ} X_t = sup_{j≥0} ( sup_{t∈F_j} X_t ).

Proof of Claim 3:

Since F_j ⊆ [σ, τ] for all j ≥ 0, we must have

sup_{t∈F_j} X_t ≤ sup_{σ≤t≤τ} X_t ∀ j ≥ 0 ⇒ sup_{j≥0} ( sup_{t∈F_j} X_t ) ≤ sup_{σ≤t≤τ} X_t.

Conversely, consider X_t. Since {X_t}_{t≥0} is right continuous, X_t(ω) = lim_{j→∞} X_{q_j}(ω), where {q_j}_{j=1}^∞ is a sequence in F̃ such that lim_{j→∞} q_j = t and q_j ≥ t for all j ≥ 1.

Now, if G_j is a set in {F_k}_{k=0}^∞ such that q_j ∈ G_j, then

X_{q_j} ≤ sup_{t∈G_j} X_t ≤ sup_{k≥0} ( sup_{t∈F_k} X_t ) ∀ j ≥ 1 ⇒ X_t = lim_{j→∞} X_{q_j} ≤ sup_{j≥0} ( sup_{t∈F_j} X_t )

⇒ sup_{σ≤t≤τ} X_t ≤ sup_{j≥0} ( sup_{t∈F_j} X_t ).

Thus, sup_{σ≤t≤τ} X_t = sup_{j≥0} ( sup_{t∈F_j} X_t ). □

Since Y_j = sup_{t∈F_j} X_t is increasing (recall F_j ⊆ F_{j+1}), we have from Claim 3 that Y_j ↑ sup_{σ≤t≤τ} X_t.

Thus, sup_{σ≤t≤τ} X_t is measurable, since each Y_j is obviously measurable.

So, the discrete-time Doob maximal inequality,

E[Y_j^p] = E[ ( max_{t∈F_j} X_t )^p ] ≤ ( p/(p−1) )^p E[(X^+_{max F_j})^p] = ( p/(p−1) )^p E[X_τ^p],

implies

E[ ( sup_{σ≤t≤τ} X_t )^p ] ≤ ( p/(p−1) )^p E[X_τ^p]

by the monotone convergence theorem, since X_t ≥ 0 a.s. for every t ≥ 0 and X_τ ∈ L^p(Ω, F, P). □

Theorem 7 (The Martingale Convergence Theorem): Let {X_t}_{t≥0} be a right-continuous submartingale with respect to {F_t}_{t≥0}. If C := sup_{t≥0} E[X_t^+] < ∞, then X_∞(ω) := lim_{t→∞} X_t(ω) exists for a.e. ω ∈ Ω, and X_∞ ∈ L^1(Ω, F, P).

Proof:

From Theorem 6(2), we have for any n ≥ 1 and real numbers α < β:

E[U_{[0,n]}(α, β; X(ω))] ≤ ( E[X_n^+] + |α| ) / (β − α).

By hypothesis, E[X_n^+] ≤ sup_{t≥0} E[X_t^+] = C < ∞, so that the above is bounded for all n ≥ 1.

So, since U[0,n] ≤ U[0,n+1], the pointwise limit limn→∞ U[0,n] = U[0,∞) exists, so that by the monotone convergence theorem and the above,

E[|U_{[0,∞)}(α, β; X(ω))|] = E[U_{[0,∞)}(α, β; X(ω))] = lim_{n→∞} E[U_{[0,n]}(α, β; X(ω))] ≤ (C + |α|) / (β − α) < ∞,

since U_{[0,n]} ≥ 0 for all n ≥ 1 implies U_{[0,∞)} ≥ 0. Thus, U_{[0,∞)}(α, β; X(ω)) ∈ L^1(Ω, F, P), and so U_{[0,∞)}(α, β; X(ω)) is finite a.e.

Now, the events A_{α,β} = {ω ∈ Ω : U_{[0,∞)}(α, β; X(ω)) = ∞}, −∞ < α < β < ∞, must have measure zero, since α, β above were arbitrary and U_{[0,∞)}(α, β; X(ω)) is finite a.e. Thus, the event

A = ∪_{α<β, α,β∈Q} A_{α,β}

must also have measure zero.

Claim 4 Let X* and X_* denote the limit superior and limit inferior of X_t as t → ∞. If

E = {ω ∈ Ω : X*(ω) > X_*(ω)},

then E is a measurable subset of A.

Proof of Claim 4:

If ω ∈ E, then

lim inf_{t→∞} X_t(ω) < α < β < lim sup_{t→∞} X_t(ω)

for some rational α < β. So:

β < inf_{t≥0} sup_{s≥t} X_s(ω) ⇒ β < sup_{s≥t} X_s(ω) for all t ≥ 0,

and

sup_{t≥0} inf_{s≥t} X_s(ω) < α ⇒ inf_{s≥t} X_s(ω) < α for all t ≥ 0.

Now, in the notation of the above discussion of upcrossings in the continuous-time case, let τ_1(ω) be given.

With t = τ_1(ω), β < sup_{s≥τ_1(ω)} X_s(ω) implies that we can choose s_1 > τ_1(ω) such that

0 < sup_{s≥τ_1(ω)} X_s(ω) − X_{s_1}(ω) < sup_{s≥τ_1(ω)} X_s(ω) − β ⇒ β < X_{s_1}(ω) ≤ sup_{s≥τ_1(ω)} X_s(ω).

So, either σ1(ω) = s1 or σ1(ω) < s1. Thus, σ1(ω) ≤ s1, so that there is at least one upcrossing on

[0, s1].

Now, take t = s1. Then, infs≥s1 Xs(ω) < α implies that we can choose s2 > s1 such that

0 < X_{s_2}(ω) − inf_{s≥s_1} X_s(ω) < α − inf_{s≥s_1} X_s(ω) ⇒ inf_{s≥s_1} X_s(ω) < X_{s_2}(ω) < α.

So, either τ2(ω) = s2 or τ2(ω) < s2. Thus, τ2(ω) ≤ s2, so that there is at least one downcrossing on

[s1, s2].

We can repeat the above procedure with t = τ_2(ω) and so on, to obtain a sequence τ_1(ω) = s_0 < s_1 < s_2 < ··· such that there is at least one upcrossing on [s_{2k}, s_{2k+1}] for k = 0, 1, .... Taking H_j = {s_0, ..., s_j}, we have that |H_j| < ∞ for all j > 0 and U_{H_j}(α, β; X(ω)) → ∞ as j → ∞. Since U_{[0,∞)}(α, β; X(ω)) is by definition the supremum of such counts over all finite subsets of [0, ∞), this implies that U_{[0,∞)}(α, β; X(ω)) = ∞, so that ω ∈ A_{α,β} ⊆ A. Since ω ∈ E was arbitrary, E ⊆ A.

Clearly, by the definition of E, it is measurable, so that the result follows. □

Since a measurable subset of a set of measure zero has measure zero, P (E) = 0 by Claim 4.

Thus, for almost every ω ∈ Ω, X∞(ω) = limt→∞ Xt(ω) exists. Since {Xt}t≥0 is a submartingale,

E[|X_t|] = 2E[X_t^+] − E[X_t] ≤ 2C − E[X_0] < ∞,

so that sup_{t≥0} E[X_t^+] < ∞ is equivalent to the apparently stronger assumption sup_{t≥0} E[|X_t|] < ∞.

Thus, by Fatou’s lemma:

E[|X_∞|] ≤ lim inf_{n→∞} E[|X_n|] ≤ sup_{t≥0} E[|X_t|] < ∞ ⇒ X_∞ ∈ L^1(Ω, F, P). □
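A concrete instance of this convergence phenomenon (in its discrete-time form): in a Polya urn, the fraction of red balls is a martingale with values in [0, 1], so the sup condition holds trivially and the fraction converges a.s. The simulation below is our own illustrative sketch, not an example from the text:

```python
# Illustrative sketch: the fraction of red balls in a Polya urn is a
# bounded martingale, so by martingale convergence it settles a.s.
import random

random.seed(1)
red, total = 1, 2          # start with one red and one black ball
fractions = []
for _ in range(10_000):
    if random.random() < red / total:
        red += 1           # drew red: replace it and add another red
    total += 1             # either way the urn grows by one ball
    fractions.append(red / total)

# One-step moves are at most 1/total, so late fluctuations are tiny.
late = fractions[-1000:]
print(max(late) - min(late) < 0.05)   # the late path is nearly flat
```

Different seeds settle to different limits, which matches the theorem: the a.s. limit X_∞ is itself random (for the Polya urn it is known to be Beta-distributed, though the simulation does not show that).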

Theorem 8 (The Optional Stopping Theorem): Let {X_t}_{t≥0} be a right-continuous submartingale with respect to {F_t}_{t≥0} with last element X_∞, and let S ≤ T be two optional times of the filtration {F_t}. We have

E[X_T | F_{S+}] ≥ X_S a.s.

If S is a stopping time, then F_S can replace F_{S+} above. In particular, E[X_T] ≥ E[X_0], and for a martingale with a last element, we have E[X_T] = E[X_0].

Proof:

Consider the sequences

S_n(ω) = S(ω) if S(ω) = +∞, and S_n(ω) = k/2^n if (k−1)/2^n ≤ S(ω) < k/2^n,

and

T_n(ω) = T(ω) if T(ω) = +∞, and T_n(ω) = k/2^n if (k−1)/2^n ≤ T(ω) < k/2^n,

defined for n ≥ 1. Clearly, these are random variables taking values in [0, ∞].

Claim 5 {S_n}_{n=1}^∞ and {T_n}_{n=1}^∞ are stopping times, and S_n ≤ T_n for all n. Furthermore, each sequence is decreasing, with lim_{n→∞} S_n = S and lim_{n→∞} T_n = T.

Proof of Claim 5:

Let n ≥ 1 be a fixed integer, and suppose t ≥ 0 is given. Then,

{ω ∈ Ω : S_n(ω) ≤ t} = ∪_{k=1}^u { ω ∈ Ω : (k−1)/2^n ≤ S(ω) < k/2^n } = { ω ∈ Ω : 0 ≤ S(ω) < u/2^n } ∈ F_{u/2^n} ⊆ F_t,

where u denotes the greatest integer such that u ≤ t 2^n.

Since t and n were arbitrary, this shows that the {S_n}_{n=1}^∞ are stopping times.

Clearly, the same argument holds for {T_n}_{n=1}^∞, so that the {T_n}_{n=1}^∞ are stopping times as well.

Now, suppose that S_n(ω) > T_n(ω) for some ω ∈ Ω; without loss of generality, we may assume that neither takes the value ∞. Then, if S_n(ω) = k/2^n and T_n(ω) = k′/2^n, we must have k/2^n > k′/2^n, or in other words k > k′.

But since

ω ∈ { ω ∈ Ω : (k−1)/2^n ≤ S(ω) < k/2^n }

and

ω ∈ { ω ∈ Ω : (k′−1)/2^n ≤ T(ω) < k′/2^n },

this implies that

(k′−1)/2^n ≤ T(ω) < k′/2^n ≤ (k−1)/2^n ≤ S(ω) < k/2^n,

which implies that T(ω) < S(ω), a contradiction.

Thus, S_n(ω) ≤ T_n(ω); since ω ∈ Ω was arbitrary, S_n ≤ T_n, and since n ≥ 1 was arbitrary, this holds for all n ≥ 1.

It’s clear by the construction of {Sn} and {Tn} that the sequences are decreasing.

To show the final statement, note that there’s nothing to show for T (ω) = ∞.

For T(ω) < ∞, we have by construction of {T_n} that

(k−1)/2^n ≤ T(ω) < k/2^n = T_n(ω) ⇒ T_n(ω) − T(ω) ≤ 1/2^n ⇒ lim_{n→∞} T_n(ω) = T(ω).

Clearly, the same holds for {S_n}. □
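The dyadic approximation in Claim 5 is concrete enough to compute. For a finite value S(ω) = s, the condition (k−1)/2^n ≤ s < k/2^n gives S_n = (⌊2^n s⌋ + 1)/2^n. A small sketch of ours (the function name and sample value are illustrative):

```python
# The dyadic approximants S_n of Claim 5: S_n = (floor(2^n s) + 1)/2^n
# for a finite value s = S(omega).  The value s = 0.3 is illustrative.
import math

def dyadic_upper(s, n):
    """n-th dyadic approximant of the stopping-time value s."""
    if math.isinf(s):
        return s               # S_n(omega) = S(omega) on {S = +infinity}
    return (math.floor(s * 2 ** n) + 1) / 2 ** n

s = 0.3
approx = [dyadic_upper(s, n) for n in range(1, 12)]
print(approx[:4])   # [0.5, 0.5, 0.375, 0.3125]: strictly above s,
                    # nonincreasing, and within 2**-n of s
```

This makes the three properties in Claim 5 visible at a glance: each S_n exceeds s, the sequence decreases, and S_n − s ≤ 2^{−n} forces convergence to s.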

So, by Claim 5, S_n and T_n are stopping times with S_n ≤ T_n. Furthermore, each S_n and T_n takes on countably many values, so that by the discrete-time optional sampling theorem,

X_{S_n} ≤ E[X_{T_n} | F_{S_n}] ⇔ ∫_A X_{S_n} dP ≤ ∫_A X_{T_n} dP

for every A ∈ F_{S_n}.

Now, it’s not too hard to show

Claim 6 F_{S+} = ∩_{n=1}^∞ F_{S_n} ⊆ F_{S_n}.

Thus,

∫_A X_{S_n} dP ≤ ∫_A X_{T_n} dP

for every A ∈ F_{S+}.

Now assume that S is a stopping time. By construction of {S_n}, S ≤ S_n for all n. Let A ∈ F_S. Then

A ∩ {S ≤ t} ∈ F_t for all t ≥ 0,

and hence

A ∩ {S_n ≤ t} = A ∩ {S_n ≤ t} ∩ {S ≤ t} ∈ F_t for all t ≥ 0 and all n ≥ 1.

Thus, F_S ⊆ F_{S_n} for each n ≥ 1, so that F_S ⊆ F_{S+} by Claim 6. So,

∫_A X_{S_n} dP ≤ ∫_A X_{T_n} dP

for every A ∈ F_S.

Claim 7 Let {F_n}_{n=1}^∞ be a decreasing sequence of sub-σ-fields of F, and let {X_n} be a backwards submartingale w.r.t. {F_n} [that is, E[|X_n|] < ∞, X_n is F_n-measurable, and E[X_n | F_{n+1}] ≥ X_{n+1} for every n ≥ 1]. Then l := lim_{n→∞} E[X_n] > −∞ implies that {X_n}_{n=1}^∞ is uniformly integrable.

By construction, {X_{S_n}} is a backwards submartingale w.r.t. {F_{S_n}}. Thus, {E[X_{S_n}]} is decreasing, and bounded below by E[X_0]. Thus, by Claim 7, {X_{S_n}} is uniformly integrable.

Similarly, the same is true of {X_{T_n}}.

Since the process {X_t}_{t≥0} is right-continuous, Claim 5 implies

lim_{n→∞} X_{T_n}(ω) = X_T(ω), lim_{n→∞} X_{S_n}(ω) = X_S(ω)

for a.e. ω ∈ Ω. [Note: we’re assuming that S, T < ∞ a.e.]

By uniform integrability, it follows that X_T and X_S are integrable, and that the above a.e. convergence is actually L^1 convergence. Thus,

∫_A X_S dP ≤ ∫_A X_T dP

for every A ∈ F_{S+}, and for every A ∈ F_S when S is a stopping time. □
