Elect. Comm. in Probab. 15 (2010), 99–105 ELECTRONIC COMMUNICATIONS in PROBABILITY

THE MEASURABILITY OF HITTING TIMES

RICHARD F. BASS^1
Department of Mathematics, University of Connecticut, Storrs, CT 06269-3009
email: [email protected]
Submitted January 18, 2010, accepted in final form March 8, 2010
AMS 2000 Subject classification: Primary: 60G07; Secondary: 60G05, 60G40
Keywords: Stopping time, hitting time, progressively measurable, optional, predictable, debut theorem, section theorem

Abstract

Under very general conditions the hitting time of a set by a stochastic process is a stopping time. We give a new simple proof of this fact. The section theorems for optional and predictable sets are easy corollaries of the proof.

1 Introduction

A fundamental theorem in the foundations of stochastic processes is the one that says that, under very general conditions, the first time a stochastic process enters a set is a stopping time. The classical proof uses capacities, analytic sets, and Choquet's capacitability theorem, and is considered hard. To the best of our knowledge, no more than a handful of books have an exposition that starts with the definition of capacity and proceeds to the hitting time theorem. (One that does is [1].)

The purpose of this paper is to give a short and elementary proof of this theorem. The proof is simple enough that it could easily be included in a first year graduate course in probability.

In Section 2 we give a proof of the debut theorem, from which the measurability theorem follows. As easy corollaries we obtain the section theorems for optional and predictable sets. This argument is given in Section 3.

2 The debut theorem

Suppose $(\Omega, \mathcal{F}, P)$ is a probability space. The outer probability $P^*$ associated with $P$ is given by

$$P^*(A) = \inf\{P(B) : A \subset B,\ B \in \mathcal{F}\}.$$

A set $A$ is a $P$-null set if $P^*(A) = 0$. Suppose $\{\mathcal{F}_t\}$ is a filtration satisfying the usual conditions: $\bigcap_{\varepsilon > 0} \mathcal{F}_{t+\varepsilon} = \mathcal{F}_t$ for all $t \ge 0$, and each $\mathcal{F}_t$ contains every $P$-null set. Let $\pi : [0,\infty) \times \Omega \to \Omega$ be defined by $\pi(t, \omega) = \omega$.

^1 Research partially supported by NSF grant DMS-0901505.


Recall that a random variable $T$ taking values in $[0,\infty]$ is a stopping time if $(T \le t) \in \mathcal{F}_t$ for all $t$; we allow our stopping times to take the value infinity. Since the filtration satisfies the usual conditions, $T$ will be a stopping time if $(T < t) \in \mathcal{F}_t$ for all $t$. If $\{T_i\}$ is a finite or countable collection of stopping times, then $\sup_i T_i$ and $\inf_i T_i$ are also stopping times.

Given a topological space $\mathcal{S}$, the Borel $\sigma$-field is the one generated by the open sets. Let $\mathcal{B}[0,t]$ denote the Borel $\sigma$-field on $[0,t]$ and $\mathcal{B}[0,t] \times \mathcal{F}_t$ the product $\sigma$-field. A process $X$ taking values in a topological space $\mathcal{S}$ is progressively measurable if for each $t$ the map $(s,\omega) \mapsto X_s(\omega)$ from $[0,t] \times \Omega$ to $\mathcal{S}$ is measurable with respect to $\mathcal{B}[0,t] \times \mathcal{F}_t$, that is, the inverse images of Borel subsets of $\mathcal{S}$ are elements of $\mathcal{B}[0,t] \times \mathcal{F}_t$. If the paths of $X$ are right continuous, then $X$ is easily seen to be progressively measurable. The same is true if $X$ has left continuous paths. A subset of $[0,\infty) \times \Omega$ is progressively measurable if its indicator is a progressively measurable process.

If $E \subset [0,\infty) \times \Omega$, let $D_E = \inf\{t \ge 0 : (t,\omega) \in E\}$, the debut of $E$. We will prove

Theorem 2.1. If $E$ is a progressively measurable set, then $D_E$ is a stopping time.

Fix $t$. Let $\mathcal{K}^0(t)$ be the collection of subsets of $[0,t] \times \Omega$ of the form $K \times C$, where $K$ is a compact subset of $[0,t]$ and $C \in \mathcal{F}_t$. Let $\mathcal{K}(t)$ be the collection of finite unions of sets in $\mathcal{K}^0(t)$ and let $\mathcal{K}_\delta(t)$ be the collection of countable intersections of sets in $\mathcal{K}(t)$. We say $A \in \mathcal{B}[0,t] \times \mathcal{F}_t$ is $t$-approximable if given $\varepsilon > 0$, there exists $B \in \mathcal{K}_\delta(t)$ with $B \subset A$ and

$$P^*(\pi(A)) \le P^*(\pi(B)) + \varepsilon. \qquad (2.1)$$

Lemma 2.2. If $B \in \mathcal{K}_\delta(t)$, then $\pi(B) \in \mathcal{F}_t$. If $B_n \in \mathcal{K}_\delta(t)$ and $B_n \downarrow B$, then $\pi(B) = \bigcap_n \pi(B_n)$.

The hypothesis that the $B_n$ be in $\mathcal{K}_\delta(t)$ is important. For example, if $B_n = [1-(1/n), 1) \times \Omega$, then $\pi(B_n) = \Omega$ but $\pi(\bigcap_n B_n) = \emptyset$. This is why the proof given in [2, Lemma 6.18] is incorrect.

Proof.
If $B = K \times C$, where $K$ is a nonempty compact subset of $[0,t]$ and $C \in \mathcal{F}_t$, then $\pi(B) = C \in \mathcal{F}_t$. Therefore $\pi(B) \in \mathcal{F}_t$ if $B \in \mathcal{K}^0(t)$. If $B = \bigcup_{i=1}^m A_i$ with $A_i \in \mathcal{K}^0(t)$, then $\pi(B) = \bigcup_{i=1}^m \pi(A_i) \in \mathcal{F}_t$.

For each $\omega$ and each set $C$, let

$$S(C)(\omega) = \{s \le t : (s,\omega) \in C\}. \qquad (2.2)$$

If $B \in \mathcal{K}_\delta(t)$ and $B_n \downarrow B$ with $B_n \in \mathcal{K}(t)$ for each $n$, then $S(B_n)(\omega) \downarrow S(B)(\omega)$, so $S(B)(\omega)$ is compact.

Now suppose $B \in \mathcal{K}_\delta(t)$ and take $B_n \downarrow B$ with $B_n \in \mathcal{K}_\delta(t)$. $S(B_n)(\omega)$ is a compact subset of $[0,t]$ for each $n$ and $S(B_n)(\omega) \downarrow S(B)(\omega)$. One possibility is that $\bigcap_n S(B_n)(\omega) \ne \emptyset$; in this case, if $s \in \bigcap_n S(B_n)(\omega)$, then $(s,\omega) \in B_n$ for each $n$, and so $(s,\omega) \in B$. Therefore $\omega \in \pi(B_n)$ for each $n$ and $\omega \in \pi(B)$. The other possibility is that $\bigcap_n S(B_n)(\omega) = \emptyset$. Since the sequence $S(B_n)(\omega)$ is a decreasing sequence of compact sets, $S(B_n)(\omega) = \emptyset$ for some $n$, for otherwise $\{S(B_n)(\omega)^c\}$ would be an open cover of $[0,t]$ with no finite subcover. Therefore $\omega \notin \pi(B_n)$ and $\omega \notin \pi(B)$. We conclude that $\pi(B) = \bigcap_n \pi(B_n)$.

Finally, suppose $B \in \mathcal{K}_\delta(t)$ and $B_n \downarrow B$ with $B_n \in \mathcal{K}(t)$. Then $\pi(B) = \bigcap_n \pi(B_n) \in \mathcal{F}_t$. □
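The role of compactness in the remark preceding the proof can be checked concretely. The following sketch (illustrative only, not from the paper) verifies that for the half-open sets $B_n = [1-(1/n), 1) \times \Omega$, every time-section is nonempty, so each $B_n$ projects onto all of $\Omega$, yet no single time lies in every $B_n$, so the intersection projects to the empty set:

```python
from fractions import Fraction

def in_Bn(s, n):
    """Time-section of B_n = [1 - 1/n, 1) x Omega: is s in [1 - 1/n, 1)?"""
    return 1 - Fraction(1, n) <= s < 1

# Every B_n has a nonempty time-section (s = 1 - 1/n works), so pi(B_n) = Omega.
assert all(in_Bn(1 - Fraction(1, n), n) for n in range(1, 200))

# But no single s < 1 lies in every B_n: once 1 - 1/n > s the test fails,
# so the intersection of the B_n is empty and pi of the intersection is empty.
s = Fraction(999, 1000)
assert not all(in_Bn(s, n) for n in range(1, 2000))
```

Exact rational arithmetic (`Fraction`) is used so the boundary comparisons are not clouded by floating-point rounding.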

Proposition 2.3. Suppose $A$ is $t$-approximable. Then $\pi(A) \in \mathcal{F}_t$. Moreover, given $\varepsilon > 0$ there exists $B \in \mathcal{K}_\delta(t)$ such that $P(\pi(A) \setminus \pi(B)) < \varepsilon$.

Proof. Choose $A_n \in \mathcal{K}_\delta(t)$ with $A_n \subset A$ and $P(\pi(A_n)) \to P^*(\pi(A))$. Let $B_n = A_1 \cup \cdots \cup A_n$ and let $B = \bigcup_n B_n$. Then $B_n \in \mathcal{K}_\delta(t)$, $B_n \uparrow B$, and $P(\pi(B_n)) \ge P(\pi(A_n)) \to P^*(\pi(A))$. It follows that $\pi(B_n) \uparrow \pi(B)$, and so $\pi(B) \in \mathcal{F}_t$ and

$$P(\pi(B)) = \lim_n P(\pi(B_n)) = P^*(\pi(A)).$$

For each $n$, there exists $C_n \in \mathcal{F}$ such that $\pi(A) \subset C_n$ and $P(C_n) \le P^*(\pi(A)) + 1/n$. Setting $C = \bigcap_n C_n$, we have $\pi(A) \subset C$ and $P^*(\pi(A)) = P(C)$. Therefore $\pi(B) \subset \pi(A) \subset C$ and $P(\pi(B)) = P^*(\pi(A)) = P(C)$. This implies that $\pi(A) \setminus \pi(B)$ is a $P$-null set, and by the completeness assumption, $\pi(A) = (\pi(A) \setminus \pi(B)) \cup \pi(B) \in \mathcal{F}_t$. Finally,

$$\lim_n P(\pi(A) \setminus \pi(B_n)) = P(\pi(A) \setminus \pi(B)) = 0. \qquad \square$$

We now prove Theorem 2.1.

Proof of Theorem 2.1. Fix $t$. Let

$$\mathcal{M} = \{A \in \mathcal{B}[0,t] \times \mathcal{F}_t : A \text{ is } t\text{-approximable}\}.$$

We show $\mathcal{M}$ is a monotone class. Suppose $A_n \in \mathcal{M}$, $A_n \uparrow A$. Then $\pi(A_n) \uparrow \pi(A)$. By Proposition 2.3, $\pi(A_n) \in \mathcal{F}_t$ for all $n$, and therefore $\pi(A) \in \mathcal{F}_t$ and $P(\pi(A)) = \lim_n P(\pi(A_n))$. Given $\varepsilon > 0$, choose $n$ large so that $P(\pi(A)) < P(\pi(A_n)) + \varepsilon/2$. Then choose $K_n \in \mathcal{K}_\delta(t)$ such that $K_n \subset A_n$ and $P(\pi(A_n)) < P(\pi(K_n)) + \varepsilon/2$. This shows $A$ is $t$-approximable.

Now suppose $A_n \in \mathcal{M}$ and $A_n \downarrow A$. Choose $K_n \in \mathcal{K}_\delta(t)$ such that $K_n \subset A_n$ and $P(\pi(A_n) \setminus \pi(K_n)) < \varepsilon/2^{n+1}$. Let $L_n = K_1 \cap \cdots \cap K_n$ and $L = \bigcap_n K_n$. Since each $K_n \in \mathcal{K}_\delta(t)$, so is each $L_n$, and hence $L \in \mathcal{K}_\delta(t)$. Also $L_n \downarrow L$ and $L \subset A$. By Lemma 2.2, $\pi(L_n) \downarrow \pi(L)$, hence $P(\pi(L_n)) \to P(\pi(L))$. Therefore $P(\pi(L_n)) < P(\pi(L)) + \varepsilon/2$ if $n$ is large enough. We write

$$P(\pi(A)) \le P(\pi(A_n)) \le P(\pi(A_n) \setminus \pi(L_n)) + P(\pi(L_n)) \le P\Big(\bigcup_{i=1}^n \big(\pi(A_i) \setminus \pi(K_i)\big)\Big) + P(\pi(L_n)) \le \sum_{i=1}^n P(\pi(A_i) \setminus \pi(K_i)) + P(\pi(L_n)) < \varepsilon + P(\pi(L))$$

if $n$ is large enough. Therefore $A$ is $t$-approximable.

If $\mathcal{I}^0(t)$ is the collection of sets of the form $[a,b) \times C$, where $a < b \le t$ and $C \in \mathcal{F}_t$, and $\mathcal{I}(t)$ is the collection of finite unions of sets in $\mathcal{I}^0(t)$, then $\mathcal{I}(t)$ is an algebra of sets. We note that $\mathcal{I}(t)$ generates the $\sigma$-field $\mathcal{B}[0,t] \times \mathcal{F}_t$. A set in $\mathcal{I}^0(t)$ of the form $[a,b) \times C$ is the union of sets in $\mathcal{K}^0(t)$ of the form $[a, b-(1/m)] \times C$, and it follows that every set in $\mathcal{I}(t)$ is the increasing union of sets in $\mathcal{K}(t)$. Since $\mathcal{M}$ is a monotone class containing $\mathcal{K}(t)$, it contains $\mathcal{I}(t)$. By the monotone class theorem, $\mathcal{M} = \mathcal{B}[0,t] \times \mathcal{F}_t$. By Proposition 2.3, if $A \in \mathcal{B}[0,t] \times \mathcal{F}_t$, then $\pi(A) \in \mathcal{F}_t$.

Now let $E$ be a progressively measurable set and let $A = E \cap ([0,t) \times \Omega)$. We have $(D_E < t) = \pi(A) \in \mathcal{F}_t$. Because $t$ was arbitrary, $D_E$ is a stopping time. □

If $B$ is a Borel subset of a topological space $\mathcal{S}$, let

$$U_B = \inf\{t \ge 0 : X_t \in B\}$$

and

$$T_B = \inf\{t > 0 : X_t \in B\},$$

the first entry time and the first hitting time of $B$, respectively. Here is the measurability theorem.

Theorem 2.4. If $X$ is a progressively measurable process taking values in $\mathcal{S}$ and $B$ is a Borel subset of $\mathcal{S}$, then $U_B$ and $T_B$ are stopping times.

Proof. Since $B$ is a Borel subset of $\mathcal{S}$ and $X$ is progressively measurable, $1_B(X_t)$ is also progressively measurable. $U_B$ is then the debut of the set $E = \{(s,\omega) : 1_B(X_s(\omega)) = 1\}$, and therefore is a stopping time.

If we let $Y_t^\delta = X_{t+\delta}$ and $U_B^\delta = \inf\{t \ge 0 : Y_t^\delta \in B\}$, then by the above, $U_B^\delta$ is a stopping time with respect to the filtration $\{\mathcal{F}_t^\delta\}$, where $\mathcal{F}_t^\delta = \mathcal{F}_{t+\delta}$. It follows that $\delta + U_B^\delta$ is a stopping time with respect to the filtration $\{\mathcal{F}_t\}$. Since $(1/m) + U_B^{1/m} \downarrow T_B$, $T_B$ is a stopping time with respect to $\{\mathcal{F}_t\}$ as well. □

We remark that in the theory of Markov processes, the notion of completion of a $\sigma$-field is a bit different. In that case, we suppose that $\mathcal{F}_t$ contains all sets $N$ such that $P^\mu(N) = 0$ for every starting measure $\mu$. The proof of Proposition 2.3 shows that

$$(P^\mu)^*(\pi(A) \setminus \pi(B)) = 0$$

for every starting measure $\mu$, so $\pi(A) \setminus \pi(B)$ is a $P^\mu$-null set for every starting measure $\mu$. Therefore $\pi(A) = \pi(B) \cup (\pi(A) \setminus \pi(B)) \in \mathcal{F}_t$. With this modification, the rest of the proof of Theorem 2.1 goes through in the Markov process context.
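The distinction in Theorem 2.4 between the first entry time $U_B$ (infimum over $t \ge 0$) and the first hitting time $T_B$ (infimum over $t > 0$) can be seen in a small discrete-time analogue. This sketch is illustrative only (the path and set are made up, not from the paper):

```python
import math

def entry_and_hitting(path, B):
    """Discrete-time analogues of U_B = inf{t >= 0 : X_t in B} and
    T_B = inf{t > 0 : X_t in B}; the infimum of the empty set is +infinity."""
    U = next((t for t, x in enumerate(path) if x in B), math.inf)
    T = next((t for t, x in enumerate(path) if t > 0 and x in B), math.inf)
    return U, T

# A path that starts in B = {0} enters B at time 0, but first hits B at a
# strictly positive time only when it returns, here at time 4.
assert entry_and_hitting([0, 1, 2, 1, 0, -1], {0}) == (0, 4)

# A path that never meets B: both times are infinite.
assert entry_and_hitting([1, 2, 3], {0}) == (math.inf, math.inf)
```

The two times differ exactly when the path starts inside $B$, which is why the proof of Theorem 2.4 handles $T_B$ by shifting the process by $\delta$ and letting $\delta \downarrow 0$.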

3 The section theorems

Let $(\Omega, \mathcal{F}, P)$ be a probability space and let $\{\mathcal{F}_t\}$ be a filtration satisfying the usual conditions. The optional $\sigma$-field $\mathcal{O}$ is the $\sigma$-field of subsets of $[0,\infty) \times \Omega$ generated by the set of maps $X : [0,\infty) \times \Omega \to \mathbb{R}$ where $X$ is bounded, adapted to the filtration $\{\mathcal{F}_t\}$, and has right continuous paths. The predictable $\sigma$-field $\mathcal{P}$ is the $\sigma$-field of subsets of $[0,\infty) \times \Omega$ generated by the set of maps $X : [0,\infty) \times \Omega \to \mathbb{R}$ where $X$ is bounded, adapted to the filtration $\{\mathcal{F}_t\}$, and has left continuous paths.

Given a stopping time $T$, we define $[T,T] = \{(t,\omega) : t = T(\omega) < \infty\}$. A stopping time is predictable if there exist stopping times $T_1, T_2, \ldots$ with $T_1 \le T_2 \le \cdots$, $T_n \uparrow T$, and on the event $(T > 0)$, $T_n < T$ for all $n$. We say the stopping times $T_n$ predict $T$. If $T$ is a predictable stopping time and $S = T$ a.s., we also call $S$ a predictable stopping time. The optional section theorem is the following.

Theorem 3.1. If $E$ is an optional set and $\varepsilon > 0$, there exists a stopping time $T$ such that $[T,T] \subset E$ and $P(\pi(E)) \le P(T < \infty) + \varepsilon$.

The statement of the predictable section theorem is very similar.

Theorem 3.2. If $E$ is a predictable set and $\varepsilon > 0$, there exists a predictable stopping time $T$ such that $[T,T] \subset E$ and $P(\pi(E)) \le P(T < \infty) + \varepsilon$.

First we prove the following lemma.

Lemma 3.3. (1) $\mathcal{O}$ is generated by the collection of processes $1_C(\omega)1_{[a,b)}(t)$ where $C \in \mathcal{F}_a$ and $a < b$.
(2) $\mathcal{P}$ is generated by the collection of processes $1_C(\omega)1_{[b,c)}(t)$ where $C \in \mathcal{F}_a$ and $a < b < c$.

Proof. (1) First of all, $1_C(\omega)1_{[a,b)}(t)$ is a bounded, right continuous, adapted process, so it is optional.

Let $\mathcal{O}'$ be the $\sigma$-field on $[0,\infty) \times \Omega$ generated by the collection of processes $1_C(\omega)1_{[a,b)}(t)$, where $C \in \mathcal{F}_a$. Letting $b \to \infty$, we see that $\mathcal{O}'$ includes sets of the form $[a,\infty) \times C$ with $C \in \mathcal{F}_a$.

Let $X_t$ be a right continuous, bounded, and adapted process and let $\varepsilon > 0$. Let $U_0 = 0$ and define $U_{i+1} = \inf\{t > U_i : |X_t - X_{U_i}| > \varepsilon\}$. Since $(U_1 < t) = \bigcup_q (|X_q - X_0| > \varepsilon)$, where the union is over all rational $q$ less than $t$, $U_1$ is a stopping time, and an analogous argument shows that each $U_i$ is also a stopping time.

If $S$ and $T$ are stopping times, let $[S,T) = \{(t,\omega) \in [0,\infty) \times \Omega : S(\omega) \le t < T(\omega)\}$. If we set

$$X_t^\varepsilon(\omega) = \sum_{i=0}^\infty X_{U_i}(\omega)\, 1_{[U_i, U_{i+1})}(t),$$

then $\sup_{t \ge 0} |X_t^\varepsilon - X_t| \le \varepsilon$. Therefore we can approximate $X$ by processes of the form

$$\sum_{i=0}^\infty X_{U_i} 1_{[U_i, \infty)} - \sum_{i=0}^\infty X_{U_i} 1_{[U_{i+1}, \infty)}.$$
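The construction of the times $U_i$ and the piecewise-constant approximant $X^\varepsilon$ can be mimicked in discrete time. The sketch below is illustrative only (the path data are made up, not from the paper): it builds the $U_i$ greedily as the successive times the path moves more than $\varepsilon$ from its value at the previous $U_i$, and checks the uniform error bound.

```python
def epsilon_skeleton(path, eps):
    """Discrete analogue of the proof's construction: U_0 = 0, and U_{i+1} is
    the first time the path differs from its value at U_i by more than eps.
    Returns the times U and the piecewise-constant approximation, which is
    constant on each interval [U_i, U_{i+1})."""
    U = [0]
    for t in range(1, len(path)):
        if abs(path[t] - path[U[-1]]) > eps:
            U.append(t)
    approx, j = [], 0
    for t in range(len(path)):
        if j + 1 < len(U) and t >= U[j + 1]:
            j += 1
        approx.append(path[U[j]])
    return U, approx

path = [0.0, 0.2, 0.6, 0.7, 1.3, 1.2]
U, approx = epsilon_skeleton(path, eps=0.5)
assert U == [0, 2, 4]
# The approximation stays uniformly within eps of the path, as in the proof.
assert all(abs(a - x) <= 0.5 for a, x in zip(approx, path))
```

In the proof the same idea works in continuous time because right continuity makes each $U_i$ a stopping time and keeps the sup-norm error at most $\varepsilon$.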

It therefore suffices to show that if $V$ is a stopping time and $A \in \mathcal{F}_V$, then $1_A(\omega)1_{[V,\infty)}(t)$ is $\mathcal{O}'$ measurable. Letting $V_n = (k+1)/2^n$ when $k/2^n \le V < (k+1)/2^n$,

$$1_A(\omega)1_{[V(\omega),\infty)}(t) = \lim_{n\to\infty} 1_A(\omega)1_{[V_n(\omega),\infty)}(t) = \lim_{n\to\infty} \sum_{k=0}^{\infty} 1_{A \cap (V_n = (k+1)/2^n)}(\omega)\, 1_{[(k+1)/2^n,\,\infty)}(t),$$

which is $\mathcal{O}'$ measurable.

(2) As long as $a + (1/n) < b$, the processes $1_C(\omega)1_{(b-(1/n),\,c-(1/n)]}(t)$ are left continuous, bounded, and adapted, hence predictable. The process $1_C(\omega)1_{[b,c)}(t)$ is the limit of these processes as $n \to \infty$, so is predictable. On the other hand, if $X_t$ is a bounded adapted left continuous process, it can be approximated by

$$\sum_{k=1}^{\infty} X_{(k-1)/2^n}(\omega)\, 1_{(k/2^n,\,(k+1)/2^n]}(t).$$

Each summand can be approximated by linear combinations of processes of the form $1_C(\omega)1_{(b,c]}(t)$, where $C \in \mathcal{F}_a$ and $a < b < c$. Finally, $1_C 1_{(b,c]}$ is the limit of $1_C(\omega)1_{[b+(1/n),\,c+(1/n))}(t)$ as $n \to \infty$. □

A consequence of this lemma is that $\mathcal{P} \subset \mathcal{O}$. Since $\mathcal{O}$ is generated by the class of right continuous processes and right continuous processes are progressively measurable, we have from Theorem 2.1 that the debut of a predictable or optional set is a stopping time.

Fix $t$ and define

$$\mathcal{O}(t) = \{A \cap ([0,t] \times \Omega) : A \in \mathcal{O}\}.$$

Let $\mathcal{K}^0(t)$ be the collection of subsets of $\mathcal{O}(t)$ of the form $K \times C$, where $K$ is a compact subset of $[0,t]$ and $C \in \mathcal{F}_a$ with $a \le \inf\{s : s \in K\}$. Let $\mathcal{K}(t)$ be the collection of finite unions of sets in $\mathcal{K}^0(t)$ and $\mathcal{K}_\delta(t)$ the collection of countable intersections of sets in $\mathcal{K}(t)$. Define $\mathcal{I}^0(t)$ to be the collection of sets of the form $[a,b) \times C$, where $a < b \le t$ and $C \in \mathcal{F}_a$, and let $\mathcal{I}(t)$ be the collection of finite unions of sets in $\mathcal{I}^0(t)$.

The proof of the following proposition is almost identical to the proof of Theorem 2.1. Because the debut of optional sets is now known to be a stopping time, it is not necessary to work with $P^*$.

Proposition 3.4. Suppose $A \in \mathcal{O}(t)$. Then given $\varepsilon > 0$, there exists $B \in \mathcal{K}_\delta(t)$ such that $P(\pi(A) \setminus \pi(B)) < \varepsilon$.

We now prove Theorem 3.1.

Proof of Theorem 3.1. If $E$ is an optional set, choose $t$ large enough so that if $A_t = E \cap ([0,t] \times \Omega)$, then $P(\pi(A_t)) > P(\pi(E)) - \varepsilon/2$. This is possible because $A_t \uparrow E$ and so $\pi(A_t) \uparrow \pi(E)$. With this value of $t$, choose $B \in \mathcal{K}_\delta(t)$ such that $B \subset A_t$ and $P(\pi(B)) > P(\pi(A_t)) - \varepsilon/2$. We will show $[D_B, D_B] \subset B$. Since $(D_B < \infty) = \pi([D_B, D_B]) = \pi(B)$, we then have $[D_B, D_B] \subset E$ and

$$P(\pi(E)) < P(\pi(A_t)) + \varepsilon/2 < P(\pi(B)) + \varepsilon = P(\pi([D_B, D_B])) + \varepsilon.$$

By the argument of the proof of Lemma 2.2, $S(B)(\omega)$ is a compact set if $B \in \mathcal{K}_\delta(t)$. Therefore $D_B(\omega) = \inf\{s : s \in S(B)(\omega)\}$ is in $S(B)(\omega)$, which implies $[D_B, D_B] \subset B$. □

To prove Theorem 3.2 we follow along the same lines. Define

$$\mathcal{P}(t) = \{A \cap ([0,t] \times \Omega) : A \in \mathcal{P}\}$$

and define $\widetilde{\mathcal{K}}^0(t)$ to be the collection of subsets of $\mathcal{P}(t)$ of the form $K \times C$, where $K$ is a compact subset of $[0,t]$ and $C \in \mathcal{F}_a$ with $a < \inf\{s : s \in K\}$, let $\widetilde{\mathcal{K}}(t)$ be the collection of finite unions of sets in $\widetilde{\mathcal{K}}^0(t)$, and $\widetilde{\mathcal{K}}_\delta(t)$ the collection of countable intersections of sets in $\widetilde{\mathcal{K}}(t)$. Define $\widetilde{\mathcal{I}}^0(t)$ to be the collection of sets of the form $[b,c) \times C$, where $C \in \mathcal{F}_a$ and $a < b < c \le t$, and let $\widetilde{\mathcal{I}}(t)$ be the collection of finite unions of sets in $\widetilde{\mathcal{I}}^0(t)$. Following the proof of Theorem 3.1, we will be done once we show $D_B$ is a predictable stopping time when $B \in \widetilde{\mathcal{K}}_\delta(t)$.

Proof of Theorem 3.2. Fix $t$. Suppose $B \in \widetilde{\mathcal{K}}^0(t)$ is of the form $B = K \times C$ with $C \in \mathcal{F}_a$ and $a < b = \inf\{s : s \in K\}$. Note that this implies $b > 0$. Then $D_B$ equals $b$ if $\omega \in C$ and equals infinity otherwise. As long as $a + (1/m) < b$, we see that $D_B$ is predicted by the stopping times $V_m$, where $V_m$ equals $b - 1/m$ if $\omega \in C$ and equals $m$ otherwise. Note also that $[D_B, D_B] \subset B$. If $B = \bigcup_{i=1}^m B_i$ with $B_i \in \widetilde{\mathcal{K}}^0(t)$, then $D_B = D_{B_1} \wedge \cdots \wedge D_{B_m}$, and it is easy to see that $D_B$ is predictable because each $D_{B_i}$ is, and also that $[D_B, D_B] \subset B$.

Now let $B \in \widetilde{\mathcal{K}}_\delta(t)$ with $B_n \downarrow B$ and $B_n \in \widetilde{\mathcal{K}}(t)$. We have $D_{B_n} \uparrow$, and the limit, which we call $T$, will be a stopping time. Since $B \subset B_n$, then $D_{B_n} \le D_B$, and therefore $T \le D_B$. Each $D_{B_n}$ is a predictable stopping time. Let $\{R_{nm}\}_m$ be stopping times predicting $D_{B_n}$ and choose $m_n$ large so that

$$P(R_{n m_n} + 2^{-n} < D_{B_n} < \infty) < 2^{-n}, \qquad P(R_{n m_n} < n,\ D_{B_n} = \infty) < 2^{-n}.$$

By the Borel–Cantelli lemma,

$$P\big(\sup_n R_{n m_n} < T < \infty\big) = 0 \quad \text{and} \quad P\big(\sup_n R_{n m_n} < \infty,\ T = \infty\big) = 0,$$

so if we set $Q_n = (R_{1 m_1} \vee \cdots \vee R_{n m_n}) \wedge n$, we see that $\{Q_n\}$ is a sequence of stopping times predicting $T$, except for a set of probability zero. Hence $T$ is a predictable stopping time.

If $n > m$, then $[D_{B_n}, D_{B_n}] \subset B_n \subset B_m$. Since $S(B_m)(\omega)$ is a closed subset of $[0,t]$, the facts that $D_{B_n}(\omega) \in S(B_m)(\omega)$ for $n > m$ and $D_{B_n}(\omega) \to T(\omega)$ for each $\omega$ show that $T(\omega) \in S(B_m)(\omega)$ for each $\omega$. Thus $[T,T] \subset B_m$. This is true for all $m$, so $[T,T] \subset B$. In particular, $T \ge D_B$, so $T = D_B$. Therefore $\pi(B) = (D_B < \infty) = \pi([T,T])$.

This and the argument of the first paragraph of the proof of Theorem 3.1 prove Theorem 3.2. □

References

[1] C. Dellacherie and P.-A. Meyer. Probabilities and Potential. North-Holland, Amsterdam, 1978. MR0521810

[2] R.J. Elliott. Stochastic Calculus and Applications. Springer, New York, 1982. MR0678919