
Connection Between Martingale Problems and Markov Processes

IMT Toulouse

Skript: Connection between Martingale Problems and Markov Processes

Author Franziska Kühn

Info IRTG Summer School Bielefeld, August 2019

Contents

Index of Notation

1 Introduction

2 Martingale Problems
  2.1 Existence of solutions
  2.2 Uniqueness
  2.3 Markov property

3 Connection between SDEs and martingale problems

4 Markov processes
  4.1 Semigroups
  4.2 Infinitesimal generator
  4.3 From Markov processes to martingale problems

Bibliography

Index of Notation

Probability/measure theory
  ∼, =_d        distributed as / equal in distribution
  P_X           distribution of X w.r.t. P
  B(R^d)        Borel σ-algebra on R^d
  δ_x           Dirac measure centered at x ∈ R^d

Spaces of functions
  B_b(R^d)      space of bounded Borel measurable functions f : R^d → R
  C_b(R^d)      space of bounded continuous functions f : R^d → R
  C_∞(R^d)      space of continuous functions f : R^d → R vanishing at infinity, lim_{|x|→∞} |f(x)| = 0
  C_c^∞(R^d)    space of smooth functions f : R^d → R with compact support
  D[0, ∞)       Skorohod space = space of càdlàg functions f : [0, ∞) → R^d

Analysis
  ∧             minimum
  ‖·‖_∞         uniform norm
  D(A)          domain of the operator A
  càdlàg        finite left limits and right-continuous
  tr(A)         trace of A
  ∇²f           Hessian of f

1 Introduction

Central question: How to characterize stochastic processes in terms of martingale properties? Start with two simple examples: Brownian motion and Poisson process.

1.1 Definition A stochastic process (Bt)t≥0 is a Brownian motion if

• B0 = 0 ,

• Bt1 −Bt0 ,...,Btn −Btn−1 are independent for all 0 = t0 < t1 < ... < tn (independent increments),

• Bt − Bs ∼ N(0, (t − s) id) for all s ≤ t,

• t ↦ Bt(ω) is continuous for all ω (continuous sample paths)

1.2 Theorem (Lévy) Let (Xt)t≥0 be a 1-dim. stochastic process with continuous sample paths and X_0 = 0. (Xt)t≥0 is a Brownian motion if, and only if, (Xt)t≥0 and (X_t² − t)t≥0 are (local) martingales. Beautiful result! Only 2 martingales are needed to describe all the information about Brownian motion.

Remark The assumption on continuity of sample paths is needed (consider X_t = N_t − t for a Poisson process (Nt)t≥0 with intensity 1).
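Numerical aside: both martingale properties from Theorem 1.2 — and the fact that they alone cannot distinguish Brownian motion from the compensated Poisson process of the remark — can be checked by a quick Monte Carlo experiment. The following Python sketch uses an arbitrary grid size, sample size and test function g(x) = sin(x); it is an illustration, not part of any proof.

import numpy as np

# Monte Carlo check of the two martingale properties in Theorem 1.2 for
# (a) Brownian motion and (b) the compensated Poisson process X_t = N_t - t.
# For a martingale M and an F_s-measurable variable g(M_s) one has
# E[(M_t - M_s) g(M_s)] = 0; similarly for M_t^2 - t.  Parameters are ad-hoc.
rng = np.random.default_rng(0)
n_paths, dt = 50_000, 0.01
s, t = 0.5, 1.0
i_s, i_t = int(s / dt), int(t / dt)

def martingale_check(M):
    Ms, Mt = M[:, i_s], M[:, i_t]
    g = np.sin(Ms)                      # bounded function of the past
    return (np.mean((Mt - Ms) * g),
            np.mean(((Mt**2 - t) - (Ms**2 - s)) * g))

# (a) Brownian motion on the grid k*dt
dB = rng.normal(0.0, np.sqrt(dt), (n_paths, i_t))
B = np.hstack([np.zeros((n_paths, 1)), np.cumsum(dB, axis=1)])

# (b) Poisson process with intensity 1, compensated
dN = rng.poisson(dt, (n_paths, i_t)).astype(float)
N = np.hstack([np.zeros((n_paths, 1)), np.cumsum(dN, axis=1)])
X = N - dt * np.arange(i_t + 1)

print("Brownian motion:     ", martingale_check(B))   # both entries ~ 0
print("compensated Poisson: ", martingale_check(X))   # both entries ~ 0 as well
# The martingale conditions alone do not single out Brownian motion;
# only the continuity of the sample paths does.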

1.3 Definition A stochastic process (Nt)t≥0 is a Poisson process with intensity λ > 0 if

• N0 = 0 almost surely,

• (Nt)t≥0 has independent increments,

• Nt − Ns ∼ Poi(λ(t − s)) for all s ≤ t,

• t ↦ Nt(ω) is càdlàg (= right-continuous with finite left-hand limits).

1.4 Theorem (Watanabe) Let (Nt)t≥0 be a counting process, i.e. a process with N0 = 0 which is constant except for jumps of height +1. Then (Nt)t≥0 is a Poisson process with intensity λ if and only if (Nt − λt)t≥0 is a martingale.

Outline of this lecture series:

• Set up general framework to describe processes via martingales (→ martingale problems, ▸ §2)

• study connection between martingale problems and Markov processes

• application: study solutions to stochastic differential equations (▸ §3)

2 Martingale Problems

2.1 Definition Let (Ω, F, P) be a probability space.

(i). A filtration (Ft)t≥0 is a family of sub-σ-algebras of F such that Fs ⊆ Ft for s ≤ t.

(ii). A stochastic process (Mt)t≥0 is a martingale (with respect to (Ft)t≥0 and P) if

• (Mt, Ft)t≥0 is an adapted process, i.e. Mt is Ft-measurable for all t ≥ 0,

• Mt ∈ L¹(P) for all t ≥ 0,

• E(Mt | Fs) = Ms for all s ≤ t.

Standing assumption: processes have càdlàg sample paths t ↦ Xt(ω) ∈ R^d. Skorohod space:

    D[0, ∞) := { f : [0, ∞) → R^d ; f càdlàg }.

Fix a linear operator L : D → B_b(R^d) with domain D ⊆ B_b(R^d). Notation: (L, D).

2.2 Definition Let µ be a probability measure on R^d. A (càdlàg) stochastic process (Xt)t≥0 on a probability space (Ω, F, P^µ) is a solution to the (L, D)-martingale problem with initial distribution µ if P^µ(X0 ∈ ⋅) = µ(⋅) and

    M_t^f := f(Xt) − f(X0) − ∫_0^t Lf(Xs) ds

is, for any f ∈ D, a martingale w.r.t. the canonical filtration Ft := σ(Xs; s ≤ t) and P^µ.

2.3 Remark (i). For µ = δ_x we write P^x instead of P^{δ_x}. Moreover, E^x := ∫ ⋅ dP^x.

(ii). Sometimes it is convenient to fix the measurable space. One chooses Ω = D[0, ∞) (Skorohod space) and X(t, ω) = ω(t) (canonical process); this is possible WLOG, cf. Lemma 2.9 below.

Next: existence and uniqueness of solutions. Rule of thumb: Existence is much easier to prove than uniqueness.
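Numerical aside: Definition 2.2 can be illustrated for Brownian motion, for which Lf = ½ f'' (cf. Section 3 with b = 0, σ = 1). The sketch below checks by Monte Carlo that M_t^f has (approximately) zero mean and increments orthogonal to the past; step size, sample size and the test functions f, g are arbitrary illustrative choices.

import numpy as np

# Empirical check of Definition 2.2 for Brownian motion, where Lf = (1/2) f''.
# With f(x) = sin(x) we have Lf(x) = -(1/2) sin(x); the process
# M_t^f = f(X_t) - f(X_0) - \int_0^t Lf(X_s) ds should be a martingale.
rng = np.random.default_rng(1)
n_paths, dt, n_steps = 50_000, 0.01, 100
i_s = n_steps // 2                       # intermediate time s = 0.5

f  = np.sin
Lf = lambda x: -0.5 * np.sin(x)

dX = rng.normal(0.0, np.sqrt(dt), (n_paths, n_steps))
X = np.hstack([np.zeros((n_paths, 1)), np.cumsum(dX, axis=1)])

# Riemann-sum approximation of the integral term along each path
I = np.hstack([np.zeros((n_paths, 1)), np.cumsum(Lf(X[:, :-1]) * dt, axis=1)])
M = f(X) - f(X[:, [0]]) - I

print("E[M_1^f]                  ≈", M[:, -1].mean())
print("E[(M_1^f - M_s^f) g(X_s)] ≈",
      ((M[:, -1] - M[:, i_s]) * np.cos(X[:, i_s])).mean())
# Both values should be close to 0 (up to Monte Carlo and discretization error).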

2.1 Existence of solutions

2.4 Definition A linear operator (L, D) satisfies the positive maximum principle if

    f ∈ D, f(x_0) = sup_{x ∈ R^d} f(x) ≥ 0   ⟹   Lf(x_0) ≤ 0.

2.5 Lemma Let (L, D) be a linear operator on C_b(R^d) (i.e. D ⊆ C_b(R^d) and L(D) ⊆ C_b(R^d)). If there exists for any x ∈ R^d a solution to the (L, D)-martingale problem with initial distribution µ = δ_x, then (L, D) satisfies the positive maximum principle.

Proof. Fix f ∈ D and x_0 ∈ R^d with f(x_0) = sup_{x ∈ R^d} f(x) ≥ 0. Let (Xt)t≥0 be a solution with X_0 = x_0. Then

    0 ≥ E^{x_0}(f(X_t)) − f(x_0) = E^{x_0}( ∫_0^t Lf(X_s) ds ),

and therefore, dividing by t > 0 and letting t ↓ 0, it follows from the right-continuity of s ↦ Lf(X_s) (and dominated convergence) that Lf(x_0) ≤ 0.

2.6 Theorem ([4]) Suppose that

(i). D ⊆ C_∞(R^d) is dense in C_∞(R^d) (the space of continuous functions f : R^d → R vanishing at infinity, lim_{|x|→∞} |f(x)| = 0),

(ii). (L, D) satisfies the positive maximum principle,

(iii). there exists a sequence (f_n)_{n∈N} ⊆ D such that sup_n ‖Lf_n‖_∞ < ∞ and lim_{n→∞} f_n(x) = 1, lim_{n→∞} Lf_n(x) = 0 for all x ∈ R^d.

Then there exists for every initial probability distribution µ a solution to the (L, D)-martingale problem with initial distribution µ.

2.7 Remark If (iii) fails to hold, then there still exists a solution, but it might explode in finite time.

DIY Define an operator L on the smooth functions with compact support C_c^∞(R^d) by

    Lf(x) = b(x) ⋅ ∇f(x) + ½ tr(Q(x) ∇²f(x)) + ∫_{y≠0} ( f(x+y) − f(x) − ∇f(x) ⋅ y 1_{(0,1)}(|y|) ) ν(x, dy),

where for each x ∈ R^d: b(x) ∈ R^d, Q(x) ∈ R^{d×d} is positive semidefinite and ν(x, dy) is a measure satisfying ∫_{y≠0} min{1, |y|²} ν(x, dy) < ∞. Show that (L, C_c^∞(R^d)) satisfies the positive maximum principle. (Extra: Find sufficient conditions on b, Q, ν such that Theorem 2.6 is applicable.)

2.2 Uniqueness

Next: introduce a notion of uniqueness. What is the proper notion in this context? Mind: In general, X_t ∼ Y_t for all t ≥ 0 does not imply that (Xt)t≥0 and (Yt)t≥0 are equal in distribution; this is an assertion on the fdd's! (Consider for instance a Brownian motion (Bt)t≥0 and Y_t := √t X for a fixed X ∼ N(0, 1); then B_t ∼ Y_t for every t ≥ 0, but the two processes do not have the same finite-dimensional distributions.)

2.8 Definition (Xt)t≥0 equals (Yt)t≥0 in distribution if both processes have the same finite dimensional distributions, i.e.

    P(X_{t_1} ∈ B_1, …, X_{t_n} ∈ B_n) = P(Y_{t_1} ∈ B_1, …, Y_{t_n} ∈ B_n)

for any measurable sets B_1, …, B_n ∈ B(R^d) and any times 0 ≤ t_1 < … < t_n, n ∈ N.

2.9 Lemma Let (Xt)t≥0 be a solution to the (L, D)-martingale problem with initial distribution µ. If (X̃t)t≥0 is another càdlàg stochastic process (possibly on a different probability space) such that (Xt)t≥0 and (X̃t)t≥0 are equal in distribution, then (X̃t)t≥0 is a solution to the (L, D)-martingale problem with initial distribution µ.

Idea of proof. Write the martingale property of M_t^f in terms of the finite-dimensional distributions and then replace (Xt)t≥0 by (X̃t)t≥0.

Lemma 2.9 motivates the following definition.

2.10 Definition The (L, D)-martingale problem is well posed if for any initial distribution µ there exists a unique solution to the (L, D)-martingale problem with initial distribution µ. Here "unique" means that any two solutions have the same finite-dimensional distributions.

Next result: uniqueness of the finite-dimensional distributions follows from uniqueness of the one-dimensional distributions. Reminds us of Markov processes.

2.11 Proposition ([4, Theorem 4.4.2]) Assume that for any initial distribution µ and any two solutions (Xt)t≥0, (Yt)t≥0 of the (L, D)-martingale problem with initial distribution µ it holds that X_t ∼ Y_t for all t ≥ 0. Then uniqueness holds for any initial distribution µ, i.e. any two solutions have the same finite-dimensional distributions.

2.3 Markov property

2.12 Theorem ([4, Theorem 4.4.2]) Assume that the (L, D)-martingale problem is well posed. If (Xt)t≥0 is a solution to the (L, D)-martingale problem with initial distribution µ, then (Xt)t≥0 satisfies the simple Markov property

    E^µ( f(X_{s+t}) | F_s ) = E^x f(X_t)|_{x = X_s},   s, t ≥ 0, f ∈ B_b(R^d),   (2.1)

i.e. (Xt)t≥0 is a Markov process with semigroup P_t f(x) := E^x f(X_t).

2.13 Remark (i). Consider the canonical framework, i.e. Ω = D[0, ∞) and X_t(ω) = ω(t), and denote by P^x the unique solution of the martingale problem with initial distribution δ_x. (ii). Assume additionally to 2.12 that x ↦ P^x(B) is measurable for all B. Then we even get a strong Markov process, i.e. for any finite stopping time τ

    E^µ( f(X_{τ+t}) | F_τ ) = E^x f(X_t)|_{x = X_τ}   P^µ-a.s.   (2.2)

Note: (2.2) implies (2.1).

2.14 Remark What if there is no uniqueness? Markovian selection! Consider again the canonical framework and denote by Π_µ the set of solutions to the (L, D)-martingale problem with initial distribution µ. Under quite weak assumptions, it is possible to choose from the sets of solutions a family which is Markovian in the sense of (2.2), cf. [4, Section 4.5].

Conclusion: Martingale problems are a useful tool to construct Markov processes (↝ Chapter 4). Has useful applications in analysis (e.g. for Harnack inequalities), see e.g. [8].
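Numerical aside to Section 2.2: the warning that equal one-dimensional distributions do not imply equal finite-dimensional distributions can be seen in simulation. In the sketch below B is a Brownian motion and Y_t = √t X with a single standard normal X per path; the marginals agree, the covariances do not. Sample sizes are arbitrary.

import numpy as np

# B_t (Brownian motion) and Y_t = sqrt(t) * X with a single X ~ N(0,1) have the
# same one-dimensional distributions but different finite-dimensional ones:
# Cov(B_s, B_t) = s, whereas Cov(Y_s, Y_t) = sqrt(s*t).
rng = np.random.default_rng(2)
n_paths, s, t = 200_000, 0.5, 1.0

Bs = rng.normal(0.0, np.sqrt(s), n_paths)
Bt = Bs + rng.normal(0.0, np.sqrt(t - s), n_paths)     # independent increment

X = rng.normal(0.0, 1.0, n_paths)
Ys, Yt = np.sqrt(s) * X, np.sqrt(t) * X

print("Var(B_t), Var(Y_t):", Bt.var(), Yt.var())       # both ~ t = 1
print("Cov(B_s, B_t):     ", np.cov(Bs, Bt)[0, 1])     # ~ s = 0.5
print("Cov(Y_s, Y_t):     ", np.cov(Ys, Yt)[0, 1])     # ~ sqrt(s*t) ~ 0.707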

3 Connection between SDEs and martingale problems

Aim: characterize weak solutions to SDEs of the form

    dXt = b(Xt) dt + σ(Xt) dBt,   X0 ∼ µ,   (3.1)

where (Bt)t≥0 is a Brownian motion and µ a probability measure. Here, for simplicity, only dimension 1. Everything generalizes to higher dimensions! Standing assumption:

• b : R → R and σ : R → R are locally bounded.

3.1 Definition A weak solution to (3.1) is a triple (X, B), (Ω, F, P), (Ft)t≥0 consisting of

• a probability space (Ω, F, P),

• a complete filtration (Ft)t≥0 of sub-σ-algebras of F,

• a Brownian motion (Bt, Ft)t≥0,

• a continuous adapted process (Xt, Ft)t≥0 such that P(X0 ∈ ⋅) = µ(⋅) and

    X_t − X_0 = ∫_0^t b(X_s) ds + ∫_0^t σ(X_s) dB_s   P-a.s.

Important: We are free to choose the probability space and the Brownian motion.
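Numerical aside: weak solutions of (3.1) can be approximated on a grid by the Euler–Maruyama scheme. The drift b(x) = −x, diffusion σ(x) = 1 and initial law N(0,1) below are illustrative choices (an Ornstein–Uhlenbeck example), not taken from the text.

import numpy as np

# Euler-Maruyama sketch for dX_t = b(X_t) dt + sigma(X_t) dB_t, X_0 ~ mu.
rng = np.random.default_rng(3)

def euler_maruyama(b, sigma, x0, T, n_steps, rng):
    n_paths = len(x0)
    dt = T / n_steps
    X = np.empty((n_paths, n_steps + 1))
    X[:, 0] = x0
    for k in range(n_steps):
        dB = rng.normal(0.0, np.sqrt(dt), n_paths)
        X[:, k + 1] = X[:, k] + b(X[:, k]) * dt + sigma(X[:, k]) * dB
    return X

# illustrative choice: b(x) = -x, sigma(x) = 1, mu = N(0,1)  (Ornstein-Uhlenbeck)
X = euler_maruyama(b=lambda x: -x, sigma=lambda x: np.ones_like(x),
                   x0=rng.normal(size=10_000), T=1.0, n_steps=200, rng=rng)
print("E[X_1] ≈", X[:, -1].mean(), "  Var[X_1] ≈", X[:, -1].var())
# for this example the exact variance at t = 1 is 1/2 + exp(-2)/2 ≈ 0.568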

Example The SDE dXt = sgn(Xt) dBt,X0 = 0, has a weak solution but not a strong solution.

Idea. Existence of weak solution: Let (Wt)t≥0 be a Brownian motion (on some probability space) and set

    B_t := ∫_0^t sgn(W_s) dW_s,   X_t := W_t,   F_t := F_t^W.

By Lévy's characterization, (Bt, Ft)t≥0 is a Brownian motion. Moreover,

    dB_t = sgn(X_t) dX_t   ⟹   dX_t = (sgn(X_t))² dX_t = sgn(X_t) dB_t.

If there were a strong solution (Xt)t≥0, then (Xt)t≥0 would be a Brownian motion and F_t^X ⊆ F_t^{|X|}. This is impossible, see e.g. [11, Example 19.16]. Good source for further (counter)examples: [2].

Let (Xt)t≥0 be a solution to (3.1). For f ∈ C_b²(R) it follows from Itô's formula that

    f(X_t) − f(X_0) = ∫_0^t f'(X_s) dX_s + ½ ∫_0^t f''(X_s) d⟨X⟩_s
                    = ∫_0^t f'(X_s) σ(X_s) dB_s + ∫_0^t Lf(X_s) ds

with

    Lf(x) := b(x) f'(x) + ½ σ(x)² f''(x).   (3.2)

The right-hand side is a martingale if e.g. f ∈ C_c²(R) (compact support) or σ is bounded. Hence:

3.2 Lemma If (Xt)t≥0 is a weak solution to (3.1) with initial distribution µ, then (Xt)t≥0 solves the (L, C_c²(R))-martingale problem with initial distribution µ (with L defined in (3.2)).

Converse is also (almost) true!

3.3 Lemma Let (Xt)t≥0 be a continuous process which is a solution to the (L, C_c²(R))-martingale problem with initial distribution µ (with L defined in (3.2)). Then there exists a weak solution to (3.1) with initial distribution µ.

Proof (sketch). Aim: show that

    (i) U_t := X_t − X_0 − ∫_0^t b(X_s) ds   and   (ii) V_t := U_t² − ∫_0^t σ(X_s)² ds

are local martingales.

Step 1. For any u ∈ C²(R), M_t^u := u(X_t) − u(X_0) − ∫_0^t Lu(X_s) ds is a local martingale: set τ_k := inf{t ≥ 0; |X_t| ≥ k} and pick u_k ∈ C_c²(R) such that u_k(x) = u(x) for |x| ≤ k; then M_{t∧τ_k}^u = M_{t∧τ_k}^{u_k} is a martingale.

(i). By Step 1 with u(x) = x we have Lu = b, and so U_t = M_t^u is a local martingale.

(ii). By Step 1 with u(x) = x², N_t := X_t² − X_0² − ∫_0^t ( 2 X_s b(X_s) + σ(X_s)² ) ds is a local martingale. On the other hand, integration by parts gives

    X_t² − X_0² = 2 ∫_0^t X_s dU_s + 2 ∫_0^t X_s b(X_s) ds + ⟨U⟩_t.

Hence ⟨U⟩_t − ∫_0^t σ(X_s)² ds = N_t − 2 ∫_0^t X_s dU_s is a local martingale which is of bounded variation, and therefore ≡ 0. Consequently, ⟨U⟩_t = ∫_0^t σ(X_s)² ds and V_t = U_t² − ⟨U⟩_t is a local martingale.

(iii). Assume for simplicity that σ does not vanish (otherwise one has to enlarge the probability space). Define W_t := ∫_0^t σ(X_s)^{-1} dU_s. Then ⟨W⟩_t = ∫_0^t σ(X_s)^{-2} d⟨U⟩_s = t, and it follows from Lévy's characterization that (Wt)t≥0 is a Brownian motion. Moreover,

    ∫_0^t σ(X_s) dW_s = ∫_0^t dU_s = U_t = X_t − X_0 − ∫_0^t b(X_s) ds,

i.e. (X, W) is a weak solution to (3.1).
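Numerical aside: the construction in the proof of Lemma 3.3 can be mimicked on a grid. Starting from a simulated path of X (with illustrative coefficients b(x) = −x and σ(x) = 1 + 0.5 sin(x), bounded away from zero), we rebuild U and W and check Lévy's characterization via the quadratic variation; all parameters are ad-hoc choices.

import numpy as np

# Discrete version of the construction in the proof of Lemma 3.3:
# from a path of X rebuild U_t = X_t - X_0 - \int_0^t b(X_s) ds and
# W_t = \int_0^t sigma(X_s)^{-1} dU_s, and check Levy's characterization
# through the quadratic variation [W]_T ~ T.
rng = np.random.default_rng(4)
b     = lambda x: -x
sigma = lambda x: 1.0 + 0.5 * np.sin(x)        # bounded away from zero

T, n = 1.0, 100_000
dt = T / n
x = np.empty(n + 1); x[0] = 0.0
dB = rng.normal(0.0, np.sqrt(dt), n)
for k in range(n):                             # Euler scheme generates a path of X
    x[k + 1] = x[k] + b(x[k]) * dt + sigma(x[k]) * dB[k]

dU = np.diff(x) - b(x[:-1]) * dt               # increments of U
dW = dU / sigma(x[:-1])                        # increments of the recovered BM
print("[W]_T =", np.sum(dW**2), "  (should be close to T =", T, ")")
print("max |W - B| on the grid:", np.max(np.abs(np.cumsum(dW) - np.cumsum(dB))))
# In this discrete setting W coincides with the driving noise B, as it should.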

3.4 Corollary Let µ be a probability measure. TFAE:

(i). There exists a weak solution to the SDE (3.1) with initial distribution µ.

(ii). There exists a continuous solution to the (L, C_c²(R))-martingale problem with initial distribution µ.

Proof. Immediate from Lemma 3.2 and Lemma 3.3.

3.5 Corollary TFAE:

(i). For any initial distribution µ there exists a unique weak solution to the SDE (3.1).

(ii). The (L, C_c²(R))-martingale problem is well-posed.

Proof. (i)⇒(ii): Let (Yt)t≥0 and (Ỹt)t≥0 be continuous solutions to the martingale problem with the same initial distribution. By Lemma 3.3 there exist Brownian motions such that Y and Ỹ are weak solutions to (3.1); by the uniqueness of weak solutions, Y and Ỹ have the same finite-dimensional distributions. (ii)⇒(i): similar reasoning, use Lemma 3.2 (simpler than the first part).

Note that Corollary 3.4 and Corollary 3.5 do not require any regularity assumptions on the coefficients of (3.1) beyond the standing assumption that b and σ are locally bounded.

3.6 Corollary Assume that the coefficients b and σ of the SDE (3.1) are continuous and grow at most linearly, |b(x)| + |σ(x)| ≤ C(1 + |x|). Then for any initial distribution µ there exists a (non-explosive) weak solution to (3.1) with initial distribution µ.

Proof. By Corollary 3.4 it suffices to show that there exists a solution to the (L, C_c²(R))-martingale problem with initial distribution µ. To this end, we apply Theorem 2.6. Check assumptions: C_c²(R) is dense in C_∞(R); L satisfies the positive maximum principle, cf. Section 2.1; choose f_n ∈ C_c^∞(R) such that 0 ≤ f_n ≤ 1, f_n(x) = 1 for |x| ≤ n, and sup_n (‖f_n'‖_∞ + ‖f_n''‖_∞) < ∞. Then lim_n f_n = 1, and the linear growth of b and σ gives sup_n ‖Lf_n‖_∞ < ∞ and lim_n Lf_n(x) = 0 for every x.

3.7 Remark (i). The above results can be extended to SDEs of the form

    dXt = b(X_{t−}) dt + σ(X_{t−}) dW_t + g(X_{t−}) dJ_t,   (3.3)

where (Wt)t≥0 is a Brownian motion and (Jt)t≥0 is a jump Lévy process. Need to replace L by

    Lf(x) = b(x) f'(x) + ½ σ(x)² f''(x) + ∫_{y≠0} ( f(x + g(x)y) − f(x) − g(x) y f'(x) 1_{(0,1)}(|y|) ) ν(dy),

where ν is the Lévy measure of (Jt)t≥0. See [10] for details.

(ii). Corollary 3.4 and Corollary 3.5 remain valid (without 'continuous') for solutions to (3.3), cf. [10], and similar to Corollary 3.6 we get a general existence result for weak solutions to (3.3), see e.g. [7, Theorem 3.4(i)].
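Numerical aside to Remark 3.7: the jump SDE (3.3) can be simulated by an Euler scheme once the driving noise is specified. Below J is an illustrative compound Poisson process (intensity λ = 2, standard normal jump sizes); coefficients and parameters are example choices only.

import numpy as np

# Euler sketch for the jump SDE (3.3) with an illustrative compound Poisson
# driver J: intensity lam, N(0,1) distributed jump sizes.
rng = np.random.default_rng(5)
b, sigma, g = (lambda x: -x), (lambda x: 0.5), (lambda x: 1.0)
lam, T, n_steps, n_paths = 2.0, 1.0, 500, 20_000
dt = T / n_steps

X = np.zeros((n_paths, n_steps + 1))
for k in range(n_steps):
    xk = X[:, k]
    dW = rng.normal(0.0, np.sqrt(dt), n_paths)
    n_jumps = rng.poisson(lam * dt, n_paths)                 # jumps in (k dt, (k+1) dt]
    dJ = rng.normal(0.0, 1.0, n_paths) * np.sqrt(n_jumps)    # sum of n_jumps N(0,1) jumps
    X[:, k + 1] = xk + b(xk) * dt + sigma(xk) * dW + g(xk) * dJ

print("E[X_1] ≈", X[:, -1].mean(), "  Var[X_1] ≈", X[:, -1].var())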

4 Markov processes

4.1 Semigroups

Throughout, (Ω, F) is a measurable space.

4.1 Definition A Markov process is a tuple (Xt, Ft, P^x) consisting of

• a filtration (Ft)t≥0,

• an adapted R^d-valued càdlàg process (Xt, Ft)t≥0,

• a family of probability measures P^x, x ∈ R^d, such that the Markov property

    E^x( u(X_t) | F_s ) = E^{X_s} u(X_{t−s})   P^x-a.s.   (4.1)

holds for any s ≤ t, x ∈ R^d and u ∈ B_b(R^d). (Implicitly: everything is measurable.) We call

    P_t u(x) := E^x u(X_t),   t ≥ 0, x ∈ R^d, u ∈ B_b(R^d),

the semigroup associated with (Xt)t≥0.

4.2 Proposition Let (Pt)t≥0 be the semigroup associated with a Markov process (Xt)t≥0.

(i). P_{t+s} = P_t P_s for all s, t ≥ 0 (semigroup property)

(ii). 0 ≤ P_t u ≤ 1 for any 0 ≤ u ≤ 1 (sub-Markov property)

(iii). P_t 1 = 1 (conservative)

Proof. (i). Fix u ∈ B_b(R^d). By the Markov property (4.1) and the tower property of conditional expectation, we have

    P_{t+s} u(x) = E^x u(X_{t+s}) = E^x E^x( u(X_{t+s}) | F_t ) = E^x E^{X_t} u(X_s) = P_t P_s u(x).

(ii). Clear from P_t u(x) = E^x u(X_t).

(iii). P_t 1(x) = E^x 1_{R^d}(X_t) = 1 (since we assume X_t ∈ R^d for all t ≥ 0).
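Numerical aside: for Brownian motion the semigroup is a Gaussian convolution, P_t u(x) = E^x u(B_t), and the semigroup property of Proposition 4.2 can be checked by simple quadrature. Grid, truncation and the test function below are ad-hoc choices.

import numpy as np

# For Brownian motion, P_t u(x) = E^x u(B_t) is convolution with a Gaussian.
# We check the semigroup property P_{t+s} u = P_t (P_s u) by simple quadrature.
y = np.linspace(-15.0, 15.0, 3001)
dy = y[1] - y[0]
u = np.exp(-y**2) * np.cos(y)                  # illustrative test function

def P(t, f):
    kernel = np.exp(-(y[:, None] - y[None, :])**2 / (2 * t)) / np.sqrt(2 * np.pi * t)
    return kernel @ f * dy                     # (P_t f)(y_i) for every grid point y_i

s, t = 0.3, 0.7
print("sup |P_{t+s}u - P_t P_s u| ≈", np.max(np.abs(P(s + t, u) - P(t, P(s, u)))))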

4.3 Definition A Markov process (Xt)t≥0 is called a Feller process if the associated semigroup (Pt)t≥0 is a Feller semigroup, i.e.

(i). P_t f ∈ C_∞(R^d) for any t ≥ 0, f ∈ C_∞(R^d) (Feller property),

(ii). ‖P_t f − f‖_∞ → 0 as t → 0 for any f ∈ C_∞(R^d) (strong continuity at t = 0).

4.4 Example Any Lévy process, i.e. any process with stationary and independent increments, is a Feller process. Examples: Brownian motion, Poisson process, ...

4.5 Example Let (Bt)t≥0 be a Brownian motion and let b, σ be bounded continuous mappings. If the SDE

    dXt = b(Xt) dt + σ(Xt) dBt,   X0 = x,

has a unique weak solution (X_t^x)_{t≥0} for any x ∈ R^d, then (Xt)t≥0 is a Feller process.

Remark (i). The assumption on boundedness of b, σ can be relaxed, see e.g. [1, Theorem 3.6].

(ii). Example 4.5 remains valid if we replace (Bt)t≥0 by a Lévy process, see [7] for a detailed discussion.

4.2 Infinitesimal generator

Aim: Characterize the semigroup (Pt)t≥0 (and hence the Feller process (Xt)t≥0) via some (unbounded) operator. Idea: The semigroup property P_{t+s} = P_t P_s, s, t ≥ 0, is an "operator version" of the Cauchy–Abel functional equation

    φ(t+s) = φ(t) φ(s),   s, t ≥ 0,   φ continuous.   (4.2)

Any solution to (4.2) is of the form φ(t) = e^{ta} for some constant a. Hence, we expect

    P_t = e^{tA},   t ≥ 0,   (4.3)

for some operator A. How to make sense of (4.3)?

4.6 Definition Let (Xt)t≥0 be a Feller process with semigroup (Pt)t≥0. The operator defined by

    D(A) := { u ∈ C_∞(R^d); ∃ g ∈ C_∞(R^d) : lim_{t→0} ‖ (P_t u − u)/t − g ‖_∞ = 0 },
    Au := lim_{t→0} (P_t u − u)/t,   u ∈ D(A),

is called the infinitesimal generator (A, D(A)) of (Xt)t≥0 (and of (Pt)t≥0).

4.7 Proposition Let (Xt)t≥0 be a Feller process with semigroup (Pt)t≥0 and infinitesimal generator (A, D(A)). Then:

(i). D(A) is dense in C_∞(R^d).

(ii). If u ∈ D(A), then P_t u ∈ D(A) for all t ≥ 0 and

    d/dt P_t u = A P_t u = P_t A u,   t > 0.

Remark (i). In general, it is difficult to determine D(A). Typically one tries to find "nice" function spaces contained in D(A), e.g. C_c^∞(R^d) ⊆ D(A).

(ii). D(A) is not "too small" since it is dense in C_∞(R^d).
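Numerical aside to Definition 4.6: for Brownian motion one has Au = ½ u'', and the difference quotients (P_t u − u)/t converge to ½ u'' as t → 0. The sketch below checks this with the same ad-hoc Gaussian quadrature as before and an illustrative u.

import numpy as np

# For Brownian motion Au = (1/2) u''.  The difference quotients (P_t u - u)/t
# of Definition 4.6 should converge (uniformly) to (1/2) u'' as t -> 0.
y = np.linspace(-15.0, 15.0, 3001)
dy = y[1] - y[0]
u = np.exp(-y**2)
Au = 0.5 * (4 * y**2 - 2) * np.exp(-y**2)      # exact (1/2) u''

def P(t, f):
    kernel = np.exp(-(y[:, None] - y[None, :])**2 / (2 * t)) / np.sqrt(2 * np.pi * t)
    return kernel @ f * dy

for t in (0.1, 0.01, 0.001):
    print("t =", t, "  sup |(P_t u - u)/t - Au| ≈",
          np.max(np.abs((P(t, u) - u) / t - Au)))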

Skript: “Connection between Martingale Problems and Markov Processes” by Franziska K¨uhn,IMT Toulouse, August 2019 a osatepcainwrt to w.r.t. expectation constant has proof. of Sketch Proof. for Proof. δ ( problems Proposition martingale 4.8 to processes Markov From 4.3 npoaiitcway: probabilistic in (4.5) Write o n tpigtime stopping any for Corollary 4.9 problem! martingale of glimpse First s akvproperty: Markov Use Frany For (ii). B i n h udmna hoe fcalculus, of theorem fundamental the and (i) By (ii). X x . t s ) t ≤ ≥ s hsshows This 0 t adotoa stopping. optional and 4.8 Proposition Use Fix < ..ne oshow to need i.e. , s(...to (w.r.t. is E ie lim gives 0 x E u ‹ x u Dni’ formula) (Dynkin’s ( ∈ M ( u D X Let t ∈ u ( t E A D ) P S Fix (i). x − F t ) ( ( h u u edt hwthat show to Need . A X s P S ↓ ( 0 ) ∈ ) x τ s t X , = P ouint the to solution a ) t D thlsthat holds it P t with t Au E ) − ( x h \ u A − x − ) u P h ( E ‹ − ) ∈ u E eaFle rcs ihgenerator with process Feller a be X s P u , x E ( M D P Let x t ( x r P AP ‹ x u f t X ( ) ) t u τ u t u ( A s = u dr P ( t = X − t < ∶ ( ) ) X P = P u − t X S τ u − P ∞ x S and t t u u ) = 0 F . Au t ) t − S ( , . u − t = − s X P P s u E  f Hence, . − t x S t S t t x Au = ( ( = (4.6) = ) ) Au 0 P s Au > M x ( − S t eaFle rcs ihgenerator with process Feller a be E E t t A, ) .For 0. Au t Au 0 ds ( u u X x n lim and ( d = ,x = X t ) ( X D ( s = t P E X P ( \ u ( r ≥ 13 X s ( u s ∞ X 0 ) s x ( 0 ) A d s uds Au ds u ( X dr ) ‹ smrigl.B definition, By martingale. is s u ds, r ~ X u ≤ )) s S dtP ) − t ( S ) > t \ h S dr ( mrigl rbe ihiiildistribution initial with problem -martingale X − S 0 ↓ F P = s 0 ,τ 0 0 t s F )) s s = S u S ) ) t P  u s F . s Au Af S 0 t = ) − + − − s t 0 h −  AP S P u u ∈ u h t ( ( E − 0 = ( s AP X X D P − x uds. Au t x t u − s t s u ( ) ‹ Au u ( s ) ( ) A s . − A, S X E ds. u ds ds = ) 0 S X \ s x , D  t P ) ∞ 0 s − . ( t ( s s ∈ Au Au ÐÐ→ A s Au Au R → )) d 0 iia aclto for calculation Similar . ( ( ( t , n let and , X X X 0 h ≥ . r s )) + ( ) 0 h A, dr . dh ) dh D = ! ( x M S A F ∈ s )) u s R  Then . d Then . (4.4) (4.5) (4.6)

Skript: “Connection between Martingale Problems and Markov Processes” by Franziska K¨uhn,IMT Toulouse, August 2019 C hc prtr pera eeaoso elrpoess o ognrtr fFle processes Feller of generators Theorem do 4.12 How processes? Feller of like? generators look as appear operators Which rbes e.g. [ 3]. problems, or 3.10] Remark Proposition 4.14 1, [ see (4.7) of statement rigorous For generally More possibility: Only slowly. too moving is diffusion i.e. role, a play jumps. doesn’t part diffusion Observation: Interpretation: Proposition family 4.13 the from comes part jump The the that problem. Theorem pro- martingale 4.10 Feller to for solution inequality of maximal uniqueness a difficult: e.g. More formula, this of applications interesting cesses. many for [1] See Then Remark 4.11 the Then ( where 4.12: Theorem of Interpretation A, c ∞ Af lim t → • • • • • ( D 0 R ( ( ne hc supin on assumptions which Under characteristics b 4.12) Theorem cf. is assumptions which Under Q ν P d x A ( ( ( ) x ( ) b x ,dy x, )) x ( ( ( ⊆ ) = X x ) A, ( ..frany for i.e. , ∈ D A, ) b ∈ t ν Q , ( R ( D ) ( ∈ t R x A D d Let X D ) x ( esr satisfying measure ayoe usin neitneo elrpoessadascae martingale associated and processes Feller of existence on questions open Many hoe 2.21]) Theorem ([1, d ( ) ∇ A ) × s x + mrigl rbe swell-posed. is problem -martingale ∶ − d ([9]) f then ) = )) A , ( A ν , ( ν oiiesemidefinite positive X D mrigl rbe swell-posed. is problem -martingale x ) ( ( ) ( ( ) t ,A x, ,dy x, Let ) = b A = + A t ( X ≥ sae)poaiiyt oevr ucl from quickly very move to probability (scaled) ) f x ieiodta rcs up urn position current jumps process that likelihood S 0 lim t 2 1 C ) → ) t sacr for core a is ∈ tr c eaFle rcs ihgenerator with process Feller a be ∞ ( )) Q , = 0 ≈ X D ( ( P , ieiodta rcs up from jumps process that likelihood R S Q ( t ( d x x ) 0 x A ) ( ( t t ) ∈ x ≥ X ) a ersnaino h form the of representation a has ν , b 0 ) R ( hr is there t Let ∫ ∇ C X eaFle rcs ihcharacteristics with process Feller a be ( d ∈ t 2 ,dy x, c min steso-called the is , ∞ s x f ( − ( ( b ( ( ) + R ( A, ( x X { ds x ν A d )) )) 1 ) t ( ) D ( , ) ) Q , ,dy x, ? + + S f t oefrtegnrtr ipratfrwell-posedness, for (important generator? the for core a ( y ≥ = n A S S 0 S ( 2 ) ν 0 x )) } y n eaFle rcs ihgenerator with process Feller a be )) ( t ≠ ) ∈ ν ,A x, 14 sos nparticular, in shows, 4.10 Theorem therefore and , 0 ν , N » ( : ( ,dy x, f uhthat such Q ( ,dy x, ( ) ( x characteristics A , X + ) s − y )) < ) ) ∞ dB − osteeeitaFle rcs with process Feller a exist there does . ( Y f ∈ A, s f ( B n + x D ( X − ) uppr (4.7) part jump R ( − 0 f d A = Y y ƒ{ ossigof consisting )) ∞ ∇ ⋅ x 0 Let . → X }) X to f 0 s . ( and 0 − x x = ) to ( + D x 1 b A ( ( noteset the into X ⊆ 0 x , Y ) D 1 s Af Q , − ) (S ( + A ( n y ( A, A S)) ) x − ) eacr of core a be D Af ν , ν ( ( ( A ,dy x, Y x ,dy x, ∞ )) + → If . A ) )) 0. .

References

[1] Böttcher, B., Schilling, R. L., Wang, J.: Lévy-Type Processes: Construction, Approximation and Sample Path Properties. Springer, 2014.

[2] Cherny, A. S., Engelbert, H. J.: Singular Stochastic Differential Equations. Springer, 2005.

[3] Cinlar, E., Jacod, J.: Representation of Markov processes in terms of Wiener processes and Poisson random measures. In: Seminar on Stochastic Processes, Birkhäuser, 1981, pp. 159–242.

[4] Ethier, S. N., Kurtz, T. G.: Markov Processes: Characterization and Convergence. Wiley, 1986.

[5] Hoh, W.: Pseudo-Differential Operators Generating Markov Processes. Habilitationsschrift, Universität Bielefeld, Bielefeld 1998.

[6] Karatzas, I., Shreve, S. E.: Brownian Motion and Stochastic Calculus. Springer, 1991.

[7] Kühn, F.: Solutions of Lévy-driven SDEs with unbounded coefficients as Feller processes. Proc. Amer. Math. Soc. 146 (2018), 3591–3604.

[8] Kühn, F.: Existence of (Markovian) solutions to martingale problems associated with Lévy-type operators. Preprint, arXiv:1803.05646.

[9] Kühn, F., Schilling, R. L.: On the domain of fractional Laplacians and related generators of Feller processes. J. Funct. Anal. 276 (2019), 2397–2439.

[10] Kurtz, T. G.: Equivalence of Stochastic Equations and Martingale Problems. In: D. Crisan (ed.), Stochastic Analysis 2010. Springer, 2011, pp. 113–130.

[11] Schilling, R. L., Partzsch, L.: Brownian Motion: An Introduction to Stochastic Processes. De Gruyter, 2014 (2nd edition).
