Martingale Problems and Stochastic Equations for Markov Processes


Martingale problems and stochastic equations for Markov processes

• Review of basic material on stochastic processes
• Characterization of stochastic processes by their martingale properties
• Weak convergence of stochastic processes
• Stochastic equations for general Markov processes in R^d
• Martingale problems for Markov processes
• Forward equations and operator semigroups
• Equivalence of martingale problems and stochastic differential equations
• Change of measure
• Filtering
• Averaging
• Control
• Exercises
• Glossary
• References

1. Review of basic material on stochastic processes

• Filtrations
• Stopping times
• Martingales
• Optional sampling theorem
• Doob's inequalities
• Stochastic integrals
• Local martingales
• Semimartingales
• Computing quadratic variations
• Covariation
• Itô's formula

Conventions and caveats

State spaces are always complete, separable metric spaces (sometimes called Polish spaces), usually denoted (E, r). All probability spaces are complete. All identities involving conditional expectations (or conditional probabilities) only hold almost surely (even when I don't say so). If the filtration {Ft} involved is obvious, I will say adapted rather than {Ft}-adapted, stopping time rather than {Ft}-stopping time, etc. All processes are cadlag (right continuous with left limits at each t > 0), unless otherwise noted. A process is real-valued if that is the only way the formula makes sense.
References

Kurtz, Lecture Notes for Math 735. http://www.math.wisc.edu/~kurtz/m735.htm
Seppalainen, Basics of Stochastic Analysis. http://www.math.wisc.edu/~seppalai/sa-book/etusivu.html
Ethier and Kurtz, Markov Processes: Characterization and Convergence
Protter, Stochastic Integration and Differential Equations, Second Edition

Filtrations

(Ω, F, P) a probability space. Available information is modeled by a sub-σ-algebra of F; Ft is the information available at time t. {Ft} is a filtration: t < s implies Ft ⊂ Fs.

A stochastic process X is adapted to {Ft} if X(t) is Ft-measurable for each t ≥ 0.

An E-valued stochastic process X adapted to {Ft} is {Ft}-Markov if

E[f(X(t + r))|Ft] = E[f(X(t + r))|X(t)], t, r ≥ 0, f ∈ B(E).

An R-valued stochastic process M adapted to {Ft} is an {Ft}-martingale if

E[M(t + r)|Ft] = M(t), t, r ≥ 0.

Stopping times

τ is an {Ft}-stopping time if for each t ≥ 0, {τ ≤ t} ∈ Ft. For a stopping time τ,

Fτ = {A ∈ F : {τ ≤ t} ∩ A ∈ Ft, t ≥ 0}.

Exercise 1.1
1. Show that Fτ is a σ-algebra.
2. Show that for {Ft}-stopping times σ, τ, σ ≤ τ implies Fσ ⊂ Fτ. In particular, Fτ∧t ⊂ Ft.
3. Let τ be a discrete {Ft}-stopping time satisfying {τ < ∞} = ∪_{k=1}^∞ {τ = tk} = Ω. Show that Fτ = σ{A ∩ {τ = tk} : A ∈ Ftk, k = 1, 2, ...}.
4. Show that the minimum of two stopping times is a stopping time and that the maximum of two stopping times is a stopping time.

Examples and properties

Define Ft+ ≡ ∩s>t Fs. {Ft} is right continuous if Ft = Ft+ for all t ≥ 0. If {Ft} is right continuous, then τ is a stopping time if and only if {τ < t} ∈ Ft for all t > 0.
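The martingale definition above can be checked numerically in a simple case. The sketch below (process choice, parameters, and seed are my own illustration, not from the notes) uses the compensated Poisson process M(t) = N(t) − λt: since N(t + r) − N(t) is Poisson(λr) and independent of Ft, the increments of M have mean zero, which is the unconditional shadow of E[M(t + r)|Ft] = M(t).

```python
import numpy as np

rng = np.random.default_rng(0)

# Compensated Poisson process M(t) = N(t) - lam*t.  Its increments
# M(t+r) - M(t) = (N(t+r) - N(t)) - lam*r have mean zero, consistent
# with the martingale property E[M(t+r)|Ft] = M(t).
lam, t, r, n_paths = 2.0, 1.0, 0.5, 100_000

# N(t+r) - N(t) is Poisson(lam*r), independent of Ft.
increments = rng.poisson(lam * r, size=n_paths) - lam * r

print(abs(increments.mean()))  # close to 0
```

The sample mean of 100,000 independent increments is within a few multiples of 1/√100000 of zero, as the martingale property predicts.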
If K ⊂ E is closed, τK = inf{t : X(t) or X(t−) ∈ K} is a stopping time, but inf{t : X(t) ∈ K} may not be; however, if {Ft} is right continuous and complete, then for any B ∈ B(E), τB = inf{t : X(t) ∈ B} is an {Ft}-stopping time. This result is a special case of the debut theorem, a very technical result from set theory. Note that

{ω : τB(ω) < t} = {ω : ∃s < t such that X(s, ω) ∈ B} = projΩ{(s, ω) : X(s, ω) ∈ B, s < t}.

Piecewise constant approximations

For ε > 0, let τ^ε_0 = 0 and τ^ε_{i+1} = inf{t > τ^ε_i : r(X(t), X(τ^ε_i)) ∨ r(X(t−), X(τ^ε_i)) ≥ ε}. Define X^ε(t) = X(τ^ε_i), τ^ε_i ≤ t < τ^ε_{i+1}. Then r(X(t), X^ε(t)) ≤ ε. If X is adapted to {Ft}, then the {τ^ε_i} are {Ft}-stopping times and X^ε is {Ft}-adapted.

Martingales

An R-valued stochastic process M adapted to {Ft} is an {Ft}-martingale if E[M(t + r)|Ft] = M(t), t, r ≥ 0.

Every martingale has finite quadratic variation:

[M]t = lim Σ_i (M(t ∧ t_{i+1}) − M(t ∧ t_i))^2,

where 0 = t_0 < t_1 < ···, t_i → ∞, and the limit is in probability as max(t_{i+1} − t_i) → 0. More precisely, for ε > 0 and t0 > 0,

lim P{ sup_{t≤t0} |[M]t − Σ_i (M(t ∧ t_{i+1}) − M(t ∧ t_i))^2| > ε } = 0.

For standard Brownian motion W, [W]t = t.

Exercise 1.2 Let N be a Poisson process with parameter λ. Then M(t) = N(t) − λt is a martingale. Compute [M]t.

Optional sampling theorem

A real-valued process X is a submartingale if E[|X(t)|] < ∞, t ≥ 0, and E[X(t + s)|Ft] ≥ X(t), t, s ≥ 0. If τ1 and τ2 are stopping times, then

E[X(t ∧ τ2)|Fτ1] ≥ X(t ∧ τ1 ∧ τ2).

If τ2 is finite a.s., E[|X(τ2)|] < ∞, and lim_{t→∞} E[|X(t)| 1{τ2 > t}] = 0, then

E[X(τ2)|Fτ1] ≥ X(τ1 ∧ τ2).

Square integrable martingales

Let M be a martingale satisfying E[M(t)^2] < ∞. Then M(t)^2 − [M]t is a martingale. In particular, for t > s,

E[(M(t) − M(s))^2] = E[[M]t − [M]s].

Doob's inequalities

Let X be a submartingale.
Then for x > 0,

P{ sup_{s≤t} X(s) ≥ x } ≤ x^{−1} E[X(t)^+],
P{ inf_{s≤t} X(s) ≤ −x } ≤ x^{−1} (E[X(t)^+] − E[X(0)]).

If X is nonnegative and α > 1, then

E[ sup_{s≤t} X(s)^α ] ≤ (α/(α − 1))^α E[X(t)^α].

Note that by Jensen's inequality, if M is a martingale, then |M| is a submartingale. In particular, if M is a square integrable martingale, then

E[ sup_{s≤t} |M(s)|^2 ] ≤ 4 E[M(t)^2].

Stochastic integrals

Definition 1.3 For cadlag processes X, Y,

X_− · Y(t) ≡ ∫_0^t X(s−) dY(s) = lim_{max|t_{i+1}−t_i|→0} Σ_i X(t_i)(Y(t_{i+1} ∧ t) − Y(t_i ∧ t)),

whenever the limit exists in probability.

Sample paths of bounded variation: If Y is a finite variation process, the stochastic integral exists (apply the dominated convergence theorem) and

∫_0^t X(s−) dY(s) = ∫_{(0,t]} X(s−) αY(ds),

where αY is the signed measure with αY(0, t] = Y(t) − Y(0).

Existence for square integrable martingales

If M is a square integrable martingale, then

E[(M(t + s) − M(t))^2 | Ft] = E[[M]_{t+s} − [M]_t | Ft].

For partitions {t_i} and {r_i},

E[ ( Σ_i X(t_i)(M(t_{i+1} ∧ t) − M(t_i ∧ t)) − Σ_i X(r_i)(M(r_{i+1} ∧ t) − M(r_i ∧ t)) )^2 ]
  = E[ ∫_0^t (X(t(s−)) − X(r(s−)))^2 d[M]s ]
  = E[ ∫_{(0,t]} (X(t(s−)) − X(r(s−)))^2 α[M](ds) ],

where t(s) = t_i for s ∈ [t_i, t_{i+1}) and r(s) = r_i for s ∈ [r_i, r_{i+1}).

Cauchy property

Let X be bounded by a constant. As sup(t_{i+1} − t_i) + sup(r_{i+1} − r_i) → 0, the right side converges to zero by the dominated convergence theorem. Since

M_X^{t_i}(t) ≡ Σ_i X(t_i)(M(t_{i+1} ∧ t) − M(t_i ∧ t))

is a square integrable martingale, Doob's inequality gives

E[ sup_{t≤T} ( Σ_i X(t_i)(M(t_{i+1} ∧ t) − M(t_i ∧ t)) − Σ_i X(r_i)(M(r_{i+1} ∧ t) − M(r_i ∧ t)) )^2 ]
  ≤ 4 E[ ∫_{(0,T]} (X(t(s−)) − X(r(s−)))^2 α[M](ds) ].

A completeness argument gives existence of the stochastic integral, and the uniformity implies the integral is cadlag.
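The quadratic variation identity [W]t = t for standard Brownian motion can be seen on a simulated path. In this sketch (discretization, horizon, and seed are my own choices, not from the notes), the sum of squared increments over a fine partition of [0, 1] approximates [W]_1 = 1.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate Brownian increments on a fine partition of [0, 1] and sum
# their squares: this approximates the quadratic variation [W]_1 = 1.
n, t = 200_000, 1.0
dW = rng.normal(0.0, np.sqrt(t / n), size=n)   # W(t_{i+1}) - W(t_i)

qv = np.sum(dW ** 2)   # sum of squared increments
print(qv)              # close to t = 1
```

The variance of this estimator is 2t²/n, so with n = 200000 steps the sum sits within a few thousandths of 1.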
Local martingales

Definition 1.4 M is a local martingale if there exist stopping times {τn} satisfying τ1 ≤ τ2 ≤ ··· and τn → ∞ a.s. such that M^{τn}, defined by M^{τn}(t) = M(τn ∧ t), is a martingale. M is a local square-integrable martingale if the τn can be selected so that M^{τn} is square integrable.

{τn} is called a localizing sequence for M.

Remark 1.5 If {τn} is a localizing sequence for M, and {γn} is another sequence of stopping times satisfying γ1 ≤ γ2 ≤ ···, γn → ∞ a.s., then the optional sampling theorem implies that {τn ∧ γn} is localizing.

Local martingales with bounded jumps

Remark 1.6 If M is a continuous local martingale, then τn = inf{t : |M(t)| ≥ n} will be a localizing sequence. More generally, if |∆M(t)| ≤ c for some constant c, then τn = inf{t : |M(t)| ∨ |M(t−)| ≥ n} will be a localizing sequence. Note that |M^{τn}| ≤ n + c, so M is locally square integrable.
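The localizing sequence τn = inf{t : |M(t)| ≥ n} of Remark 1.6 is easy to visualize on a simulated path. The sketch below (a discrete-grid analogue on a Brownian path; grid, horizon, and seed are my own assumptions) computes the first grid times at which |M| reaches the levels n = 1, ..., 5 and confirms they are nondecreasing in n.

```python
import numpy as np

rng = np.random.default_rng(2)

# One Brownian path on [0, 100]; tau(n) below is the discrete-grid
# analogue of tau_n = inf{t : |M(t)| >= n}, a localizing sequence for
# a continuous local martingale.
n_steps, t = 100_000, 100.0
dt = t / n_steps
M = np.concatenate([[0.0], np.cumsum(rng.normal(0.0, np.sqrt(dt), n_steps))])
times = np.linspace(0.0, t, n_steps + 1)

def tau(n):
    """First grid time with |M| >= n (t if the level is never reached)."""
    hit = np.nonzero(np.abs(M) >= n)[0]
    return times[hit[0]] if hit.size else t

taus = [tau(n) for n in range(1, 6)]
print(taus)  # nondecreasing in n
```

Since |M| must pass level n before level n + 1, the sequence is automatically monotone, which is what makes it usable for localization.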
Recommended publications
  • Integral Representations of Martingales for Progressive Enlargements of Filtrations
    INTEGRAL REPRESENTATIONS OF MARTINGALES FOR PROGRESSIVE ENLARGEMENTS OF FILTRATIONS. ANNA AKSAMIT, MONIQUE JEANBLANC AND MAREK RUTKOWSKI. Abstract. We work in the setting of the progressive enlargement G of a reference filtration F through the observation of a random time τ. We study an integral representation property for some classes of G-martingales stopped at τ. In the first part, we focus on the case where F is a Poisson filtration and we establish a predictable representation property with respect to three G-martingales. In the second part, we relax the assumption that F is a Poisson filtration and we assume that τ is an F-pseudo-stopping time. We establish integral representations with respect to some G-martingales built from F-martingales and, under additional hypotheses, we obtain a predictable representation property with respect to two G-martingales. Keywords: predictable representation property, Poisson process, random time, progressive enlargement, pseudo-stopping time. Mathematics Subjects Classification (2010): 60H99. 1. Introduction. We are interested in the stability of the predictable representation property in a filtration enlargement setting: under the postulate that the predictable representation property holds for F and G is an enlargement of F, the goal is to find under which conditions the predictable representation property holds for G. We focus on the progressive enlargement G = (Gt)_{t∈R+} of a reference filtration F = (Ft)_{t∈R+} through the observation of the occurrence of a random time τ, and we assume that the hypothesis (H′) is satisfied, i.e., any F-martingale is a G-semimartingale. It is worth noting that no general result on the existence of a G-semimartingale decomposition after τ of an F-martingale is available in the existing literature, while, for any τ, any F-martingale stopped at time τ is a G-semimartingale with an explicit decomposition depending on the Azéma supermartingale of τ.
  • Conformal Invariance and 2d Statistical Physics
    CONFORMAL INVARIANCE AND 2d STATISTICAL PHYSICS. GREGORY F. LAWLER. Abstract. A number of two-dimensional models in statistical physics are conjectured to have scaling limits at criticality that are in some sense conformally invariant. In the last ten years, the rigorous understanding of such limits has increased significantly. I give an introduction to the models and one of the major new mathematical structures, the Schramm-Loewner Evolution (SLE). 1. Critical Phenomena. Critical phenomena in statistical physics refers to the study of systems at or near the point at which a phase transition occurs. There are many models of such phenomena. We will discuss some discrete equilibrium models that are defined on a lattice. These are measures on paths or configurations where configurations are weighted by their energy E, with a preference for paths of smaller energy. These measures depend on at least one parameter. A standard parameter in physics is β = c/T, where c is a fixed constant, T stands for temperature, and the measure given to a configuration is e^{−βE}. Phase transitions occur at critical values of the temperature corresponding to, e.g., the transition from a gaseous to liquid state. We use β for the parameter, although for understanding the literature it is useful to know that large values of β correspond to "low temperature" and small values of β correspond to "high temperature". Small β (high temperature) systems have weaker correlations than large β (low temperature) systems. In a number of models, there is a critical βc such that qualitatively the system has three regimes: β < βc (high temperature), β = βc (critical) and β > βc (low temperature).
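The Gibbs weighting e^{−βE} described in this abstract can be made concrete on a toy system. The sketch below (the 4-spin Ising chain, coupling, and β values are my own illustrative assumptions, not from the paper) shows that lower-energy configurations receive larger weight and that the preference sharpens as β grows (low temperature).

```python
import itertools
import math

# Energy of a 4-spin Ising chain with nearest-neighbour coupling J = 1:
# E(s) = -sum_i s_i * s_{i+1}.  A configuration s gets Gibbs weight
# e^{-beta * E(s)}, normalized by the partition function Z.
def energy(s):
    return -sum(a * b for a, b in zip(s, s[1:]))

def gibbs(beta):
    configs = list(itertools.product([-1, 1], repeat=4))
    weights = [math.exp(-beta * energy(s)) for s in configs]
    Z = sum(weights)                       # partition function
    return {s: w / Z for s, w in zip(configs, weights)}

for beta in (0.1, 2.0):                    # high vs low temperature
    p = gibbs(beta)
    print(beta, p[(1, 1, 1, 1)])           # weight of the aligned (lowest-energy) state
```

At small β all 16 configurations get nearly equal probability; at large β the aligned configurations dominate, matching the abstract's remark that large-β systems are more strongly correlated.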
  • Superprocesses and McKean-Vlasov Equations with Creation of Mass
    Superprocesses and McKean-Vlasov equations with creation of mass. L. Overbeck, Department of Statistics, University of California, Berkeley, 367 Evans Hall, Berkeley, CA 94720, U.S.A. Abstract. Weak solutions of McKean-Vlasov equations with creation of mass are given in terms of superprocesses. The solutions can be approximated by a sequence of non-interacting superprocesses or by the mean-field of multitype superprocesses with mean-field interaction. The latter approximation is associated with a propagation of chaos statement for weakly interacting multitype superprocesses. Running title: Superprocesses and McKean-Vlasov equations. 1. Introduction. Superprocesses are useful in solving nonlinear partial differential equations of the type ∆f = f^{1+β}, β ∈ (0, 1], cf. [Dy]. We now change the point of view and show how they provide stochastic solutions of nonlinear partial differential equations of McKean-Vlasov type, i.e. we want to find weak solutions of

    ∂/∂t μ_t = Σ_{i,j=1}^d ∂²/(∂x_i ∂x_j) (a_ij(x, μ_t) μ_t) + Σ_{i=1}^d ∂/∂x_i (b_i(x, μ_t) μ_t) + d(x, μ_t) μ_t.   (1.1)

    A weak solution μ = (μ_t) ∈ C([0, T]; M(R^d)) satisfies

    μ_t(f) = μ_0(f) + ∫_0^t μ_s( Σ_{i,j} a_ij ∂²f/(∂x_i ∂x_j) + Σ_i b_i ∂f/∂x_i + d f ) ds.

    Equation (1.1) generalizes McKean-Vlasov equations of two different types. (Supported by a Fellowship of the Deutsche Forschungsgemeinschaft. On leave from the Universität Bonn, Institut für Angewandte Mathematik, Wegelerstr. 6, 53115 Bonn, Germany.)
  • A Stochastic Processes and Martingales
    A. Stochastic Processes and Martingales. A.1 Stochastic Processes. Let I be either N or R+. A stochastic process on I with state space E is a family of E-valued random variables X = {Xt : t ∈ I}. We only consider examples where E is a Polish space. Suppose for the moment that I = R+. A stochastic process is called cadlag if its paths t → Xt are right-continuous (a.s.) and its left limits exist at all points. In this book we assume that every stochastic process is cadlag. We say a process is continuous if its paths are continuous. The above conditions are meant to hold with probability 1 and not to hold pathwise. A.2 Filtration and Stopping Times. The information available at time t is expressed by a σ-subalgebra Ft ⊂ F. An increasing family of σ-algebras {Ft : t ∈ I} is called a filtration. If I = R+, we call a filtration right-continuous if Ft+ := ∩_{s>t} Fs = Ft. If not stated otherwise, we assume that all filtrations in this book are right-continuous. In many books it is also assumed that the filtration is complete, i.e., F0 contains all P-null sets. We do not assume this here because we want to be able to change the measure in Chapter 4. Because the changed measure and P will be singular, it would not be possible to extend the new measure to the whole σ-algebra F. A stochastic process X is called Ft-adapted if Xt is Ft-measurable for all t. If it is clear which filtration is used, we just call the process adapted. The natural filtration {Ft^X} is the smallest right-continuous filtration such that X is adapted.
  • PREDICTABLE REPRESENTATION PROPERTY for PROGRESSIVE ENLARGEMENTS of a POISSON FILTRATION Anna Aksamit, Monique Jeanblanc, Marek Rutkowski
    PREDICTABLE REPRESENTATION PROPERTY FOR PROGRESSIVE ENLARGEMENTS OF A POISSON FILTRATION. Anna Aksamit, Mathematical Institute, University of Oxford, Oxford OX2 6GG, United Kingdom; Monique Jeanblanc, Laboratoire de Mathématiques et Modélisation d'Evry (LaMME), Université d'Evry-Val-d'Essonne, UMR CNRS 8071, 91025 Evry Cedex, France; Marek Rutkowski, School of Mathematics and Statistics, University of Sydney, Sydney, NSW 2006, Australia. 10 December 2015. HAL Id: hal-01249662, https://hal.archives-ouvertes.fr/hal-01249662. Preprint submitted on 2 Jan 2016. Abstract. We study problems related to the predictable representation property for a progressive enlargement G of a reference filtration F through observation of a finite random time τ. We focus on cases where the avoidance property and/or the continuity property for F-martingales do not hold and the reference filtration is generated by a Poisson process.
  • Lecture 19 Semimartingales
    Lecture 19: Semimartingales. 1 of 10. Course: Theory of Probability II. Term: Spring 2015. Instructor: Gordan Zitkovic. Continuous local martingales. While tailor-made for the L²-theory of stochastic integration, martingales in M0^{2,c} do not constitute a large enough class to be ultimately useful in stochastic analysis. It turns out that even the class of all martingales is too small. When we restrict ourselves to processes with continuous paths, a naturally stable family turns out to be the class of so-called local martingales. Definition 19.1 (Continuous local martingales). A continuous adapted stochastic process {Mt}_{t∈[0,∞)} is called a continuous local martingale if there exists a sequence {τn}_{n∈N} of stopping times such that 1. τ1 ≤ τ2 ≤ ... and τn → ∞, a.s., and 2. {Mt^{τn}}_{t∈[0,∞)} is a uniformly integrable martingale for each n ∈ N. In that case, the sequence {τn}_{n∈N} is called the localizing sequence for (or is said to reduce) {Mt}_{t∈[0,∞)}. The set of all continuous local martingales M with M0 = 0 is denoted by M0^{loc,c}. Remark 19.2. 1. There is a nice theory of local martingales which are not necessarily continuous (RCLL), but, in these notes, we will focus solely on the continuous case. In particular, a "martingale" or a "local martingale" will always be assumed to be continuous. 2. While we will only consider local martingales with M0 = 0 in these notes, this assumption is not standard, so we don't put it into the definition of a local martingale.
  • Locally Feller Processes and Martingale Local Problems
    Locally Feller processes and martingale local problems. Mihai Gradinaru and Tristan Haugomat, Institut de Recherche Mathématique de Rennes, Université de Rennes 1, Campus de Beaulieu, 35042 Rennes Cedex, France. {Mihai.Gradinaru,[email protected]}. Abstract: This paper is devoted to the study of a certain type of martingale problems associated to general operators corresponding to processes which have finite lifetime. We analyse several properties and in particular the weak convergence of sequences of solutions for an appropriate Skorokhod topology setting. We point out the Feller-type features of the associated solutions to this type of martingale problem. Then localisation theorems for well-posed martingale problems or for corresponding generators are proved. Key words: martingale problem, Feller processes, weak convergence of probability measures, Skorokhod topology, generators, localisation. MSC2010 Subject Classification: Primary 60J25; Secondary 60G44, 60J35, 60B10, 60J75, 47D07. 1 Introduction. The theory of Lévy-type processes stays an active domain of research during the last two decades. Heuristically, a Lévy-type process X with symbol q : R^d × R^d → C is a Markov process which behaves locally like a Lévy process with characteristic exponent q(a, ·), in a neighbourhood of each point a ∈ R^d. One associates to a Lévy-type process the pseudo-differential operator L given by, for f ∈ C_c^∞(R^d),

    Lf(a) := −∫_{R^d} e^{ia·α} q(a, α) f̂(α) dα, where f̂(α) := (2π)^{−d} ∫_{R^d} e^{−ia·α} f(a) da.

    Does a sequence X^(n) of Lévy-type processes, having symbols qn, converge toward some process, when the sequence of symbols qn converges to a symbol q? What can we say about the sequence X^(n) when the corresponding sequence of pseudo-differential operators Ln converges to an operator L? What could be the appropriate setting when one wants to approximate a Lévy-type process by a family of discrete Markov chains? This is the kind of question which naturally appears when we study Lévy-type processes.
  • Martingale Theory
    CHAPTER 1. Martingale Theory. We review basic facts from martingale theory. We start with discrete-time parameter martingales and proceed to explain what modifications are needed in order to extend the results from discrete time to continuous time. The Doob-Meyer decomposition theorem for continuous semimartingales is stated but the proof is omitted. At the end of the chapter we discuss the quadratic variation process of a local martingale, a key concept in martingale-theory-based stochastic analysis. 1. Conditional expectation and conditional probability. In this section, we review basic properties of conditional expectation. Let (Ω, F, P) be a probability space and G a σ-algebra of measurable events contained in F. Suppose that X ∈ L¹(Ω, F, P), an integrable random variable. There exists a unique random variable Y which has the following two properties: (1) Y ∈ L¹(Ω, G, P), i.e., Y is measurable with respect to the σ-algebra G and is integrable; (2) for any C ∈ G, we have E{X; C} = E{Y; C}. This random variable Y is called the conditional expectation of X with respect to G and is denoted by E{X|G}. The existence and uniqueness of conditional expectation is an easy consequence of the Radon-Nikodym theorem in real analysis. Define two measures on (Ω, G) by μ{C} = E{X; C}, ν{C} = P{C}, C ∈ G. It is clear that μ is absolutely continuous with respect to ν. The conditional expectation E{X|G} is precisely the Radon-Nikodym derivative dμ/dν.
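When G is generated by a finite partition, the conditional expectation described in this chapter summary is concrete: E{X|G} is the partition-cell average of X. A minimal sketch (the six-point sample space and the partition are my own toy choices, not from the book):

```python
import numpy as np

# Toy probability space: 6 equally likely outcomes; G is generated by the
# partition {0,1}, {2,3,4}, {5}.  The conditional expectation Y = E{X|G}
# is constant on each cell and equals the cell average of X, so it
# satisfies the defining identity E{X; C} = E{Y; C} for every C in G.
X = np.array([1.0, 3.0, 2.0, 4.0, 6.0, 5.0])
cells = [[0, 1], [2, 3, 4], [5]]

Y = np.empty_like(X)
for cell in cells:
    Y[cell] = X[cell].mean()      # cell average (uniform probability)

print(Y)                          # constant on each cell

# Defining property checked on each generating cell:
for cell in cells:
    assert np.isclose(X[cell].sum() / 6, Y[cell].sum() / 6)
```

Here Y plays the role of the Radon-Nikodym derivative dμ/dν: on each cell it is the ratio of the μ-mass E{X; C} to the ν-mass P{C}.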
  • A Guide to Brownian Motion and Related Stochastic Processes
    Vol. 0 (0000). A guide to Brownian motion and related stochastic processes. Jim Pitman and Marc Yor, Dept. Statistics, University of California, 367 Evans Hall # 3860, Berkeley, CA 94720-3860, USA. e-mail: [email protected]. arXiv:1802.09679v1 [math.PR] 27 Feb 2018. Abstract: This is a guide to the mathematical theory of Brownian motion and related stochastic processes, with indications of how this theory is related to other branches of mathematics, most notably the classical theory of partial differential equations associated with the Laplace and heat operators, and various generalizations thereof. As a typical reader, we have in mind a student, familiar with the basic concepts of probability based on measure theory, at the level of the graduate texts of Billingsley [43] and Durrett [106], and who wants a broader perspective on the theory of Brownian motion and related stochastic processes than can be found in these texts. Keywords and phrases: Markov process, random walk, martingale, Gaussian process, Lévy process, diffusion. AMS 2000 subject classifications: Primary 60J65. Contents: 1 Introduction (1.1 History; 1.2 Definitions); 2 BM as a limit of random walks; 3 BM as a Gaussian process (3.1 Elementary transformations; 3.2 Quadratic variation; 3.3 Paley-Wiener integrals; 3.4 Brownian bridges; 3.5 Fine structure of Brownian paths; 3.6 Generalizations: 3.6.1 Fractional BM, 3.6.2 Lévy's BM, 3.6.3 Brownian sheets; 3.7 References); 4 BM as a Markov process (4.1 Markov processes and their semigroups; 4.2 The strong Markov property; 4.3 Generators ...)
  • ENLARGEMENT of FILTRATION and the STRICT LOCAL MARTINGALE PROPERTY in STOCHASTIC DIFFERENTIAL EQUATIONS Aditi Dandapani COLUMBIA
    ENLARGEMENT OF FILTRATION AND THE STRICT LOCAL MARTINGALE PROPERTY IN STOCHASTIC DIFFERENTIAL EQUATIONS. Aditi Dandapani. Submitted in partial fulfillment of the requirements for the degree of Doctor of Philosophy in the Graduate School of Arts and Sciences, COLUMBIA UNIVERSITY, 2016. © 2016 Aditi Dandapani. All Rights Reserved. Abstract. In this thesis, we study the strict local martingale property of solutions of various types of stochastic differential equations and the effect of an initial expansion of the filtration on this property. For the models we consider, we either use existing criteria or, in the case where the stochastic differential equation has jumps, develop new criteria that can detect the presence of the strict local martingale property. We develop deterministic sufficient conditions on the drift and diffusion coefficient of the stochastic process such that an enlargement by initial expansion of the filtration can produce a strict local martingale from a true martingale. We also develop a way of characterizing the martingale property in stochastic volatility models where the local martingale has a general diffusion coefficient of the form μ(St, vt), where the function μ(x1, x2) is locally Lipschitz in x. Contents: Acknowledgements; Introduction; Chapter 1 (0.1 The One-Dimensional Case; 0.2 Stochastic Volatility Models; 0.3 Expansion of the Filtration by Initial Expansions); Chapter 2 (0.4 The Model of Lions and Musiela; 0.5 The Case of Jump Discontinuities; 0.6 The Model of Mijatovic and Urusov ...)
  • Itô's Stochastic Calculus
    Stochastic Processes and their Applications 120 (2010) 622–652. www.elsevier.com/locate/spa. Itô's stochastic calculus: Its surprising power for applications. Hiroshi Kunita, Kyushu University (e-mail address: [email protected]). Available online 1 February 2010. Abstract. We trace Itô's early work in the 1940s, concerning stochastic integrals, stochastic differential equations (SDEs) and Itô's formula. Then we study its developments in the 1960s, combining it with martingale theory. Finally, we review a surprising application of Itô's formula in mathematical finance in the 1970s. Throughout the paper, we treat Itô's jump SDEs driven by Brownian motions and Poisson random measures, as well as the well-known continuous SDEs driven by Brownian motions. © 2010 Elsevier B.V. All rights reserved. MSC: 60-03; 60H05; 60H30; 91B28. Keywords: Itô's formula; Stochastic differential equation; Jump–diffusion; Black–Scholes equation; Merton's equation. 0. Introduction. This paper is written for Kiyosi Itô of blessed memory. Itô's famous work on stochastic integrals and stochastic differential equations started in 1942, when mathematicians in Japan were completely isolated from the world because of the war. During that year, he wrote a colloquium report in Japanese, where he presented basic ideas for the study of diffusion processes and jump–diffusion processes. Most of them were completed and published in English during 1944–1951. Itô's work was observed with keen interest in the 1960s both by probabilists and applied mathematicians. Itô's stochastic integrals based on Brownian motion were extended to those
  • STAT331: Some Key Results for Counting Process Martingales
    STAT331. Some Key Results for Counting Process Martingales. This section develops some key results for martingale processes. We begin by considering the process M(·) := N(·) − A(·), where N(·) is the indicator process of whether an individual has been observed to fail, and A(·) is the compensator process introduced in the last unit. We show that M(·) is a zero-mean martingale. Because it is constructed from a counting process, it is referred to as a counting process martingale. We then introduce the Doob-Meyer decomposition, an important theorem about the existence of compensator processes. We end by defining the quadratic variation process for a martingale, which is useful for describing its covariance function, and give a theorem that shows what this simplifies to when the compensator process is continuous. Recall the definition of a martingale process: Definition: The right-continuous stochastic process X(·), with left-hand limits, is a martingale w.r.t. the filtration (Ft : t ≥ 0) if it is adapted and (a) E|X(t)| < ∞ for all t, and (b) E[X(t + s)|Ft] = X(t) a.s. for all s, t ≥ 0. X(·) is a sub-martingale if the above holds but with "=" in (b) replaced by "≥"; it is called a super-martingale if "=" is replaced by "≤". Let's discuss some aspects of this definition and its consequences: • For the processes we will consider, the left-hand limits of X(·) will always exist. • Processes whose sample paths are a.s. right-continuous with left-hand limits are called cadlag processes, from the French "continu à droite, limite à gauche".
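The zero-mean property of the counting process martingale M = N − A can be checked by simulation. In the sketch below (my own assumptions, not from the course notes: exponential failure times with hazard λ and no censoring), the compensator evaluated at t is A(t) = λ·min(T, t), and the sample mean of M(t) over many subjects should be near zero.

```python
import numpy as np

rng = np.random.default_rng(3)

# For the counting process N(t) = 1{T <= t} with constant hazard lam,
# the compensator is A(t) = lam * min(T, t), and M(t) = N(t) - A(t)
# is a zero-mean martingale.
lam, t, n_subj = 0.5, 2.0, 200_000
T = rng.exponential(1.0 / lam, size=n_subj)   # failure times

N = (T <= t).astype(float)       # failure observed by time t
A = lam * np.minimum(T, t)       # compensator evaluated at t
M = N - A

print(abs(M.mean()))  # close to 0
```

Analytically, E[N(t)] = 1 − e^{−λt} and E[A(t)] = λ·E[min(T, t)] = 1 − e^{−λt} as well, so the two sample means cancel up to Monte Carlo noise.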