Semimartingales and Stochastic Integration Spring 2011
Total Pages: 16
File Type: PDF, Size: 1020 KB
Recommended publications
-
Integral Representations of Martingales for Progressive Enlargements of Filtrations
INTEGRAL REPRESENTATIONS OF MARTINGALES FOR PROGRESSIVE ENLARGEMENTS OF FILTRATIONS
ANNA AKSAMIT, MONIQUE JEANBLANC AND MAREK RUTKOWSKI
Abstract. We work in the setting of the progressive enlargement G of a reference filtration F through the observation of a random time τ. We study an integral representation property for some classes of G-martingales stopped at τ. In the first part, we focus on the case where F is a Poisson filtration and we establish a predictable representation property with respect to three G-martingales. In the second part, we relax the assumption that F is a Poisson filtration and we assume that τ is an F-pseudo-stopping time. We establish integral representations with respect to some G-martingales built from F-martingales and, under additional hypotheses, we obtain a predictable representation property with respect to two G-martingales.
Keywords: predictable representation property, Poisson process, random time, progressive enlargement, pseudo-stopping time
Mathematics Subjects Classification (2010): 60H99
1. Introduction
We are interested in the stability of the predictable representation property in a filtration enlargement setting: under the postulate that the predictable representation property holds for F and G is an enlargement of F, the goal is to find under which conditions the predictable representation property holds for G. We focus on the progressive enlargement G = (G_t)_{t∈R+} of a reference filtration F = (F_t)_{t∈R+} through the observation of the occurrence of a random time τ, and we assume that the hypothesis (H′) is satisfied, i.e., any F-martingale is a G-semimartingale. It is worth noting that no general result on the existence of a G-semimartingale decomposition after τ of an F-martingale is available in the existing literature, while, for any τ, any F-martingale stopped at time τ is a G-semimartingale with an explicit decomposition depending on the Azéma supermartingale of τ.
-
MAST31801 Mathematical Finance II, Spring 2019. Problem Sheet 8 (4-8.4.2019)
UH | Master program Mathematics and Statistics | MAST31801 Mathematical Finance II, Spring 2019. Problem sheet 8 (4-8.4.2019)
1. Show that a local martingale (X_t : t ≥ 0) in a filtration F which is bounded from below is a supermartingale. Hint: use localization together with the Fatou lemma.
2. A non-negative local martingale X_t ≥ 0 is a martingale if and only if E(X_t) = E(X_0) for all t.
3. Recall the Ito formula: for f(x, s) ∈ C^{2,1} and a continuous semimartingale X_t with quadratic covariation ⟨X⟩_t,
f(X_t, t) − f(X_0, 0) − (1/2) ∫_0^t ∂²f/∂x²(X_s, s) d⟨X⟩_s − ∫_0^t ∂f/∂s(X_s, s) ds = ∫_0^t ∂f/∂x(X_s, s) dX_s,
where on the right side we have the Ito integral. We can also understand the Ito integral as the limit in probability of the Riemann sums
∑_{t_k^n ∈ Π^n} ∂f/∂x(X(t_{k−1}^n), t_{k−1}^n) (X(t_k^n ∧ t) − X(t_{k−1}^n ∧ t))
along a sequence of partitions Π^n with ∆(Π^n) → 0. Recalling that the Brownian motion W_t has quadratic variation ⟨W⟩_t = t, write the following Ito integrals explicitly by using the Ito formula:
(a) ∫_0^t W_s^n dW_s, n ∈ N
(b) ∫_0^t exp(σW_s) σ dW_s
(c) ∫_0^t exp(σW_s − σ²s/2) σ dW_s
(d) ∫_0^t sin(σW_s) dW_s
(e) ∫_0^t cos(σW_s) dW_s
4. These Ito integrals, as stochastic processes, are semimartingales. Compute the quadratic variation of the Ito integrals above. Hint: recall that the finite variation part of a semimartingale has zero quadratic variation, and that the quadratic variation is bilinear.
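The closed forms asked for in problem 3 can be sanity-checked numerically. The sketch below is not part of the problem sheet; the grid size, path count and random seed are arbitrary choices. It approximates the Ito integral in (a) for n = 1 by its left-point Riemann sums and compares the result with the closed form (W_t² − t)/2 obtained from the Ito formula with f(x) = x²/2; the discrepancy shrinks as the mesh ∆(Π^n) → 0, in line with the limit-in-probability characterization quoted above.

```python
import numpy as np

# Left-point Riemann sums for int_0^t W_s dW_s versus the Ito-formula closed
# form (W_t^2 - t)/2, averaged over many simulated Brownian paths.

rng = np.random.default_rng(0)
t, n_steps, n_paths = 1.0, 5_000, 500
dt = t / n_steps

dW = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))   # Brownian increments
W = np.cumsum(dW, axis=1)                                    # W at t_1, ..., t_n
W_prev = np.hstack([np.zeros((n_paths, 1)), W[:, :-1]])      # W at t_0, ..., t_{n-1}

riemann = np.sum(W_prev * dW, axis=1)       # sum_k W_{t_{k-1}} (W_{t_k} - W_{t_{k-1}})
closed_form = 0.5 * (W[:, -1] ** 2 - t)     # (W_t^2 - t)/2

print("mean absolute error:", np.abs(riemann - closed_form).mean())
```

-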
CONFORMAL INVARIANCE AND 2-d STATISTICAL PHYSICS
CONFORMAL INVARIANCE AND 2-d STATISTICAL PHYSICS
GREGORY F. LAWLER
Abstract. A number of two-dimensional models in statistical physics are conjectured to have scaling limits at criticality that are in some sense conformally invariant. In the last ten years, the rigorous understanding of such limits has increased significantly. I give an introduction to the models and one of the major new mathematical structures, the Schramm-Loewner Evolution (SLE).
1. Critical Phenomena
Critical phenomena in statistical physics refers to the study of systems at or near the point at which a phase transition occurs. There are many models of such phenomena. We will discuss some discrete equilibrium models that are defined on a lattice. These are measures on paths or configurations where configurations are weighted by their energy E with a preference for paths of smaller energy. These measures depend on at least one parameter. A standard parameter in physics is β = c/T, where c is a fixed constant, T stands for temperature and the measure given to a configuration is e^{−βE}. Phase transitions occur at critical values of the temperature corresponding to, e.g., the transition from a gaseous to liquid state. We use β for the parameter although for understanding the literature it is useful to know that large values of β correspond to "low temperature" and small values of β correspond to "high temperature". Small β (high temperature) systems have weaker correlations than large β (low temperature) systems. In a number of models, there is a critical βc such that qualitatively the system has three regimes: β < βc (high temperature), β = βc (critical) and β > βc (low temperature).
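As a toy illustration of this Boltzmann weighting (my own example, not taken from Lawler's article): for a short one-dimensional Ising-type chain whose energy E is minus the sum of products of neighbouring spins, the sketch below enumerates all configurations, weights them by e^{−βE}, and shows that the nearest-neighbour correlation grows with β, in line with the remark that small-β systems have weaker correlations than large-β systems.

```python
import itertools
import numpy as np

# Boltzmann measure on configurations of a small 1-d Ising-type chain:
# each configuration gets probability proportional to exp(-beta * E).

def energy(config):
    # E = - sum of products of neighbouring spins (ferromagnetic coupling 1)
    return -sum(a * b for a, b in zip(config, config[1:]))

def nn_correlation(beta, n_spins=8):
    configs = list(itertools.product([-1, 1], repeat=n_spins))
    weights = np.array([np.exp(-beta * energy(c)) for c in configs])
    weights /= weights.sum()                      # normalize to a probability measure
    s0s1 = np.array([c[0] * c[1] for c in configs])
    return float(np.dot(weights, s0s1))           # <s_0 s_1> under e^{-beta E}

for beta in (0.1, 0.5, 1.0, 2.0):
    print(f"beta = {beta:4.1f}   <s_0 s_1> = {nn_correlation(beta):.3f}")
```

-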
Applied Probability
ISSN 1050-5164 (print), ISSN 2168-8737 (online)
THE ANNALS OF APPLIED PROBABILITY
AN OFFICIAL JOURNAL OF THE INSTITUTE OF MATHEMATICAL STATISTICS
Articles
Optimal control of branching diffusion processes: A finite horizon problem . . . JULIEN CLAISSE 1
Change point detection in network models: Preferential attachment and long range dependence . . . SHANKAR BHAMIDI, JIMMY JIN AND ANDREW NOBEL 35
Ergodic theory for controlled Markov chains with stationary inputs . . . YUE CHEN, ANA BUŠIĆ AND SEAN MEYN 79
Nash equilibria of threshold type for two-player nonzero-sum games of stopping . . . TIZIANO DE ANGELIS, GIORGIO FERRARI AND JOHN MORIARTY 112
Local inhomogeneous circular law . . . JOHANNES ALT, LÁSZLÓ ERDŐS AND TORBEN KRÜGER 148
Diffusion approximations for controlled weakly interacting large finite state systems with simultaneous jumps . . . AMARJIT BUDHIRAJA AND ERIC FRIEDLANDER 204
Duality and fixation in -Wright–Fisher processes with frequency-dependent selection . . . ADRIÁN GONZÁLEZ CASANOVA AND DARIO SPANÒ 250
Combinatorial Lévy processes . . . HARRY CRANE 285
Eigenvalue versus perimeter in a shape theorem for self-interacting random walks . . . MAREK BISKUP AND EVIATAR B. PROCACCIA 340
Volatility and arbitrage . . . E. ROBERT FERNHOLZ, IOANNIS KARATZAS AND JOHANNES RUF 378
A Skorokhod map on measure-valued paths with applications to priority queues . . . RAMI ATAR, ANUP BISWAS, HAYA KASPI AND KAVITA RAMANAN 418
BSDEs with mean reflection . . . PHILIPPE BRIAND, ROMUALD ELIE AND YING HU 482
Limit theorems for integrated local empirical characteristic exponents from noisy high-frequency data with application to volatility and jump activity estimation . . . JEAN JACOD AND VIKTOR TODOROV 511
Disorder and wetting transition: The pinned harmonic crystal in dimension three or larger . . . GIAMBATTISTA GIACOMIN AND HUBERT LACOIN 577
Law of large numbers for the largest component in a hyperbolic model of complex networks . . . NIKOLAOS FOUNTOULAKIS AND TOBIAS MÜLLER 607
Vol.
-
Black-Scholes Model
Chapter 5
Black-Scholes Model
Copyright © 2008-2012 Hyeong In Choi, All rights reserved.
5.1 Modeling the Stock Price Process
One of the basic building blocks of the Black-Scholes model is the stock price process. In particular, in the Black-Scholes world, the stock price process, denoted by S_t, is modeled as a geometric Brownian motion satisfying the following stochastic differential equation:
dS_t = S_t(µ dt + σ dW_t),
where µ and σ are constants called the drift and volatility, respectively. Also, σ is always assumed to be a positive constant. In order to see if such a model is "reasonable," let us look at the price data of the KOSPI 200 index. In particular, Figure 5.1 depicts the KOSPI 200 index from January 2, 2001 till May 18, 2011. If it is reasonable to model it as a geometric Brownian motion, we must have
∆S_t / S_t ≈ µ ∆t + σ ∆W_t.
In particular, if we let ∆t = h (so that ∆S_t = S_{t+h} − S_t and ∆W_t = W_{t+h} − W_t), the above approximate equality becomes
(S_{t+h} − S_t) / S_t ≈ µh + σ(W_{t+h} − W_t).
Since σ(W_{t+h} − W_t) ∼ N(0, σ²h), the histogram of (S_{t+h} − S_t)/S_t should resemble a normal distribution with standard deviation σ√h. (It is called the h-day volatility.) To see it is indeed the case we have plotted the histograms for h = 1, 2, 3, 5, 7 (days) in Figures 5.2 up to 5.
Figure 5.1: KOSPI 200 from 2001.
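The check described in this excerpt can be reproduced on synthetic data. The sketch below is an illustration rather than the author's code: µ, σ and the sample period are made-up values, and a simulated geometric Brownian motion stands in for the KOSPI 200 series. It compares the sample standard deviation of the h-day relative returns with σ√h for h = 1, 2, 3, 5, 7.

```python
import numpy as np

# Simulate a geometric Brownian motion dS = S (mu dt + sigma dW) and compare the
# sample standard deviation of the h-day relative returns (S_{t+h}-S_t)/S_t with
# sigma * sqrt(h), where h is expressed in years as h*dt below.

rng = np.random.default_rng(1)
mu, sigma = 0.05, 0.20        # assumed drift and volatility per year (made-up values)
dt = 1.0 / 252                # one trading day in years
n_days = 252 * 10             # ten years of daily data

# Exact GBM update: S_{t+dt} = S_t * exp((mu - sigma^2/2) dt + sigma dW)
dW = rng.normal(0.0, np.sqrt(dt), n_days)
S = 100.0 * np.exp(np.cumsum((mu - 0.5 * sigma**2) * dt + sigma * dW))

for h in (1, 2, 3, 5, 7):
    rel = (S[h:] - S[:-h]) / S[:-h]      # h-day relative returns
    print(f"h = {h}: sample std = {rel.std():.4f}, "
          f"sigma*sqrt(h*dt) = {sigma * np.sqrt(h * dt):.4f}")
```

-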
Superprocesses and McKean-Vlasov Equations with Creation of Mass
Superprocesses and McKean-Vlasov equations with creation of mass
L. Overbeck
Department of Statistics, University of California, Berkeley, 367 Evans Hall, Berkeley, CA 94720, U.S.A.†
Abstract
Weak solutions of McKean-Vlasov equations with creation of mass are given in terms of superprocesses. The solutions can be approximated by a sequence of non-interacting superprocesses or by the mean-field of multitype superprocesses with mean-field interaction. The latter approximation is associated with a propagation of chaos statement for weakly interacting multitype superprocesses.
Running title: Superprocesses and McKean-Vlasov equations.
1 Introduction
Superprocesses are useful in solving nonlinear partial differential equations of the type ∆f = f^{1+β}, β ∈ (0, 1], cf. [Dy]. We now change the point of view and show how they provide stochastic solutions of nonlinear partial differential equations of McKean-Vlasov type, i.e. we want to find weak solutions of
∂µ_t/∂t = ∑_{i,j=1}^d ∂²/(∂x_i ∂x_j) [a_{ij}(x, µ_t) µ_t] + ∑_{i=1}^d ∂/∂x_i [b_i(x, µ_t) µ_t] + d(x, µ_t) µ_t.   (1.1)
A weak solution µ = (µ_t) ∈ C([0, T]; M(R^d)) satisfies
⟨µ_t, f⟩ = ⟨µ_0, f⟩ + ∫_0^t ⟨µ_s, ∑_{i,j} a_{ij} ∂²f/(∂x_i ∂x_j) + ∑_i b_i ∂f/∂x_i + d f⟩ ds.
Equation (1.1) generalizes McKean-Vlasov equations of two different types.
† Supported by a Fellowship of the Deutsche Forschungsgemeinschaft. On leave from the Universität Bonn, Institut für Angewandte Mathematik, Wegelerstr. 6, 53115 Bonn, Germany.
-
Stochastic Differential Equations
Stochastic Differential Equations
Stefan Geiss
November 25, 2020
Contents
1 Introduction 5
  1.0.1 Wolfgang Döblin 6
  1.0.2 Kiyoshi Itô 7
2 Stochastic processes in continuous time 9
  2.1 Some definitions 9
  2.2 Two basic examples of stochastic processes 15
  2.3 Gaussian processes 17
  2.4 Brownian motion 31
  2.5 Stopping and optional times 36
  2.6 A short excursion to Markov processes 41
3 Stochastic integration 43
  3.1 Definition of the stochastic integral 44
  3.2 Itô's formula 64
  3.3 Proof of Itô's formula in a simple case 76
  3.4 For extended reading 80
    3.4.1 Local time 80
    3.4.2 Three-dimensional Brownian motion is transient 83
4 Stochastic differential equations 87
  4.1 What is a stochastic differential equation? 87
  4.2 Strong Uniqueness of SDE's 90
  4.3 Existence of strong solutions of SDE's 94
  4.4 Theorems of Lévy and Girsanov 98
  4.5 Solutions of SDE's by a transformation of drift 103
  4.6 Weak solutions 105
  4.7 The Cox-Ingersoll-Ross SDE 108
  4.8 The martingale representation theorem 116
5 BSDEs 121
  5.1 Introduction 121
  5.2 Setting 122
  5.3 A priori estimate 123
Chapter 1
Introduction
One goal of the lecture is to study stochastic differential equations (SDE's). So let us start with a (hopefully) motivating example: Assume that X_t is the share price of a company at time t ≥ 0 where we assume without loss of generality that X_0 := 1.
-
A Stochastic Processes and Martingales
A Stochastic Processes and Martingales
A.1 Stochastic Processes
Let I be either N or R_+. A stochastic process on I with state space E is a family of E-valued random variables X = {X_t : t ∈ I}. We only consider examples where E is a Polish space. Suppose for the moment that I = R_+. A stochastic process is called cadlag if its paths t → X_t are right-continuous (a.s.) and its left limits exist at all points. In this book we assume that every stochastic process is cadlag. We say a process is continuous if its paths are continuous. The above conditions are meant to hold with probability 1 and not to hold pathwise.
A.2 Filtration and Stopping Times
The information available at time t is expressed by a σ-subalgebra F_t ⊂ F. An increasing family of σ-algebras {F_t : t ∈ I} is called a filtration. If I = R_+, we call a filtration right-continuous if F_{t+} := ∩_{s>t} F_s = F_t. If not stated otherwise, we assume that all filtrations in this book are right-continuous. In many books it is also assumed that the filtration is complete, i.e., F_0 contains all P-null sets. We do not assume this here because we want to be able to change the measure in Chapter 4. Because the changed measure and P will be singular, it would not be possible to extend the new measure to the whole σ-algebra F.
A stochastic process X is called F_t-adapted if X_t is F_t-measurable for all t. If it is clear which filtration is used, we just call the process adapted. The natural filtration {F_t^X} is the smallest right-continuous filtration such that X is adapted.
-
PREDICTABLE REPRESENTATION PROPERTY for PROGRESSIVE ENLARGEMENTS of a POISSON FILTRATION Anna Aksamit, Monique Jeanblanc, Marek Rutkowski
PREDICTABLE REPRESENTATION PROPERTY FOR PROGRESSIVE ENLARGEMENTS OF A POISSON FILTRATION
Anna Aksamit, Monique Jeanblanc, Marek Rutkowski
To cite this version: Anna Aksamit, Monique Jeanblanc, Marek Rutkowski. PREDICTABLE REPRESENTATION PROPERTY FOR PROGRESSIVE ENLARGEMENTS OF A POISSON FILTRATION. 2015. hal-01249662
HAL Id: hal-01249662, https://hal.archives-ouvertes.fr/hal-01249662. Preprint submitted on 2 Jan 2016.
Anna Aksamit: Mathematical Institute, University of Oxford, Oxford OX2 6GG, United Kingdom
Monique Jeanblanc: Laboratoire de Mathématiques et Modélisation d'Évry (LaMME), Université d'Évry-Val-d'Essonne, UMR CNRS 8071, 91025 Évry Cedex, France
Marek Rutkowski: School of Mathematics and Statistics, University of Sydney, Sydney, NSW 2006, Australia
10 December 2015
Abstract
We study problems related to the predictable representation property for a progressive enlargement G of a reference filtration F through observation of a finite random time τ. We focus on cases where the avoidance property and/or the continuity property for F-martingales do not hold and the reference filtration is generated by a Poisson process.
-
Derivatives of Self-Intersection Local Times
Derivatives of self-intersection local times
Jay Rosen†
Department of Mathematics, College of Staten Island, CUNY, Staten Island, NY 10314
e-mail: [email protected]
Summary. We show that the renormalized self-intersection local time γ_t(x) for both the Brownian motion and symmetric stable process in R^1 is differentiable in the spatial variable and that γ′_t(0) can be characterized as the continuous process of zero quadratic variation in the decomposition of a natural Dirichlet process. This Dirichlet process is the potential of a random Schwartz distribution. Analogous results for fractional derivatives of self-intersection local times in R^1 and R^2 are also discussed.
1 Introduction
In their study of the intrinsic Brownian local time sheet and stochastic area integrals for Brownian motion, [14, 15, 16], Rogers and Walsh were led to analyze the functional
A(t, B_t) = ∫_0^t 1_{[0,∞)}(B_t − B_s) ds    (1)
where B_t is a 1-dimensional Brownian motion. They showed that A(t, B_t) is not a semimartingale, and in fact showed that
A(t, B_t) − ∫_0^t L_s^{B_s} dB_s    (2)
has finite non-zero 4/3-variation. Here L_s^x is the local time at x, which is formally L_s^x = ∫_0^s δ(B_r − x) dr, where δ(x) is Dirac's 'δ-function'. A formal application of Ito's lemma, using (d/dx) 1_{[0,∞)}(x) = δ(x) and (d²/dx²) 1_{[0,∞)}(x) = δ′(x), yields
A(t, B_t) − ∫_0^t L_s^{B_s} dB_s = (1/2) t + ∫_0^t ∫_0^s δ′(B_s − B_r) dr ds    (3)
† This research was supported, in part, by grants from the National Science Foundation and PSC-CUNY.
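For intuition about the functional in (1) (a quick simulation of my own, not taken from the paper): A(t, B_t) measures how much time before t the Brownian path spends at or below its terminal level B_t, and it can be approximated on a discrete grid as follows.

```python
import numpy as np

# Discrete approximation of A(t, B_t) = int_0^t 1_{[0,inf)}(B_t - B_s) ds:
# dt times the number of grid times s <= t at which B_s <= B_t.

rng = np.random.default_rng(3)
t, n_steps = 1.0, 100_000
dt = t / n_steps

increments = rng.normal(0.0, np.sqrt(dt), n_steps)
B = np.concatenate([[0.0], np.cumsum(increments)])   # Brownian path on the grid

A = dt * np.count_nonzero(B[:-1] <= B[-1])           # Riemann sum for (1)
print("A(t, B_t) ~", round(A, 4), "with t =", t)
```

-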
Lecture 4: Risk Neutral Pricing 1 Part I: the Girsanov Theorem
SEEM 5670 - Advanced Models in Financial Engineering
Professor: Nan Chen
Oct. 17 & 18, 2017
Lecture 4: Risk Neutral Pricing
1 Part I: The Girsanov Theorem
1.1 Change of Measure and Girsanov Theorem
• Change of measure for a single random variable:
Theorem 1. Let (Ω, F, P) be a sample space and Z be an almost surely nonnegative random variable with E[Z] = 1. For A ∈ F, define
P~(A) = ∫_A Z(ω) dP(ω).
Then P~ is a probability measure. Furthermore, if X is a nonnegative random variable, then E~[X] = E[XZ]. If Z is almost surely strictly positive, we also have E[Y] = E~[Y/Z].
Definition 1. Let (Ω, F) be a sample space. Two probability measures P and P~ on (Ω, F) are said to be equivalent if P(A) = 0 ⇔ P~(A) = 0 for all such A.
Theorem 2 (Radon-Nikodym). Let P and P~ be equivalent probability measures defined on (Ω, F). Then there exists an almost surely positive random variable Z such that E[Z] = 1 and
P~(A) = ∫_A Z(ω) dP(ω)
for A ∈ F. We say Z is the Radon-Nikodym derivative of P~ with respect to P, and we write Z = dP~/dP.
Example 1. Change of measure on Ω = [0, 1].
Example 2. Change of measure for normal random variables.
• We can also perform change of measure for a whole process rather than for a single random variable. Suppose that there is a probability space (Ω, F) and a filtration {F_t, 0 ≤ t ≤ T}. Suppose further that the F_T-measurable Z is an almost surely positive random variable satisfying E[Z] = 1.
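Example 2 is only named in the excerpt; the sketch below works out one standard instance of it (the exponential-tilting choice Z = exp(θX − θ²/2) is an assumption of mine, not taken from the lecture). With X standard normal under P, Z is almost surely positive with E[Z] = 1, so Theorem 1 applies; under P~ the variable X has the N(θ, 1) law, and the Monte Carlo estimates of E~[X] = E[XZ] and E~[X²] = E[X²Z] should come out near θ and 1 + θ².

```python
import numpy as np

# Change of measure for a normal random variable via exponential tilting:
# dP~/dP = Z = exp(theta*X - theta^2/2), with X ~ N(0,1) under P.

rng = np.random.default_rng(2)
theta = 0.7
X = rng.standard_normal(1_000_000)            # samples of X under P
Z = np.exp(theta * X - 0.5 * theta**2)        # Radon-Nikodym derivative

print("E[Z]    ~", round(Z.mean(), 4))             # close to 1
print("E~[X]   ~", round((X * Z).mean(), 4))       # close to theta = 0.7
print("E~[X^2] ~", round((X**2 * Z).mean(), 4))    # close to 1 + theta^2 = 1.49
```

-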
Stochastic Differential Equations with Applications
Stochastic Differential Equations with Applications
December 4, 2012
Contents
1 Stochastic Differential Equations 7
  1.1 Introduction 7
    1.1.1 Meaning of Stochastic Differential Equations 8
  1.2 Some applications of SDEs 10
    1.2.1 Asset prices 10
    1.2.2 Population modelling 10
    1.2.3 Multi-species models 11
2 Continuous Random Variables 13
  2.1 The probability density function 13
    2.1.1 Change of independent deviates 13
    2.1.2 Moments of a distribution 13
  2.2 Ubiquitous probability density functions in continuous finance 14
    2.2.1 Normal distribution 14
    2.2.2 Log-normal distribution 14
    2.2.3 Gamma distribution 14
  2.3 Limit Theorems 15
    2.3.1 The central limit results 17
  2.4 The Wiener process 17
    2.4.1 Covariance 18
    2.4.2 Derivative of a Wiener process 18
    2.4.3 Definitions 18
3 Review of Integration 21
  3.1 Bounded Variation 21
  3.2 Riemann integration 23
  3.3 Riemann-Stieltjes integration 24
  3.4 Stochastic integration of deterministic functions 25
4 The Ito and Stratonovich Integrals 27
  4.1 A simple stochastic differential equation 27
    4.1.1 Motivation for Ito integral 28
  4.2 Stochastic integrals 29
  4.3 The Ito integral 31
    4.3.1 Application 33
  4.4 The Stratonovich integral 33
    4.4.1 Relationship between the Ito and Stratonovich integrals 35
    4.4.2 Stratonovich integration conforms to the classical rules of integration 37
  4.5 Stratonovich representation on an SDE 38
5 Differentiation of functions of stochastic variables 41
  5.1 Ito's Lemma