1. Stochastic Processes and Filtrations


Course outline: 1. Stochastic processes, filtrations · 2. Brownian motion as a weak limit · 3. Gaussian processes · 4. Stopping times · 5. Conditional expectation · 6. Martingales · 7. Discrete stochastic integral · 8. Reflection principle, passage times · 9. Quadratic variation, path properties · 10. Ito integral · 11. Ito's formula

A stochastic process $(X_t)_{t \in T}$ is a collection of random variables on $(\Omega, \mathcal{F})$ with values in a measurable space $(S, \mathcal{S})$, i.e., for all $t$, $X_t \colon \Omega \to S$ is $\mathcal{F}$–$\mathcal{S}$-measurable. In our case

1. $T = \{0, 1, \dots, N\}$ or $T = \mathbb{N} \cup \{0\}$ (discrete time),
2. or $T = [0, \infty)$ (continuous time).

The state space will be $(S, \mathcal{S}) = (\mathbb{R}, \mathcal{B}(\mathbb{R}))$ (or $(S, \mathcal{S}) = (\mathbb{R}^d, \mathcal{B}(\mathbb{R}^d))$ for some $d \ge 2$).

Example 1.1 (Random walk)
Let $\varepsilon_n$, $n \ge 1$, be a sequence of iid random variables with
$$\varepsilon_n = \begin{cases} +1 & \text{with probability } 1/2, \\ -1 & \text{with probability } 1/2. \end{cases}$$
Define $\tilde{X}_0 = 0$ and, for $k = 1, 2, \dots$,
$$\tilde{X}_k = \sum_{i=1}^k \varepsilon_i.$$

For a fixed $\omega \in \Omega$, the function $t \mapsto X_t(\omega)$ is called the (sample) path (or realization) of the stochastic process $X$ associated to $\omega$.

Let $X$ and $Y$ be two stochastic processes on $(\Omega, \mathcal{F})$.

Definition 1.2
(i) $X$ and $Y$ have the same finite-dimensional distributions if, for all $n \ge 1$, all $t_1, \dots, t_n \in [0, \infty)$, and all $A \in \mathcal{B}(\mathbb{R}^n)$,
$$P((X_{t_1}, X_{t_2}, \dots, X_{t_n}) \in A) = P((Y_{t_1}, Y_{t_2}, \dots, Y_{t_n}) \in A).$$
(ii) $X$ and $Y$ are modifications of each other if, for each $t \in [0, \infty)$,
$$P(X_t = Y_t) = 1.$$
(iii) $X$ and $Y$ are indistinguishable if
$$P(X_t = Y_t \text{ for every } t \in [0, \infty)) = 1.$$

Example 1.3

Definition 1.4
A stochastic process $X$ is called continuous (right-continuous) if almost all paths are continuous (right-continuous), i.e.,
$$P(\{\omega : t \mapsto X_t(\omega) \text{ is continuous (right-continuous)}\}) = 1.$$

Proposition 1.5
If $X$ and $Y$ are modifications of each other and both processes are a.s. right-continuous, then $X$ and $Y$ are indistinguishable.

Let $(\Omega, \mathcal{F})$ be given and let $\{\mathcal{F}_t, t \ge 0\}$ be an increasing family of $\sigma$-algebras (i.e., for $s < t$, $\mathcal{F}_s \subseteq \mathcal{F}_t$) such that $\mathcal{F}_t \subseteq \mathcal{F}$ for all $t$. Such a family is called a filtration. For example:

Definition 1.6
The filtration $\mathcal{F}_t^X$ generated by a stochastic process $X$ is given by
$$\mathcal{F}_t^X = \sigma\{X_s,\; s \le t\}.$$

Definition 1.7
A stochastic process is adapted to a filtration $(\mathcal{F}_t)_{t \ge 0}$ if, for each $t \ge 0$,
$$X_t \colon (\Omega, \mathcal{F}_t) \to (\mathbb{R}, \mathcal{B}(\mathbb{R}))$$
is measurable. (Short: $X_t$ is $\mathcal{F}_t$-measurable.)

Definition 1.8 ("The usual conditions")
Let $\mathcal{F}_{t+} = \bigcap_{\varepsilon > 0} \mathcal{F}_{t+\varepsilon}$.
We say that a filtration $(\mathcal{F}_t)_{t \ge 0}$ satisfies the usual conditions if

1. $(\mathcal{F}_t)_{t \ge 0}$ is right-continuous, that is, $\mathcal{F}_{t+} = \mathcal{F}_t$ for all $t \ge 0$;
2. $(\mathcal{F}_t)_{t \ge 0}$ is complete, that is, $\mathcal{F}_0$ contains all subsets of $P$-null sets.

2. Brownian Motion as a Weak Limit of Random Walks

Example 2.1 (Scaled random walk)
For $t_k = k/N$, $k = 0, 1, 2, \dots$, we set
$$\tilde{X}^N_{t_k} := \frac{1}{\sqrt{N}}\, \tilde{X}_k = \frac{1}{\sqrt{N}} \sum_{i=1}^k \varepsilon_i.$$

Definition 2.2 (Brownian motion)
A standard one-dimensional Brownian motion is a continuous adapted process $W$ on $(\Omega, \mathcal{F}, (\mathcal{F}_t)_{t \ge 0}, P)$ with $W_0 = 0$ a.s. such that, for all $0 \le s < t$,

1. $W_t - W_s$ is independent of $\mathcal{F}_s$;
2. $W_t - W_s$ is normally distributed with mean $0$ and variance $t - s$.

Recall the scaled random walk: for $t_k = k/N$, $k = 0, 1, 2, \dots$, we set $\tilde{X}^N_{t_k} := \frac{1}{\sqrt{N}} \tilde{X}_k = \frac{1}{\sqrt{N}} \sum_{i=1}^k \varepsilon_i$. By interpolation we define a continuous process on $[0, \infty)$:
$$X^N_t = \frac{1}{\sqrt{N}} \left( \sum_{i=1}^{[Nt]} \varepsilon_i + (Nt - [Nt])\, \varepsilon_{[Nt]+1} \right) \qquad (2.1)$$

Theorem 2.3 (Donsker)
Let $(\Omega, \mathcal{F}, P)$ be a probability space carrying a sequence of iid random variables with mean $0$ and variance $1$, and define $(X^N_t)_{t \ge 0}$ as in (2.1).
Then $(X^N_t)_{t \ge 0}$ converges weakly to $(W_t)_{t \ge 0}$.

Clarification of the statement:

1. $\tilde{\Omega} = C[0, \infty)$.
2. On $\tilde{\Omega} = C[0, \infty)$ one can construct a probability measure $P^*$ (the Wiener measure) such that the canonical process $(W_t)_{t \ge 0}$ is a Brownian motion. Let $\omega = (\omega(t))_{t \ge 0} \in C[0, \infty)$; then the canonical process is given by $W_t(\omega) := \omega(t)$.
3. $(X^N_t)_{t \ge 0}$ defines a probability measure $P^N$ on $\tilde{\Omega} = C[0, \infty)$. Indeed, $X^N \colon (\Omega, \mathcal{F}) \to (C[0, \infty), \mathcal{B}(C[0, \infty)))$ is measurable; that means we consider the random variable $\omega \mapsto X^N(\omega, \cdot)$, which maps $\omega$ to a continuous function in $t$ (i.e., the path). For $A \in \mathcal{B}(C[0, \infty))$ we set
$$P^N(A) := P(\{\omega : X^N(\omega, \cdot) \in A\}).$$
4. So Donsker's theorem says that on $(\tilde{\Omega}, \tilde{\mathcal{F}}) = (C[0, \infty), \mathcal{B}(C[0, \infty)))$ we have $P^N \xrightarrow{w} P^*$.

Steps of the proof:

1. Show that the finite-dimensional distributions of $X^N$ converge to the finite-dimensional distributions of $W$. That means: show that, for all $n \ge 1$ and all $s_1, \dots, s_n \in [0, \infty)$,
$$(X^N_{s_1}, X^N_{s_2}, \dots, X^N_{s_n}) \xrightarrow{w} (W_{s_1}, W_{s_2}, \dots, W_{s_n}).$$
2. Show that $(P^N)_{N \ge 1}$ is a tight family of probability measures on $(\tilde{\Omega}, \tilde{\mathcal{F}})$.
3. Steps (1) and (2) imply $P^N \xrightarrow{w} P^*$.

Easy partial result of step (1): by the CLT, for fixed $t \in [0, \infty)$, as $N \to \infty$,
$$X^N_t \xrightarrow{w} W_t.$$

Theorem 2.4 (CLT)
Let $(\xi_n)_{n \ge 1}$ be iid with mean $0$ and variance $\sigma^2$. Then, for each $x \in \mathbb{R}$,
$$\lim_{n \to \infty} P\!\left( \frac{1}{\sqrt{n}} \sum_{i=1}^n \xi_i \le x \right) = \Phi_\sigma(x),$$
where $\Phi_\sigma$ is the distribution function of $N(0, \sigma^2)$. In other words,
$$\frac{1}{\sqrt{n}} \sum_{i=1}^n \xi_i \xrightarrow{w} N(0, \sigma^2).$$
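The convergence in Theorems 2.3 and 2.4 is easy to probe numerically. The following sketch (Python, not part of the original notes) simulates the scaled random walk of Example 2.1 at the grid points $t_k = k/N$ — where it agrees with the interpolated process (2.1) — and compares the empirical law of $X^N_1$ with the $N(0,1)$ marginal of Brownian motion at $t = 1$.

```python
import numpy as np

rng = np.random.default_rng(0)

def scaled_walk(N, T=1.0, rng=rng):
    """One path of the scaled random walk at the grid points
    t_k = k/N, 0 <= t_k <= T (values of X^N from (2.1) on the grid)."""
    eps = rng.choice([-1.0, 1.0], size=int(N * T))      # Rademacher steps
    return np.concatenate(([0.0], np.cumsum(eps))) / np.sqrt(N)

# Marginal at t = 1: by CLT/Donsker, X^N_1 should be close to N(0, 1).
N, n_paths = 400, 20000
endpoints = np.array([scaled_walk(N)[-1] for _ in range(n_paths)])

print(endpoints.mean())           # close to 0
print(endpoints.var())            # close to 1
print(np.mean(endpoints <= 0.5))  # roughly Phi(0.5) ~ 0.69
```

Note that $X^N_1$ lives on a lattice of spacing $2/\sqrt{N}$, so for moderate $N$ the empirical CDF differs from $\Phi$ by a small continuity-correction effect; it disappears as $N \to \infty$, which is exactly the content of the weak convergence $X^N_1 \xrightarrow{w} W_1$.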