
5: FILTRATIONS AND CONDITIONING

Marek Rutkowski, School of Mathematics and Statistics, University of Sydney

Semester 2, 2016

New Features

In the case of multi-period market models, we need to deal with two important new features. First, investors can trade in assets at any date t ∈ {0, 1, 2, ..., T}, where T is the horizon date. Second, investors can gather information over time, since the fluctuations of asset prices can be observed. We will need to introduce the concept of a self-financing trading strategy. We also have to determine how the level of information available to investors evolves over time. The latter aspect leads to the probabilistic concepts of σ-fields and filtrations.

Outline

We will examine the following issues:

1. Partitions and σ-fields.
2. Filtrations and adapted stochastic processes.
3. Conditional expectation with respect to a partition or a σ-field.
4. Change of a probability measure and the Radon-Nikodym density.

PART 1

PARTITIONS AND σ-FIELDS

σ-Field

The concept of a σ-field can be used to describe the amount of information available at a given moment. Let N = {1, 2, ...} be the set of all natural numbers.

Definition (σ-Field)

A collection F of subsets of Ω is called a σ-field (or a σ-algebra) whenever:

1. Ω ∈ F,
2. if A ∈ F then A^c := Ω \ A ∈ F,
3. if Ai ∈ F for all i ∈ N then ⋃_{i=1}^∞ Ai ∈ F.

Interpretation of a σ-Field

The information set has to contain all possible states, so we postulate that Ω belongs to any σ-field. Any set A ∈ F is interpreted as an observed event. If an event A ∈ F is given, that is, some collection of states is given, then the remaining states can also be identified, and thus the complement A^c is also an event. The idea of a σ-field is to model a certain level of information. In particular, as the σ-field becomes larger, more and more events can be identified. We will later introduce the concept of an increasing flow of information, formally represented by an ordered (increasing) family of σ-fields.

Probability Measure

Definition (Probability Measure)

A map P : F → [0, 1] is called a probability measure if:

1. P(Ω) = 1,
2. for any sequence Ai, i ∈ N, of pairwise disjoint events we have

P(⋃_{i∈N} Ai) = Σ_{i∈N} P(Ai).

The triplet (Ω, F, P) is called a probability space.

By convention, the probability of all possibilities is 1 (see 1.). Probability should satisfy σ-additivity (see 2.). Note that P(∅) = 0 and for an arbitrary event A ∈ F we have P(A^c) = 1 − P(A).

Example: σ-Fields

Example (5.1)

We take Ω = {ω1, ω2, ω3, ω4} and we define the σ-fields:

F1 = {∅, Ω}

F2 = {∅, Ω, {ω1, ω2}, {ω3}, {ω4}, {ω1, ω2, ω3}, {ω1, ω2, ω4}, {ω3, ω4}},

F3 = 2^Ω (the class of all subsets of Ω).

Note that F1 ⊂ F2 ⊂ F3, that is, the information increases:

F1: no information, except for the set Ω.

F2: partial information, since we cannot distinguish the occurrence of ω1 from that of ω2.

F3: full information, since {ω1}, {ω2}, {ω3} and {ω4} can be observed.
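On a finite state space these axioms can be checked mechanically. The following is a minimal Python sketch (an illustration, not part of the slides) that verifies the axioms for F2 from Example 5.1; on a finite Ω, closure under countable unions reduces to closure under pairwise unions.

```python
OMEGA = frozenset({"w1", "w2", "w3", "w4"})

# F2 from Example 5.1, encoded as a set of frozensets.
F2 = {
    frozenset(), OMEGA,
    frozenset({"w1", "w2"}), frozenset({"w3"}), frozenset({"w4"}),
    frozenset({"w1", "w2", "w3"}), frozenset({"w1", "w2", "w4"}),
    frozenset({"w3", "w4"}),
}

def is_sigma_field(F, omega):
    """Check the three sigma-field axioms on a finite state space."""
    if omega not in F:                             # axiom 1: Omega belongs to F
        return False
    if any(omega - A not in F for A in F):         # axiom 2: complements
        return False
    if any(A | B not in F for A in F for B in F):  # axiom 3: (finite) unions
        return False
    return True

print(is_sigma_field(F2, OMEGA))                                       # True
print(is_sigma_field({frozenset(), frozenset({"w1"}), OMEGA}, OMEGA))  # False
```

The second call fails because the complement {ω2, ω3, ω4} of {ω1} is missing, illustrating axiom 2.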

Example: Probability Measure

Example (5.1 Continued)

We define the probability measure P on the σ-field F2 by

P({ω1, ω2}) = 2/3, P({ω3}) = 1/6, P({ω4}) = 1/6.

The σ-additivity of P leads to

P({ω1, ω2} ∪ {ω3} ∪ {ω4}) = 1 = P(Ω).

Note that P is not yet defined on the σ-field F3 = 2^Ω and in fact the extension of P from F2 to F3 is not unique. For any α ∈ [0, 2/3] we may set

P_α({ω1}) = α, P_α({ω2}) = 2/3 − α.

Partitions

Definition

Let I be some index set. Assume that we are given a collection (Bi)i∈I of subsets of Ω. Then the smallest σ-field containing this collection is denoted by σ ((Bi)i∈I ) and is called the σ-field generated by (Bi)i∈I .

Definition (Partition)

By a partition of Ω, we mean any collection P = (Ai)i∈I of non-empty subsets of Ω such that the sets Ai are pairwise disjoint, that is, Ai ∩ Aj = ∅ whenever i ≠ j, and ⋃_{i∈I} Ai = Ω.

Lemma

A partition P = (Ai)i∈I generates a σ-field F if every set A ∈ F can be represented as follows: A = ⋃_{j∈J} Aj for some subset J ⊂ I.

Partition Associated with a σ-Field

Definition (Partition Associated with F)

A partition of Ω associated with a σ-field F is a collection (Ai)i∈I of non-empty sets Ai ∈ F such that:

1. Ω = ⋃_{i∈I} Ai.
2. The sets Ai are pairwise disjoint, i.e., Ai ∩ Aj = ∅ for i ≠ j.
3. For each A ∈ F there exists J ⊆ I such that A = ⋃_{i∈J} Ai.

Lemma

For any σ-field F of subsets of a finite state space Ω, a partition associated with this σ-field always exists and is unique.
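For a finite Ω, the associated partition can be computed explicitly: the atom containing a state ω is the intersection of all events in F that contain ω. A small Python sketch of this construction (illustrative, using F2 from Example 5.1; it recovers the partition found in Example 5.2 below):

```python
OMEGA = frozenset({"w1", "w2", "w3", "w4"})
F2 = {frozenset(s) for s in [
    (), ("w1", "w2"), ("w3",), ("w4",), ("w1", "w2", "w3"),
    ("w1", "w2", "w4"), ("w3", "w4"), ("w1", "w2", "w3", "w4"),
]}

def associated_partition(F, omega):
    """Atoms of a finite sigma-field F on the state space omega.

    The atom containing a state w is the intersection of all events
    in F that contain w; the distinct atoms form the partition."""
    atoms = set()
    for w in omega:
        atom = omega
        for A in F:
            if w in A:
                atom &= A  # shrink to the smallest event containing w
        atoms.add(atom)
    return atoms

# Prints the atoms {w1, w2}, {w3}, {w4}.
print(associated_partition(F2, OMEGA))
```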

Partition Associated with a σ-Field

Further properties of partitions: If Ω is countable then for any σ-field F there exists a unique partition P of Ω associated with F. It is also clear that this partition generates F, so that F = σ(P).

The sets Ai in the partition are minimal, specifically: if F = σ(P) and a non-empty set A ∈ F satisfies A ⊆ Ai, then A = Ai.

The probability of any A ∈ F equals the sum of the probabilities of the sets Ai in the partition generating F, specifically,

A = ⋃_{i∈J} Ai ⇒ P(A) = Σ_{i∈J} P(Ai).

Example: Partition Associated with a σ-Field

Example (5.2)

Consider the σ-field F2 introduced in Example 5.1.

The unique partition associated with F2 is given by

P2 = { {ω1, ω2}, {ω3}, {ω4} }.

Define the probabilities

P({ω1, ω2}) = 2/3, P({ω3}) = 1/6, P({ω4}) = 1/6.

Then for each event A ∈ F2 the probability of A can be easily evaluated, for instance

P({ω1, ω2, ω4}) = P({ω1, ω2}) + P({ω4}) = 5/6.

Random Variable

Let F be an arbitrary σ-field of subsets of Ω. In the next definition, we do not assume that the sample space is discrete.

Definition (F-Measurability)

A map X : Ω → R is said to be F-measurable if for every closed interval [a, b] ⊂ R the preimage (i.e. the inverse image) under X belongs to F, that is,

X^{-1}([a, b]) := {ω ∈ Ω | X(ω) ∈ [a, b]} ∈ F.

Equivalently, for any real number x,

X^{-1}((−∞, x]) := {ω ∈ Ω | X(ω) ≤ x} ∈ F.

If X is F-measurable then X is called a random variable on (Ω, F).

Partition and Random Variable

Proposition (5.1)

Let X : Ω → R be an arbitrary function. Let F be a σ-field and P = (Ai)i∈I be the unique partition associated with F. Then X is F-measurable if and only if for each Ai ∈ P there exists a constant ci such that X maps Ai to {ci}, that is,

X(ω) = ci for all ω ∈ Ai.

Proof of Proposition 5.1 (⇒).

(⇒) Assume that (Ai)i∈I is the partition associated with the σ-field F and that X is F-measurable. Let j ∈ I be an index and choose an element ω0 ∈ Aj. Define cj := X(ω0). We wish to show that X(ω) = cj for all ω ∈ Aj. Since X is F-measurable, X^{-1}({cj}) ∈ F.

Proof of Proposition 5.1

Proof of Proposition 5.1 (⇒). By properties 2 and 3 in the definition of a σ-field (closure under complements and unions yields closure under intersections), we have

∅ ≠ X^{-1}({cj}) ∩ Aj ∈ F.

It is obvious that the inclusion

X^{-1}({cj}) ∩ Aj ⊂ Aj

holds. Therefore, from the aforementioned minimality property of the sets contained in the partition, we obtain

X^{-1}({cj}) ∩ Aj = Aj.

But this means that X(ω) = cj for all ω ∈ Aj and thus X is constant on Aj. By varying j, we obtain such constants cj for all j ∈ I.

Proof of Proposition 5.1

Proof of Proposition 5.1 (⇐). Assume now that X : Ω → R is a function which is constant on each set Aj belonging to the partition, and that the constants cj are given as in the statement of the proposition. Let [a, b] be a closed interval in R. We define

C := {cj | j ∈ I and cj ∈ [a, b]}.

Since ⋃_{i∈I} Ai = Ω, no values other than the cj occur as values of X. Therefore,

X^{-1}([a, b]) = X^{-1}(C) = ⋃_{j : cj∈C} X^{-1}({cj}) = ⋃_{j : cj∈C} Aj ∈ F

where the last equality holds by property 3. of a σ-field.
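Proposition 5.1 yields a simple computational test on a finite Ω: a function is F-measurable exactly when it is constant on every atom of the associated partition. A minimal Python sketch (the functions X and Y are illustrative values, not from the slides):

```python
# Partition associated with F2 from Example 5.1.
P2 = [frozenset({"w1", "w2"}), frozenset({"w3"}), frozenset({"w4"})]

def is_measurable(X, partition):
    """By Proposition 5.1: X (a dict state -> value) is measurable with
    respect to sigma(partition) iff X is constant on every atom."""
    return all(len({X[w] for w in atom}) == 1 for atom in partition)

X = {"w1": 5.0, "w2": 5.0, "w3": 2.0, "w4": 7.0}  # constant on each atom
Y = {"w1": 5.0, "w2": 6.0, "w3": 2.0, "w4": 7.0}  # varies on {w1, w2}
print(is_measurable(X, P2))  # True
print(is_measurable(Y, P2))  # False
```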

PART 2

FILTRATIONS AND ADAPTED STOCHASTIC PROCESSES

Information Flow: Filtration

In a typical application, the information about random events increases over time. To model the information flow, we use the concept of a filtration. The following definition covers both the discrete-time and the continuous-time cases.

Definition (Filtration)

A family (Ft)0≤t≤T of σ-fields on Ω is called a filtration if Fs ⊂ Ft whenever s ≤ t. For brevity, we denote F = (Ft)0≤t≤T .

We interpret the σ-field Ft as the information available to an agent at time t. In particular, F0 represents the information available at time 0, that is, the initial information. We assume that the information accumulated over time can only grow, so that we never forget anything!


Definition (Stochastic Process)

A family X = (Xt)0≤t≤T of random variables is called a stochastic process. A stochastic process X is said to be F-adapted if for every t = 0, 1,...,T the random variable Xt is Ft-measurable.

Example (5.3)

Consider once again the elementary market model. The stock price process consists of S0 and S1 on Ω = {ω1, ω2}, and the filtration is F0 = {∅, Ω} and

F1 = {∅, Ω, {ω1}, {ω2}} = 2^Ω.

Note that F0 is the initial information, which means that the investor knows only all possible states.

Filtration Generated by a Stochastic Process

The initial information at time 0 is usually given by the trivial σ-field F0 = {∅, Ω} since all prices are known at 0, so that there is no uncertainty.

Let X = (Xt)0≤t≤T be a stochastic process on the probability space (Ω, F, P). For a fixed t, we define the σ-field F_t^X by setting

F_t^X = σ( Xs^{-1}([a, b]) | 0 ≤ s ≤ t, a ≤ b ).

Then the filtration F^X = (F_t^X)0≤t≤T is called the filtration generated by the process X.
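On a finite Ω this definition has a concrete description: two states belong to the same atom of F_t^X exactly when the observed histories (X0(ω), ..., Xt(ω)) coincide. A Python sketch of this construction, applied to the stock price paths of Example 5.4 below (an illustration, not the slides' code):

```python
from collections import defaultdict

# Price paths (S0, S1, S2) of Example 5.4, one per state.
paths = {
    "w1": (1.0, 1.3, 1.8),
    "w2": (1.0, 1.3, 1.2),
    "w3": (1.0, 0.7, 1.1),
    "w4": (1.0, 0.7, 0.5),
}

def generated_partition(paths, t):
    """Atoms of F_t^S: states whose histories (S_0, ..., S_t) agree
    cannot be told apart by observing the process up to time t."""
    groups = defaultdict(set)
    for w, path in paths.items():
        groups[path[: t + 1]].add(w)
    return [frozenset(g) for g in groups.values()]

for t in range(3):
    print(t, generated_partition(paths, t))
# t = 0: {w1, w2, w3, w4};  t = 1: {w1, w2} and {w3, w4};  t = 2: singletons
```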

Example 5.4: Stock Price Model

(Figure: two-period stock price lattice.)

S0 = 1.
S1 = 1.3, then: S2 = 1.8 (ω1, P(ω1) = 3/16) or S2 = 1.2 (ω2, P(ω2) = 5/16).
S1 = 0.7, then: S2 = 1.1 (ω3, P(ω3) = 4/16) or S2 = 0.5 (ω4, P(ω4) = 4/16).

Example: Stock Price Model

Example (5.4 Continued)

F_0^S = {∅, Ω} since S0 is deterministic. At t = 1, we have

S1^{-1}([a, b]) =
  Ω          if a ≤ 0.7 and b ≥ 1.3,
  {ω1, ω2}   if 0.7 < a ≤ 1.3 and b ≥ 1.3,
  {ω3, ω4}   if a ≤ 0.7 and 0.7 ≤ b < 1.3,
  ∅          otherwise,

and thus

F_1^S = {∅, Ω, {ω1, ω2}, {ω3, ω4}}.

F_2^S = 2^Ω since the partition generating F_2^S is

P2 = { {ω1}, {ω2}, {ω3}, {ω4} }.

PART 3

CONDITIONAL EXPECTATION

Conditional Expectation

Let (Ω, F, P) be a finite (or countable) probability space. Let X be an arbitrary F-measurable random variable. Assume that G is a σ-field which is contained in F.

Let (Ai)i∈I be the unique partition associated with G.

Our next goal is to define the conditional expectation EP(X| G), that is, the conditional expectation of a random variable X with respect to a σ-field G.

The expected value EP(X) will be obtained from EP(X| G) by setting G = F0, that is, EP(X)= EP(X| F0).

Conditional Expectation

Definition (Conditional Expectation)

The conditional expectation EP(X | G) of X with respect to G is defined as the random variable which satisfies, for every ω ∈ Ai,

EP(X | G)(ω) = (1/P(Ai)) Σ_{ωl∈Ai} X(ωl) P(ωl) = Σ_{xk} xk P(X = xk | Ai)

where the second summation is over all possible values xk of X and

P(X = xk | Ai) = (P(Ai))^{-1} P({X = xk} ∩ Ai)

is the conditional probability of the event {ω ∈ Ω | X(ω) = xk} given Ai. Hence

EP(X | G) = Σ_{i∈I} (1/P(Ai)) EP(X 1_{Ai}) 1_{Ai},

where 1_{Ai} denotes the indicator function of Ai.
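On a finite Ω this definition translates directly into code: average X over each atom with the renormalised weights P(ωl)/P(Ai). A minimal Python sketch (assuming every atom has positive probability), checked against the data of Example 5.5 below:

```python
def cond_exp(X, P, partition):
    """E_P(X | G) as a dict state -> value, for G = sigma(partition).

    On each atom A the value is (1 / P(A)) * sum_{w in A} X(w) P(w);
    assumes P(A) > 0 for every atom."""
    E = {}
    for atom in partition:
        p_atom = sum(P[w] for w in atom)
        value = sum(X[w] * P[w] for w in atom) / p_atom
        for w in atom:
            E[w] = value
    return E

# Data of Example 5.5: X = S2 and G = F_1^S.
P = {"w1": 6 / 25, "w2": 4 / 25, "w3": 6 / 25, "w4": 9 / 25}
S2 = {"w1": 9.0, "w2": 6.0, "w3": 6.0, "w4": 3.0}
atoms = [frozenset({"w1", "w2"}), frozenset({"w3", "w4"})]
print(cond_exp(S2, P, atoms))  # 39/5 = 7.8 on A1 and 21/5 = 4.2 on A2
```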

Properties of Conditional Expectation

EP(X | G) is well defined by the equation in the definition above and, by Proposition 5.1, the conditional expectation EP(X | G) is a G-measurable r.v. (it is constant on every atom Ai).

EP(X| G) is the best estimate of X given the information represented by the σ-field G. The following identity uniquely characterises the conditional expectation (in addition to G-measurability):

Σ_{ω∈G} X(ω) P(ω) = Σ_{ω∈G} EP(X | G)(ω) P(ω), ∀ G ∈ G.

One can represent this equality using (discrete) integrals: for every G ∈ G,

∫_G X dP = ∫_G EP(X | G) dP, ∀ G ∈ G.

Properties of Conditional Expectation

Proposition (5.2)

Let (Ω, F, P) be endowed with sub-σ-fields G and G1 ⊂ G2 of F. Then:

1. Tower property: If X : Ω → R is an F-measurable r.v. then

EP(X | G1) = EP(EP(X | G2) | G1) = EP(EP(X | G1) | G2).

2. Taking out what is known: If X : Ω → R is a G-measurable r.v. and Y : Ω → R is an F-measurable r.v. then

EP(XY | G)= X EP(Y | G).

3. Trivial conditioning: If X : Ω → R is an F-measurable r.v. independent of G then

EP(X| G)= EP(X).

Example 5.5: Conditional Expectation

(Figure: two-period stock price lattice with branch probabilities.)

S0 = 5.
S1 = 8 with probability 2/5, then: S2 = 9 with probability 3/5 (ω1, P(ω1) = 6/25) or S2 = 6 with probability 2/5 (ω2, P(ω2) = 4/25).
S1 = 4 with probability 3/5, then: S2 = 6 with probability 2/5 (ω3, P(ω3) = 6/25) or S2 = 3 with probability 3/5 (ω4, P(ω4) = 9/25).

Example: Conditional Expectation

Example (5.5)

The underlying probability space is given by Ω= {ω1,ω2,ω3,ω4}.

At time t = 0, the stock price is known and its unique value is S0 = 5. Hence the σ-field F_0^S is the trivial σ-field. At time t = 1, the stock can take two possible values and

S1^{-1}([a, b]) =
  Ω          if a ≤ 4 and b ≥ 8,
  {ω1, ω2}   if 4 < a ≤ 8 and b ≥ 8,
  {ω3, ω4}   if a ≤ 4 and 4 ≤ b < 8,
  ∅          otherwise,

so that F_1^S = {∅, {ω1, ω2}, {ω3, ω4}, Ω}. At time t = 2, we have F_2^S = 2^Ω.

Example: Conditional Expectation

Example (5.5 Continued)

Write A1 = {ω1, ω2} and A2 = {ω3, ω4}. Then:

P(S2 = 9 | A1) = P(A1 ∩ {S2 = 9}) / P(A1) = P(ω1) / P({ω1, ω2}) = (6/25) / (2/5) = 3/5,
P(S2 = 6 | A1) = P(A1 ∩ {S2 = 6}) / P(A1) = P(ω2) / P({ω1, ω2}) = (4/25) / (2/5) = 2/5,
P(S2 = 3 | A1) = P(A1 ∩ {S2 = 3}) / P(A1) = P(∅) / P({ω1, ω2}) = 0,
P(S2 = 9 | A2) = P(A2 ∩ {S2 = 9}) / P(A2) = P(∅) / P({ω3, ω4}) = 0,
P(S2 = 6 | A2) = P(A2 ∩ {S2 = 6}) / P(A2) = P(ω3) / P({ω3, ω4}) = (6/25) / (3/5) = 2/5,
P(S2 = 3 | A2) = P(A2 ∩ {S2 = 3}) / P(A2) = P(ω4) / P({ω3, ω4}) = (9/25) / (3/5) = 3/5.

Example: Conditional Expectation

Example (5.5 Continued) We have

EP(S2 | F_1^S)(ω) = 9·(3/5) + 6·(2/5) + 3·0 = 39/5 for ω ∈ A1,
EP(S2 | F_1^S)(ω) = 9·0 + 6·(2/5) + 3·(3/5) = 21/5 for ω ∈ A2,

and thus

EP(S2 | F_1^S) = 39/5 if ω ∈ {ω1, ω2}, and 21/5 if ω ∈ {ω3, ω4}.

Note that

EP(EP(S2 | F_1^S)) = (39/5)·(2/5) + (21/5)·(3/5) = 141/25,
EP(S2) = 9·(6/25) + 6·(4/25) + 6·(6/25) + 3·(9/25) = 141/25.
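The coincidence EP(EP(S2 | F_1^S)) = EP(S2) is the tower property of Proposition 5.2 with G1 = {∅, Ω}. A quick exact-arithmetic check in Python (an illustration mirroring the slide's computation):

```python
from fractions import Fraction as Fr

P = {"w1": Fr(6, 25), "w2": Fr(4, 25), "w3": Fr(6, 25), "w4": Fr(9, 25)}
S2 = {"w1": Fr(9), "w2": Fr(6), "w3": Fr(6), "w4": Fr(3)}
atoms = [{"w1", "w2"}, {"w3", "w4"}]

def on_atom(atom):
    # Value of E_P(S2 | F_1^S) on the given atom.
    return sum(S2[w] * P[w] for w in atom) / sum(P[w] for w in atom)

# E_P(E_P(S2 | F_1^S)): weight each atom's value by the atom's probability.
lhs = sum(on_atom(A) * sum(P[w] for w in A) for A in atoms)
rhs = sum(S2[w] * P[w] for w in P)  # E_P(S2)
print(lhs, rhs, lhs == rhs)  # 141/25 141/25 True
```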

PART 4

CHANGE OF A PROBABILITY MEASURE

Change of a Probability Measure

Let P and Q be equivalent probability measures on (Ω, F). Let the Radon-Nikodym density of Q with respect to P be

(dQ/dP)(ω) = L(ω), P-a.s.,

meaning that L is F-measurable and, for every A ∈ F,

∫_A X dQ = ∫_A XL dP.

If Ω is finite then this equality becomes

Σ_{ω∈A} X(ω) Q(ω) = Σ_{ω∈A} X(ω) L(ω) P(ω).

The r.v. L is strictly positive P-a.s. and EP(L) = 1. The equality EQ(X) = EP(XL) holds for any Q-integrable random variable X (it suffices to take A = Ω).
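On a finite Ω the density is just the ratio L(ω) = Q(ω)/P(ω), and the identities EP(L) = 1 and EQ(X) = EP(XL) can be verified directly. A hedged Python sketch; the measure Q below is an arbitrary illustration, not taken from the slides:

```python
from fractions import Fraction as Fr

# P from Example 5.5 and an illustrative equivalent measure Q.
P = {"w1": Fr(6, 25), "w2": Fr(4, 25), "w3": Fr(6, 25), "w4": Fr(9, 25)}
Q = {"w1": Fr(1, 4), "w2": Fr(1, 4), "w3": Fr(1, 4), "w4": Fr(1, 4)}

# Radon-Nikodym density L = dQ/dP, well defined since P(w) > 0 for all w.
L = {w: Q[w] / P[w] for w in P}

X = {"w1": Fr(9), "w2": Fr(6), "w3": Fr(6), "w4": Fr(3)}
print(sum(L[w] * P[w] for w in P))  # E_P(L) = 1
print(sum(X[w] * Q[w] for w in Q) == sum(X[w] * L[w] * P[w] for w in P))  # True
```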

Abstract Bayes Formula

Lemma (5.1: Bayes Formula) Let G be a sub-σ-field of F and let X be a Q-integrable random variable. Then the Bayes formula holds

EQ(X | G) = EP(XL | G) / EP(L | G).

Proof.

EP(L | G) is strictly positive so that the RHS is well defined. By our assumption, the random variable XL is P-integrable. Therefore, it suffices to show that the equality

EP(XL | G)= EQ(X | G) EP(L | G)

holds for every random variable X.

Proof of Lemma 5.1 for General σ-fields (MATH3975)

Proof. Since the RHS of the last formula defines a G-measurable random variable, it suffices to verify that for all sets G ∈ G, we have

∫_G XL dP = ∫_G EQ(X | G) EP(L | G) dP.

For every G ∈ G, we obtain

∫_G XL dP = ∫_G X dQ                     (L-density)
          = ∫_G EQ(X | G) dQ             (Q-conditioning)
          = ∫_G EQ(X | G) L dP           (L-density)
          = ∫_G EP(EQ(X | G) L | G) dP   (P-conditioning)
          = ∫_G EQ(X | G) EP(L | G) dP   (Proposition 5.2)

This ends the proof of Lemma 5.1 for the general case.

Proof of Lemma 5.1 for Partitions (MATH3075)

Proof.

Now assume that G = σ(A1, ..., Am) and Q(ωl) = L(ωl) P(ωl). Then on every event Ai:

EQ(X | G) = (1/Q(Ai)) Σ_{ωl∈Ai} X(ωl) Q(ωl),

EP(LX | G) = (1/P(Ai)) Σ_{ωl∈Ai} L(ωl) X(ωl) P(ωl) = (1/P(Ai)) Σ_{ωl∈Ai} X(ωl) Q(ωl),

EP(L | G) = (1/P(Ai)) Σ_{ωl∈Ai} L(ωl) P(ωl) = Q(Ai)/P(Ai).

Hence on every event Ai from the partition P = {A1,...,Am}

EQ(X | G) = EP(XL | G) / EP(L | G).

This ends the proof of Lemma 5.1 when G = σ(A1, ..., Am).
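As a closing check, the Bayes formula can be verified numerically on a finite space, following the partition proof above. A Python sketch (Q is an illustrative measure; P and X reuse the Example 5.5 data):

```python
from fractions import Fraction as Fr

P = {"w1": Fr(6, 25), "w2": Fr(4, 25), "w3": Fr(6, 25), "w4": Fr(9, 25)}
Q = {"w1": Fr(1, 4), "w2": Fr(1, 4), "w3": Fr(1, 4), "w4": Fr(1, 4)}
L = {w: Q[w] / P[w] for w in P}       # density dQ/dP on a finite space
X = {"w1": Fr(9), "w2": Fr(6), "w3": Fr(6), "w4": Fr(3)}
atoms = [{"w1", "w2"}, {"w3", "w4"}]  # partition generating G

def cond_exp_on(Y, M, atom):
    # E_M(Y | G) evaluated on the given atom, for the measure M.
    return sum(Y[w] * M[w] for w in atom) / sum(M[w] for w in atom)

XL = {w: X[w] * L[w] for w in P}
for A in atoms:
    direct = cond_exp_on(X, Q, A)                         # E_Q(X | G)
    bayes = cond_exp_on(XL, P, A) / cond_exp_on(L, P, A)  # Bayes RHS
    print(direct == bayes)  # True on every atom
```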