
18.175: Lecture 16
Conditional expectation, random walks, martingales
Scott Sheffield, MIT

Outline: Conditional expectation; Martingales; Random walks; Stopping times; Arcsin law, other SRW stories.

Conditional expectation
- Say we're given a probability space (Ω, F₀, P), a σ-field F ⊂ F₀, and a random variable X measurable w.r.t. F₀ with E|X| < ∞. The conditional expectation of X given F is a new random variable, which we can denote by Y = E(X | F).
- We require that Y is F-measurable and that for all A ∈ F we have ∫_A X dP = ∫_A Y dP (see the worked example below).
- Any Y satisfying these properties is called a version of E(X | F).
- Is it possible that there exists more than one version of E(X | F) (which would mean that in some sense the conditional expectation is not canonically defined)?
- Is there some sense in which E(X | F) always exists and is always uniquely defined (maybe up to a set of measure zero)?
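A minimal worked example (not from the slides; all names are illustrative): take Ω = {0,...,5} with the uniform measure and let F be generated by the two-block partition {{0,1,2},{3,4,5}}. A version of E(X | F) is then the P-weighted average of X over each block, and the defining identity ∫_A X dP = ∫_A Y dP can be checked exhaustively, since F contains only unions of blocks.

```python
from itertools import combinations

# Finite probability space: Omega = {0,...,5}, uniform measure.
omega = list(range(6))
P = {w: 1 / 6 for w in omega}

# Sigma-field F generated by the partition {{0,1,2},{3,4,5}}.
blocks = [{0, 1, 2}, {3, 4, 5}]

def X(w):
    return float(w)  # the random variable X(omega) = omega

def Y(w):
    """A version of E(X|F): on each block, the P-weighted average of X."""
    block = next(b for b in blocks if w in b)
    pb = sum(P[v] for v in block)
    return sum(X(v) * P[v] for v in block) / pb

def integral(f, A):
    """Integral of f over the event A with respect to P."""
    return sum(f(w) * P[w] for w in A)

# Events in F are unions of blocks (including the empty union).
events = []
for r in range(len(blocks) + 1):
    for combo in combinations(blocks, r):
        events.append(set().union(*combo) if combo else set())

# Check the defining property: for every A in F, int_A X dP = int_A Y dP.
for A in events:
    assert abs(integral(X, A) - integral(Y, A)) < 1e-12
print("E(X|F) on the blocks:", [Y(min(b)) for b in blocks])  # [1.0, 4.0]
```

In this finite setting the block-average formula is exactly what the Radon-Nikodym construction (discussed below) produces.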
Conditional expectation
- Claim: Assuming Y = E(X | F) as above, and E|X| < ∞, we have E|Y| ≤ E|X|. In particular, Y is integrable.
- Proof: let A = {Y > 0} ∈ F and observe ∫_A Y dP = ∫_A X dP ≤ ∫_A |X| dP. By a similar argument, ∫_{Aᶜ} (−Y) dP ≤ ∫_{Aᶜ} |X| dP. Adding the two inequalities gives E|Y| ≤ E|X|.
- Uniqueness of Y: Suppose Y′ is F-measurable and satisfies ∫_A Y′ dP = ∫_A X dP = ∫_A Y dP for all A ∈ F. Then consider the set {Y − Y′ ≥ ε}: integrating Y − Y′ over it gives zero, while the integrand is at least ε there, so the set has measure zero. This holds for every ε > 0 (and, by symmetry, for the sets {Y′ − Y ≥ ε}). Conclude that Y = Y′ almost everywhere.

Radon-Nikodym theorem
- Let µ and ν be σ-finite measures on (Ω, F). Say ν ≪ µ (ν is absolutely continuous w.r.t. µ) if µ(A) = 0 implies ν(A) = 0.
- Recall the Radon-Nikodym theorem: if µ and ν are σ-finite measures on (Ω, F) and ν is absolutely continuous w.r.t. µ, then there exists a measurable f: Ω → [0, ∞) such that ν(A) = ∫_A f dµ.
- Observe: this theorem implies the existence of conditional expectation (for X ≥ 0, take µ = P restricted to F and ν(A) = ∫_A X dP for A ∈ F; then Y = f works, and general X follows by splitting into positive and negative parts).

Two big results
- Optional stopping theorem: Can't make money in expectation by timing the sale of an asset whose price is a non-negative martingale (see the simulation sketch below).
- Martingale convergence: A non-negative martingale almost surely has a limit.
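As a hedged illustration of the optional stopping statement (a simulation sketch, not part of the slides; names are illustrative): let an asset price start at a and follow a ±1 simple random walk until it first hits 0 or a + b. The stopped price is a bounded non-negative martingale, so the expected price at the stopping time is still a; equivalently, the probability of ending at a + b is a/(a + b).

```python
import random

def stopped_price(a, b, rng):
    """Asset price: simple +/-1 random walk started at a, run until it hits 0 or a+b.
    The stopped process is a bounded non-negative martingale."""
    p = a
    while 0 < p < a + b:
        p += rng.choice((-1, 1))
    return p

rng = random.Random(0)
a, b, trials = 3, 7, 20000
mean_at_T = sum(stopped_price(a, b, rng) for _ in range(trials)) / trials

# Optional stopping: E[p_T] = p_0 = a -- timing the sale does not change the
# expected sale price. Equivalently, P(p_T = a + b) = a / (a + b).
print("empirical E[p_T]:", mean_at_T)  # should be close to a = 3
print("p_0:", a)
```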
Permutable events
- A finite permutation of ℕ is a one-to-one map from ℕ to itself that fixes all but finitely many points.
- An event A ∈ F is permutable if it is invariant under any finite permutation of the ω_i (see the sketch below).
- Let E be the σ-field of permutable events.
- This is related to the tail σ-algebra we introduced earlier in the course.
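A small sanity-check sketch (not from the slides; names are illustrative): a finite permutation π whose largest moved index is N maps {1,...,n} onto itself for every n ≥ N, so the partial sums satisfy S_n(πω) = S_n(ω) for all n ≥ N. Hence any event determined by the values S_n for arbitrarily large n, such as {S_n = 0 infinitely often}, is permutable. The code applies one such permutation to an i.i.d. ±1 sequence and confirms the partial sums agree beyond the moved indices.

```python
import random

rng = random.Random(1)
n_steps = 30
omega = [rng.choice((-1, 1)) for _ in range(n_steps)]  # i.i.d. +/-1 coordinates

# A finite permutation of the index set: cycle the first five (0-based) indices,
# fix everything else. Largest moved index is N = 4.
def pi(k):
    return (k + 1) % 5 if k < 5 else k

permuted = [omega[pi(k)] for k in range(n_steps)]  # (pi omega)_k = omega_{pi(k)}

def partial_sums(xs):
    s, out = 0, []
    for x in xs:
        s += x
        out.append(s)
    return out

S, S_pi = partial_sums(omega), partial_sums(permuted)

# For n >= N the permutation maps {0,...,n} onto itself, so S_n is unchanged;
# any event depending only on the S_n for large n therefore has the same
# indicator for omega and for pi(omega).
N = 4
print(all(S[n] == S_pi[n] for n in range(N, n_steps)))  # True
```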