Random Processes
Florian Herzog
Stochastic Systems, 2013

Random Process

Definition 1 (Random process). A random (stochastic) process {X_t, t ∈ T} is a collection of random variables on the same probability space (Ω, F, P). The index set T usually represents time and can be either an interval [t_1, t_2] or a discrete set. The stochastic process X can therefore be written as a function:

    X : \mathbb{R} \times \Omega \to \mathbb{R}, \quad (t, \omega) \mapsto X(t, \omega)

Filtration

Observations:
• The amount of information increases with time.
• We use the concept of σ-algebras to model information.
• We assume that information is not lost with increasing time.
• Therefore the corresponding σ-algebras grow over time as more information becomes available.
• This concept is called a filtration.

Definition 2 (Filtration / adapted process). A collection {F_t}_{t≥0} of sub-σ-algebras is called a filtration if, for every s ≤ t, we have F_s ⊆ F_t. The random variables {X_t : 0 ≤ t < ∞} are called adapted to the filtration {F_t} if, for every t, X_t is measurable with respect to F_t.

Example 1. Suppose we have a sample space of four elements: Ω = {ω_1, ω_2, ω_3, ω_4}. At time zero, we don't have any information about which ω has been chosen. At time T/2 we know whether we will have {ω_1, ω_2} or {ω_3, ω_4}.

Figure 1: Example of a filtration — a binary tree on [0, T]: the root A = Ω splits at time T/2 into B = {ω_1, ω_2} and C = {ω_3, ω_4}, which split at time T into D = {ω_1}, E = {ω_2}, F = {ω_3}, G = {ω_4}.

Difference between random processes and smooth functions

Let x(·) be a real, continuously differentiable function defined on the interval [0, T]. Its continuous differentiability implies both a bounded total variation and a vanishing "sum of squared increments":

1. Total variation:

    \int_0^T \left| \frac{dx(t)}{dt} \right| dt < \infty

2. Sum of squares:

    \lim_{N \to \infty} \sum_{k=1}^{N} \left( x\left(k \frac{T}{N}\right) - x\left((k-1) \frac{T}{N}\right) \right)^2 = 0

Random processes, in general, have neither of these smoothness properties. This is what permits the desired "wild" and "random" behavior of the sample "noise signals".
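This contrast can be checked numerically: as the sampling grid is refined, the sum of squared increments of a smooth function shrinks toward zero, while for a Brownian sample path it converges to its quadratic variation T. A minimal sketch, assuming numpy and an illustrative choice of test function and grid sizes:

```python
import numpy as np

def sum_of_squares(x):
    """Sum of squared increments of a sampled path."""
    return float(np.sum(np.diff(x) ** 2))

T = 1.0
rng = np.random.default_rng(0)

for N in (100, 10_000, 1_000_000):
    t = np.linspace(0.0, T, N + 1)
    smooth = np.sin(2 * np.pi * t)  # continuously differentiable path
    # Brownian path sampled on the same grid: i.i.d. increments ~ N(0, T/N)
    brownian = np.concatenate(([0.0], np.cumsum(rng.normal(0.0, np.sqrt(T / N), N))))
    print(f"N={N:>9}: smooth {sum_of_squares(smooth):.2e}, Brownian {sum_of_squares(brownian):.3f}")
```

As N grows, the smooth column tends to 0 (on the order of 1/N), while the Brownian column stays near T = 1, the quadratic variation of Brownian motion on [0, 1].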
Markov Process

A Markov process has the property that only the instantaneous value X(t) is relevant for the future evolution of X: past and future of a Markov process have no direct relation.

Definition 3 (Markov process). A continuous-time stochastic process X(t), t ∈ T, is called a Markov process if for any finite parameter set {t_i : t_i < t_{i+1}} ⊂ T we have

    P\big(X(t_{n+1}) \in B \mid X(t_1), \ldots, X(t_n)\big) = P\big(X(t_{n+1}) \in B \mid X(t_n)\big)

Transition Probability

Definition 4 (Transition probability). Let X(t) be a Markov process. The function P(s, X(s); t, B), defined as the conditional probability P(X(t) ∈ B | X(s)), is called the transition probability or transition function. It is the probability that the process X(t) will be found inside the region B at time t, given that at time s < t it was observed at state X(s).

Figure 2: Transition probability P(s, x; t+s, B), illustrated on sample paths of a geometric Brownian motion with µ = 0.08, σ = 0.2.

Gaussian Process

• A stochastic process is called Gaussian if all its joint probability distributions are Gaussian.
• If X(t) is a Gaussian process, then X(t) ∼ N(µ(t), σ²(t)) for all t.
• A Gaussian process is fully characterized by its mean and covariance function.
• Performing linear operations on a Gaussian process again yields a Gaussian process.
• Derivatives and integrals of Gaussian processes are themselves Gaussian processes (understood in the sense of stochastic differentiation and integration).

Martingale Process

A stochastic process X(t) is a martingale relative to ({F_t}_{t≥0}, P) if the following conditions hold:

• X(t) is {F_t}_{t≥0}-adapted and E[|X(t)|] < ∞ for all t ≥ 0.
• E[X(t) | F_s] = X(s) a.s. for all 0 ≤ s ≤ t.
• The best prediction of a martingale process is its current value.
• A martingale is a model of a "fair game".

Diffusions

A diffusion is a Markov process with continuous trajectories such that for each time t and state X(t) the following limits exist:

    \mu(t, X(t)) := \lim_{\Delta t \downarrow 0} \frac{1}{\Delta t}\, E\big[X(t+\Delta t) - X(t) \mid X(t)\big]

    \sigma^2(t, X(t)) := \lim_{\Delta t \downarrow 0} \frac{1}{\Delta t}\, E\big[\{X(t+\Delta t) - X(t)\}^2 \mid X(t)\big]

For these limits, µ(t, X(t)) is called the drift and σ²(t, X(t)) the diffusion coefficient.
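These two limits can be illustrated by Monte Carlo: start many replicas from the same state x₀ of a geometric Brownian motion (drift µx, diffusion coefficient σ²x²) and average the one-step increments. A minimal sketch assuming numpy; the parameters, step size, and replica count are illustrative choices:

```python
import numpy as np

# Geometric Brownian motion dX = mu*X dt + sigma*X dW has
# drift mu*x and diffusion coefficient sigma^2 * x^2.
mu, sigma = 0.08, 0.2   # illustrative parameters (as in Figure 2)
x0, dt = 10.0, 1e-3     # fixed current state X(t) = x0, small time step
n = 1_000_000           # Monte Carlo replicas started from x0

rng = np.random.default_rng(1)
dW = rng.normal(0.0, np.sqrt(dt), n)   # Brownian increments over dt
dX = mu * x0 * dt + sigma * x0 * dW    # one Euler-Maruyama step from x0

drift_est = np.mean(dX) / dt      # should approach mu * x0        = 0.8
diff_est = np.mean(dX**2) / dt    # should approach sigma^2 * x0^2 = 4.0
print(drift_est, diff_est)
```

Dividing the conditional mean and mean square of the increment by Δt reproduces the two limits above up to Monte Carlo noise and an O(Δt) discretization bias.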
