Stationary Stochastic Processes, Parts of Chapters 2 and 6

Georg Lindgren, Holger Rootzén, and Maria Sandsten

Question marks indicate references to other parts of the book. Comments and plots regarding spectral densities are not supposed to be understood.

Chapter 1  Stationary processes

1.1 Introduction

In Section 1.2, we introduce the moment functions: the mean value function, which is the expected process value as a function of time t, and the covariance function, which is the covariance between process values at times s and t. We recall some simple rules for expectations and covariances, for example that covariances are linear in both arguments. We also give many examples of how the mean value and covariance functions should be interpreted.

The main focus is on processes for which the statistical properties do not change with time – they are (statistically) stationary. Strict stationarity and weak stationarity are defined.

A dynamical system, for example a linear system, is often described by a set of state variables, which summarize all important properties of the system at time t, and which change with time under the influence of some environmental variables. Often the variables are random, and then they must be modeled as a stochastic process. State variables are dealt with further in Chapter ??.

The statistical problem of how to find good models for a random phenomenon is also dealt with in this chapter, in particular how one should estimate the mean value function and covariance function from data. The dependence between different process values must be taken into account when constructing confidence intervals and testing hypotheses.

1.2 Moment functions

The statistical properties of a stochastic process {X(t), t ∈ T} are determined by its distribution functions. Expectation and standard deviation capture two important properties of the marginal distribution of X(t), and for a stochastic process these may be functions of time. To describe the time dynamics of the sample functions, we also need some simple measures of the dependence over time. The statistical definitions are simple, but the practical interpretation can be complicated. We illustrate this by the simple concepts of “average temperature” and “day-to-day correlation”.

[Figure 1.1: Daily average temperature in Målilla during January, for 1988 – 1997. The fat curves mark the years 1992 and 1997.]

Example 1.1. (“Daily temperature”) Figure 1.1 shows plots of the temperature in the small Swedish village of Målilla, averaged over each day, during the month of January for the ten years 1988 – 1997. Obviously, there have been large variations between years, and it has been rather cold for several days in a row.

The global circulation is known to be a very chaotic system, and it is hard to predict the weather more than a few days ahead. However, modern weather forecasting has adopted a statistical approach to prediction, together with the computation-intensive numerical methods which form the basis for all weather forecasts. Nature is regarded as a stochastic weather generator, where the distributions depend on geographical location, time of the year, etc., and with strong dependence from day to day. One can very well imagine that the data in the figure are the results of such a “weather roulette”, which for each year decides on the dominant weather systems, and on the day-to-day variation.
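A minimal sketch can make the weather-roulette idea concrete. The code below is purely illustrative and is not a model fitted to the Målilla data: it assumes a Gaussian AR(1) structure for the day-to-day dependence, a random yearly level standing in for the dominant weather system, and invented parameter values.

```python
import numpy as np

rng = np.random.default_rng(seed=1)

def simulate_january(n_days=31, mean=-4.0, year_sd=3.0, phi=0.7, noise_sd=2.5):
    """Simulate one January of daily average temperatures.

    A random yearly level (the 'dominant weather system') is combined
    with a Gaussian AR(1) deviation that creates day-to-day dependence.
    All parameter values are invented for illustration.
    """
    level = mean + year_sd * rng.standard_normal()  # this year's weather level
    x = np.empty(n_days)
    x[0] = level + noise_sd * rng.standard_normal()
    for t in range(1, n_days):
        # today's deviation from the yearly level leans on yesterday's
        x[t] = level + phi * (x[t - 1] - level) + noise_sd * rng.standard_normal()
    return x

# ten spins of the weather roulette, one row per year
years = np.vstack([simulate_january() for _ in range(10)])
print(years.shape)  # (10, 31)
```

Each row of `years` then plays the role of one January curve in Figure 1.1.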
With the statistical approach, we can think of the ten years of data as observations of a stochastic process X_1, ..., X_31. The mean value function is m(t) = E[X_t]. Since there is no theoretical reason to assume any particular values for the expected temperatures, one has to rely on historical data. In meteorology, the observed mean temperature during a 30-year period is often used as a standard.

The covariance structure in the temperature series can also be analyzed from the data. Figure 1.2 illustrates the dependence between the temperatures from one day to the next.

[Figure 1.2: Scatter plots of temperatures for years 1988 – 1996 for two successive days (left plot) and two days, five days apart (right plot). One can see a weak similarity between temperatures for adjacent days, but it is hard to see any connection with five days separation.]

For each of the nine years 1988 – 1996 we show, to the left, scatter plots of the pairs (X_t, X_{t+1}), i.e., with the temperature one day on the horizontal axis and the temperature the next day on the vertical axis. There seems to be a weak dependence: two successive days are correlated. To the right we have similar scatter plots, but now with five days separation, i.e., the data are (X_t, X_{t+5}). There is almost no correlation between two days that are five days apart. □

1.2.1 Definitions

We now introduce the basic statistical measures of average and correlation. Let {X(t), t ∈ T} be a real-valued stochastic process with discrete or continuous time.

Definition 1.1. For any stochastic process, the first and second order moment functions are defined as

    m(t) = E[X(t)]             mean value function (mvf)
    v(t) = V[X(t)]             variance function (vf)
    r(s, t) = C[X(s), X(t)]    covariance function (cvf)
    b(s, t) = E[X(s) X(t)]     second-moment function
    ρ(s, t) = ρ[X(s), X(t)]    correlation function

There are some simple relations between these functions:

\[
r(t,t) = C[X(t), X(t)] = V[X(t)] = v(t), \qquad r(s,t) = b(s,t) - m(s)\,m(t),
\]
\[
\rho(s,t) = \frac{C[X(s), X(t)]}{\sqrt{V[X(s)]}\,\sqrt{V[X(t)]}} = \frac{r(s,t)}{\sqrt{r(s,s)\,r(t,t)}}.
\]

These functions provide essential information about the process. The meanings of the mean value and variance functions are intuitively clear and easy to understand. For example, the mean value function describes how the expected value changes with time, just as we expect colder weather during winter months than during summer. The (square root of the) variance function tells us what magnitude of fluctuations we can expect. The covariance function has no such immediate interpretation, even if its statistical meaning is clear enough as a covariance. For example, in the ocean wave example, Example ??, the covariance r(s, s+5) is negative and r(s, s+10) is positive, corresponding to the fact that measurements five seconds apart often fall on opposite sides of the mean level, while measurements at ten seconds distance often are on the same side. Simply stated, the covariance function measures the similarity between observations as a function of the times of measurement.
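Although estimation of the moment functions is treated in its own right later in the chapter, a short sketch shows how the quantities in Definition 1.1 and the scatter plots of Figure 1.2 translate into sample computations. The years-by-days array and the pooling over days are illustrative assumptions (pooling tacitly presumes the mean and variance are roughly constant over the month), not the book's estimation procedure.

```python
import numpy as np

def sample_moment_functions(data):
    """Estimate m(t) and v(t) for each day t, treating the rows
    (years) as independent replications of X_1, ..., X_31."""
    m_hat = data.mean(axis=0)          # sample mean value function
    v_hat = data.var(axis=0, ddof=1)   # sample variance function
    return m_hat, v_hat

def lag_correlation(data, k):
    """Sample correlation of the pooled pairs (X_t, X_{t+k}), mimicking
    the scatter plots in Figure 1.2."""
    x = data[:, :-k].ravel()           # temperature on day t
    y = data[:, k:].ravel()            # temperature on day t + k
    return np.corrcoef(x, y)[0, 1]

# with a years-by-days array such as `years` from the earlier sketch:
# m_hat, v_hat = sample_moment_functions(years)
# print(lag_correlation(years, 1), lag_correlation(years, 5))
```

For data resembling Figure 1.2, one would expect the lag-1 correlation to be clearly positive and the lag-5 correlation to be close to zero.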
If there is more than one stochastic process in a study, one can distinguish the moment functions by indexing them, as m_X, r_X, etc. A complete name for the covariance function is then the auto-covariance function, to distinguish it from a cross-covariance function. In Chapter ??, we will investigate this measure of co-variation between two stochastic processes.

Definition 1.2. The function

\[
r_{X,Y}(s,t) = C[X(s), Y(t)] = E[X(s)\,Y(t)] - m_X(s)\,m_Y(t)
\]

is called the cross-covariance function between {X(t), t ∈ T} and {Y(t), t ∈ T}.

1.2.2 Simple properties and rules

The first and second order moment functions are linear and bi-linear, respectively. We formulate the following generalization of the rules E[aX + bY] = aE[X] + bE[Y], which holds in general, and V[aX + bY] = a^2 V[X] + b^2 V[Y], which holds for uncorrelated random variables X and Y.

Theorem 1.1. Let a_1, ..., a_k and b_1, ..., b_l be real constants, and let X_1, ..., X_k and Y_1, ..., Y_l be random variables in the same experiment, i.e., defined on a common sample space. Then

\[
E\Bigl[\sum_{i=1}^{k} a_i X_i\Bigr] = \sum_{i=1}^{k} a_i E[X_i],
\]
\[
V\Bigl[\sum_{i=1}^{k} a_i X_i\Bigr] = \sum_{i=1}^{k} \sum_{j=1}^{k} a_i a_j C[X_i, X_j],
\]
\[
C\Bigl[\sum_{i=1}^{k} a_i X_i, \sum_{j=1}^{l} b_j Y_j\Bigr] = \sum_{i=1}^{k} \sum_{j=1}^{l} a_i b_j C[X_i, Y_j].
\]

The rule for the covariance between sums of random variables, C[∑_i a_i X_i, ∑_j b_j Y_j], is easy to remember and use: the total covariance between two sums is the double sum of all covariances between pairs of one term from the first sum and one term from the second sum; a numerical check is sketched at the end of this section.

Remember that, while two independent random variables X and Y are always uncorrelated, i.e., C[X, Y] = 0, the reverse does not necessarily hold; even if C[X, Y] = 0, there can be a strong dependence between X and Y.
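The classical illustration is X standard normal and Y = X²: by symmetry, C[X, Y] = E[X³] − E[X]E[X²] = 0, yet Y is a deterministic function of X. A minimal numerical sketch:

```python
import numpy as np

rng = np.random.default_rng(seed=3)
x = rng.standard_normal(100_000)
y = x ** 2                # Y is completely determined by X

print(np.cov(x, y)[0, 1])               # near 0: X and Y are uncorrelated
print(np.corrcoef(np.abs(x), y)[0, 1])  # yet |X| and Y are strongly correlated
```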
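Finally, the covariance-of-sums rule in Theorem 1.1 is easy to verify numerically. The sketch below checks it for k = l = 2; the particular variables and coefficients are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(seed=2)
n = 10_000

# build correlated variables from common standard normal sources
z1, z2, z3 = rng.standard_normal((3, n))
x1, x2 = z1, 0.5 * z1 + z2      # X_2 shares the source z2 with Y_1
y1, y2 = z2, z3

a = np.array([2.0, -1.0])       # coefficients a_1, a_2
b = np.array([0.5, 3.0])        # coefficients b_1, b_2
X = np.vstack([x1, x2])
Y = np.vstack([y1, y2])

# left side: sample covariance of the two weighted sums
lhs = np.cov(a @ X, b @ Y)[0, 1]

# right side: double sum of a_i * b_j * C[X_i, Y_j]
rhs = sum(a[i] * b[j] * np.cov(X[i], Y[j])[0, 1]
          for i in range(2) for j in range(2))

# the sample covariance is itself bilinear, so the two sides agree
# to floating-point precision, not merely up to sampling error
print(lhs, rhs)
```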
