
3 Time Series Concepts

3.1 Introduction

This chapter provides background material on time series concepts that are used throughout the book. These concepts are presented in an informal way, and extensive examples using S-PLUS are used to build intuition. Section 3.2 discusses time series concepts for stationary and ergodic univariate time series. Topics include testing for white noise, linear and autoregressive moving average (ARMA) processes, estimation and forecasting from ARMA models, and long-run variance estimation. Section 3.3 introduces univariate nonstationary time series and defines the important concepts of I(0) and I(1) time series. Section 3.4 explains univariate long memory time series. Section 3.5 covers concepts for stationary and ergodic multivariate time series, introduces the class of vector autoregression models, and discusses long-run variance estimation.

Rigorous treatments of the time series concepts presented in this chapter can be found in Fuller (1996) and Hamilton (1994). Applications of these concepts to financial time series are provided by Campbell, Lo and MacKinlay (1997), Mills (1999), Gourieroux and Jasiak (2001), Tsay (2001), Alexander (2001) and Chan (2002).

3.2 Univariate Time Series

3.2.1 Stationary and Ergodic Time Series

Let {y_t} = {..., y_{t-1}, y_t, y_{t+1}, ...} denote a sequence of random variables indexed by some time subscript t. Call such a sequence of random variables a time series. The time series {y_t} is covariance stationary if

E[y_t] = µ for all t
cov(y_t, y_{t-j}) = E[(y_t − µ)(y_{t-j} − µ)] = γ_j for all t and any j

For brevity, call a covariance stationary time series simply a stationary time series. Stationary time series have time invariant first and second moments.
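These population moments have natural sample analogues. As a quick illustration outside the book's S-PLUS examples, here is a minimal Python sketch (numpy assumed; the function name `sample_acf` is hypothetical, not part of any library) that estimates the autocovariances and autocorrelations of a simulated series:

```python
import numpy as np

def sample_acf(y, max_lag):
    """Sample autocovariances gamma_hat_j and autocorrelations rho_hat_j,
    using the 1/T divisor convention of equations (3.1)-(3.2)."""
    y = np.asarray(y, dtype=float)
    T = len(y)
    d = y - y.mean()
    # gamma_hat_j = (1/T) * sum_{t=j+1}^{T} (y_t - ybar)(y_{t-j} - ybar)
    gamma = np.array([np.sum(d[j:] * d[:T - j]) / T for j in range(max_lag + 1)])
    rho = gamma / gamma[0]          # rho_hat_j = gamma_hat_j / gamma_hat_0
    return gamma, rho

# simulate a Gaussian white noise series and estimate its ACF
rng = np.random.default_rng(101)
y = rng.standard_normal(1000)
gamma, rho = sample_acf(y, 10)
```

For a white noise series the estimated autocorrelations at lags j ≥ 1 should all be close to zero, which is exactly what the SACF plots below are inspected for.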
The parameter γ_j is called the jth order or lag j autocovariance of {y_t}, and a plot of γ_j against j is called the autocovariance function. The autocorrelations of {y_t} are defined by

ρ_j = cov(y_t, y_{t-j}) / √(var(y_t) var(y_{t-j})) = γ_j / γ_0

and a plot of ρ_j against j is called the autocorrelation function (ACF). Intuitively, a stationary time series is defined by its mean, variance and ACF. A useful result is that any function of a stationary time series is also a stationary time series. So if {y_t} is stationary then {z_t} = {g(y_t)} is stationary for any function g(·).

The lag j sample autocovariance and lag j sample autocorrelation are defined as

γ̂_j = (1/T) Σ_{t=j+1}^{T} (y_t − ȳ)(y_{t-j} − ȳ)   (3.1)

ρ̂_j = γ̂_j / γ̂_0   (3.2)

where ȳ = (1/T) Σ_{t=1}^{T} y_t is the sample mean. The sample ACF (SACF) is a plot of ρ̂_j against j.

A stationary time series {y_t} is ergodic if sample moments converge in probability to population moments; i.e. if ȳ →p µ, γ̂_j →p γ_j and ρ̂_j →p ρ_j.

Example 1 Gaussian white noise (GWN) processes

Perhaps the simplest stationary time series is the independent Gaussian white noise process y_t ~ iid N(0, σ²) ≡ GWN(0, σ²). This process has µ = γ_j = ρ_j = 0 (j ≠ 0). To simulate a GWN(0, 1) process in S-PLUS use the rnorm function:

> set.seed(101)
> y = rnorm(100,sd=1)

FIGURE 3.1. Simulated Gaussian white noise process and SACF.

To compute the sample moments ȳ, γ̂_j, ρ̂_j (j = 1,...,10) and plot the data and SACF use

> y.bar = mean(y)
> g.hat = acf(y,lag.max=10,type="covariance",plot=F)
> r.hat = acf(y,lag.max=10,type="correlation",plot=F)
> par(mfrow=c(1,2))
> tsplot(y,ylab="y")
> acf.plot(r.hat)

By default, as shown in Figure 3.1, the SACF is shown with 95% confidence limits about zero. These limits are based on the result (c.f. Fuller (1996) pg. 336) that if {y_t} ~ iid (0, σ²) then

ρ̂_j ~^A N(0, 1/T),  j > 0.
The notation ρ̂_j ~^A N(0, 1/T) means that the distribution of ρ̂_j is approximated by a normal distribution with mean 0 and variance 1/T, and is based on the central limit theorem result √T ρ̂_j →d N(0, 1). The 95% limits about zero are then ±1.96/√T.

FIGURE 3.2. Normal qq-plot for simulated GWN.

Two slightly more general processes are the independent white noise (IWN) process, y_t ~ IWN(0, σ²), and the white noise (WN) process, y_t ~ WN(0, σ²). Both processes have mean zero and variance σ², but the IWN process has independent increments, whereas the WN process has uncorrelated increments.

Testing for Normality

In the previous example, y_t ~ GWN(0, 1). There are several statistical methods that can be used to see if an iid process y_t is Gaussian. The most common is the normal quantile-quantile plot or qq-plot, a scatterplot of the standardized empirical quantiles of y_t against the quantiles of a standard normal random variable. If y_t is normally distributed, then the quantiles will lie on a 45 degree line. A normal qq-plot with 45 degree line for y_t may be computed using the S-PLUS functions qqnorm and qqline

> qqnorm(y)
> qqline(y)

Figure 3.2 shows the qq-plot for the simulated GWN data of the previous example. The quantiles lie roughly on a straight line. The S+FinMetrics function qqPlot may be used to create a Trellis graphics qq-plot.

The qq-plot is an informal graphical diagnostic. Two popular formal statistical tests for normality are the Shapiro-Wilks test and the Jarque-Bera test. The Shapiro-Wilks test is a well-known goodness of fit test for the normal distribution. It is attractive because it has a simple, graphical interpretation: one can think of it as an approximate measure of the correlation in a normal quantile-quantile plot of the data.
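The ±1.96/√T band can be applied mechanically: a lag is flagged as significant when its sample autocorrelation falls outside the band. A small hypothetical Python helper (not from the book, which does this visually via acf.plot) illustrating the rule:

```python
import math

def wn_band_exceedances(rho_hat, T):
    """Return the lags (1-based) whose sample autocorrelation lies outside
    the approximate 95% white-noise band +/- 1.96/sqrt(T).
    rho_hat[0] is the lag-1 sample autocorrelation."""
    band = 1.96 / math.sqrt(T)
    return [j + 1 for j, r in enumerate(rho_hat) if abs(r) > band]

# with T = 100 the band is +/- 0.196, so lags 1 and 3 are flagged here
flagged = wn_band_exceedances([0.50, 0.10, -0.25], T=100)
```

Under the white noise null roughly 5% of lags are expected to breach the band by chance, so an isolated exceedance at a high lag is weak evidence against white noise.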
The Jarque-Bera test is based on the result that a normally distributed random variable has skewness equal to zero and kurtosis equal to three. The Jarque-Bera test statistic is

JB = (T/6) [ skew² + (kurt − 3)²/4 ]   (3.3)

where skew denotes the sample skewness and kurt denotes the sample kurtosis. Under the null hypothesis that the data is normally distributed,

JB ~^A χ²(2).

Example 2 Testing for normality using the S+FinMetrics function normalTest

The Shapiro-Wilks and Jarque-Bera statistics may be computed using the S+FinMetrics function normalTest. For the simulated GWN data of the previous example, these statistics are

> normalTest(y, method="sw")

Test for Normality: Shapiro-Wilks

Null Hypothesis: data is normally distributed

Test Statistics:

Test Stat 0.9703
  p.value 0.1449

Dist. under Null: normal
Total Observ.: 100

> normalTest(y, method="jb")

Test for Normality: Jarque-Bera

Null Hypothesis: data is normally distributed

Test Statistics:

Test Stat 1.8763
  p.value 0.3914

Dist. under Null: chi-square with 2 degrees of freedom
Total Observ.: 100

The null of normality is not rejected using either test.

Testing for White Noise

Consider testing the null hypothesis

H0 : y_t ~ WN(0, σ²)

against the alternative that y_t is not white noise. Under the null, all of the autocorrelations ρ_j for j > 0 are zero. To test this null, Box and Pierce (1970) suggested the Q-statistic

Q(k) = T Σ_{j=1}^{k} ρ̂_j²   (3.4)

where ρ̂_j is given by (3.2). Under the null, Q(k) is asymptotically distributed χ²(k). In a finite sample, the Q-statistic (3.4) may not be well approximated by the χ²(k). Ljung and Box (1978) suggested the modified Q-statistic

MQ(k) = T(T + 2) Σ_{j=1}^{k} ρ̂_j² / (T − j)   (3.5)

which is better approximated by the χ²(k) in finite samples.

Example 3 Daily returns on Microsoft

Consider the time series behavior of daily continuously compounded returns on Microsoft for 2000.
The following S-PLUS commands create the data and produce some diagnostic plots:

> r.msft = getReturns(DowJones30[,"MSFT"],type="continuous")
> r.msft@title = "Daily returns on Microsoft"
> sample.2000 = (positions(r.msft) > timeDate("12/31/1999")
+ & positions(r.msft) < timeDate("1/1/2001"))
> par(mfrow=c(2,2))
> plot(r.msft[sample.2000],ylab="r.msft")
> r.acf = acf(r.msft[sample.2000])
> hist(seriesData(r.msft))
> qqnorm(seriesData(r.msft))

The daily returns on Microsoft resemble a white noise process. The qq-plot, however, suggests that the tails of the return distribution are fatter than the normal distribution. Notice that since the hist and qqnorm functions do not have methods for "timeSeries" objects, the extractor function seriesData is required to extract the data frame from the data slot of r.msft.

FIGURE 3.3. Daily returns on Microsoft: time plot, SACF, histogram and normal qq-plot.
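The modified Q-statistic (3.5) is simple to compute directly from the sample autocorrelations. A hedged Python transcription of the formula (the function name `ljung_box` is hypothetical; the χ²(k) p-value comparison is left to the reader's preferred statistics library):

```python
def ljung_box(rho_hat, T):
    """Ljung-Box modified Q-statistic MQ(k) from equation (3.5), where
    rho_hat = [rho_1_hat, ..., rho_k_hat] are the sample autocorrelations
    of a series of length T.  Compare against chi-square(k) critical values."""
    # MQ(k) = T (T + 2) * sum_{j=1}^{k} rho_hat_j^2 / (T - j)
    return T * (T + 2) * sum(r * r / (T - j) for j, r in enumerate(rho_hat, start=1))

# small worked example: T = 100, two sample autocorrelations
mq = ljung_box([0.10, -0.05], T=100)
```

With k = 2, the resulting MQ is compared to a χ²(2) critical value (5.99 at the 5% level); a value well below that, as here, is consistent with white noise.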