Lecture 3: Spectral Analysis∗

∗Copyright 2002-2006 by Ling Hu.

Any covariance stationary process has both a time domain representation and a frequency domain representation. So far, our analysis has been in the time domain: we represent a time series {x_t} in terms of past values of innovations and investigate the dependence of x at distinct times. In some cases, a frequency domain representation is more convenient for describing a process. To transform a time domain representation into a frequency domain representation, we use the Fourier transform.

1 Fourier Transforms

Let ω denote the frequency (−π < ω < π), and let T denote the period: the minimum time it takes the wave to go through a whole cycle, so that T = 2π/ω. For any integer z, we have x(t) = x(t + zT). Finally, let φ denote the phase: the amount by which a wave is shifted.

Given a time series {x_t}, its Fourier transform is

    x(ω) = (1/2π) Σ_{t=−∞}^{∞} e^{−itω} x(t)                    (1)

and the inverse Fourier transform is

    x(t) = ∫_{−π}^{π} e^{itω} x(ω) dω.                          (2)

2 Spectrum

Recall that the autocovariance function for a zero-mean stationary process {x_t} is defined as

    γ_x(h) = E(x_t x_{t−h}),

and it serves to characterize the time series {x_t}. The spectrum of {x_t} is defined to be the Fourier transform of γ_x(h),

    S_x(ω) = (1/2π) Σ_{h=−∞}^{∞} e^{−ihω} γ_x(h).               (3)

Recall that the autocovariance generating function is g_x(z) = Σ_{h=−∞}^{∞} γ_x(h) z^h; if we let z = e^{−iω}, then the spectrum is just the autocovariance generating function divided by 2π. In (3), if we take ω = 0, we see that

    Σ_{h=−∞}^{∞} γ_x(h) = 2π S_x(0),

which tells us that the sum of the autocovariances equals the spectrum at zero multiplied by 2π. Using the identity e^{iφ} = cos φ + i sin φ, we can also write (3) as

    S_x(ω) = (1/2π) [ γ_x(0) + 2 Σ_{h=1}^{∞} γ_x(h) cos(hω) ].  (4)

Note that since cos(ω) = cos(−ω) and γ_x(h) = γ_x(−h), the spectrum is symmetric about zero. Also, the cosine function is periodic with period 2π; therefore, for spectral analysis we only need to find the spectrum for ω ∈ [0, π].
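As a quick numerical illustration of (4) and the symmetry property, the sketch below evaluates a truncated version of the cosine sum for an assumed example process with autocovariances γ(h) = 0.5^|h| (these values, and the truncation length, are illustrative choices, not from the text):

```python
import numpy as np

def spectrum(gamma0, gammas, omega):
    """Truncated version of equation (4):
    S(w) = (1/2pi) [gamma(0) + 2 * sum_{h>=1} gamma(h) cos(h w)].
    gammas[h-1] holds gamma(h) for h = 1, 2, ...
    """
    h = np.arange(1, len(gammas) + 1)
    return (gamma0 + 2.0 * np.sum(gammas * np.cos(h * omega))) / (2.0 * np.pi)

# Illustrative (assumed) autocovariances: gamma(h) = 0.5**|h|
gamma0 = 1.0
gammas = 0.5 ** np.arange(1, 200)

# Symmetry about zero: S(w) = S(-w), as cos is even
for w in (0.3, 1.0, 2.5):
    assert np.isclose(spectrum(gamma0, gammas, w), spectrum(gamma0, gammas, -w))

# Sum of autocovariances = 2*pi*S(0): here 1 + 2*(0.5 + 0.25 + ...) = 3
assert np.isclose(2.0 * np.pi * spectrum(gamma0, gammas, 0.0), 3.0)
```

The last assertion is the ω = 0 identity from the text: the sum of all autocovariances equals 2π S_x(0).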
Now if we know γ_x(h), we can compute the spectrum using (4); and if we know the spectrum S_x(ω), we can compute γ_x(h) using the inverse Fourier transform:

    γ_x(h) = ∫_{−π}^{π} e^{iωh} S_x(ω) dω.                      (5)

Let h = 0; then (5) gives the variance of {x_t},

    γ_x(0) = ∫_{−π}^{π} S_x(ω) dω.

So the variance of {x_t} is just the integral of the spectrum over all frequencies −π < ω < π. Therefore we can see that the spectrum function S_x(ω) decomposes the variance into components contributed by each frequency. In other words, we can use the spectrum to find the importance of cycles of different frequencies.

If we normalize the spectrum S_x(ω) by dividing by γ_x(0), we get the Fourier transform of the autocorrelation function ρ_x(h),

    f_x(ω) = (1/2π) Σ_{h=−∞}^{∞} e^{−ihω} ρ_x(h).               (6)

The autocorrelation function can be recovered from f_x(ω) using the inverse transform

    ρ_x(h) = ∫_{−π}^{π} e^{iωh} f_x(ω) dω.                      (7)

Again, let h = 0; then (7) gives

    1 = ∫_{−π}^{π} f_x(ω) dω.

Note that f_x(ω) is positive and integrates to one, just like a probability density function, so we call it the spectral density.

Example 1 (spectral density of white noise) Let ε_t ∼ WN(0, σ²). We have γ(0) = σ² and γ(h) = 0 for h ≠ 0. Using (3) and (6), we can compute

    S_ε(ω) = (1/2π) γ_ε(0) = σ²/(2π).

Dividing by γ(0), we have

    f_ε(ω) = 1/(2π).

So the spectral density is uniform over [−π, π], i.e., every frequency contributes equally to the variance.
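The h = 0 identities above are easy to check numerically. The sketch below integrates the flat white-noise spectrum on a midpoint grid (the variance σ² = 2 and the grid size are assumed values for illustration):

```python
import numpy as np

sigma2 = 2.0                                 # assumed white-noise variance
n = 100_000
dw = 2.0 * np.pi / n
omega = -np.pi + (np.arange(n) + 0.5) * dw   # midpoint grid on (-pi, pi)

S = np.full(n, sigma2 / (2.0 * np.pi))       # flat white-noise spectrum

# gamma(0) = integral of S over (-pi, pi): equation (5) with h = 0
gamma0 = np.sum(S) * dw
assert np.isclose(gamma0, sigma2)

# The spectral density f = S / gamma(0) integrates to one
assert np.isclose(np.sum(S / gamma0) * dw, 1.0)
```

This confirms that the variance is the integral of the spectrum, and that the normalized spectrum behaves like a density.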
Proof: We start from the autocovariance function of y,

    γ_y(h) = E(y_t y_{t−h})
           = E[ (Σ_{j=−∞}^{∞} θ_j x_{t−j}) (Σ_{k=−∞}^{∞} θ_k x_{t−h−k}) ]
           = Σ_{j,k=−∞}^{∞} θ_j θ_k E(x_{t−j} x_{t−h−k})
           = Σ_{j,k=−∞}^{∞} θ_j θ_k γ_x(h + k − j).

Next, consider the spectrum of y,

    S_y(ω) = (1/2π) Σ_{h=−∞}^{∞} e^{−ihω} γ_y(h)
           = (1/2π) Σ_{h=−∞}^{∞} e^{−ihω} Σ_{j,k=−∞}^{∞} θ_j θ_k γ_x(h + k − j).

(Let l = h + k − j and note that S_x(ω) = (1/2π) Σ_{l=−∞}^{∞} e^{−ilω} γ_x(l), so we want to construct such a term and see what remains.) Since h = l + j − k, we have e^{−ihω} = e^{−ijω} e^{ikω} e^{−ilω}, and therefore

    S_y(ω) = (Σ_{j=−∞}^{∞} e^{−ijω} θ_j) (Σ_{k=−∞}^{∞} e^{ikω} θ_k) ((1/2π) Σ_{l=−∞}^{∞} e^{−ilω} γ_x(l))
           = θ(e^{−iω}) θ(e^{iω}) S_x(ω)
           = θ(e^{−iω}) θ(e^{−iω})‾ S_x(ω)
           = |θ(e^{−iω})|² S_x(ω).

Example 2 To apply this result, first consider the problem of computing the spectrum of an MA(1) process,

    x_t = ε_t + θ ε_{t−1} = (1 + θL) ε_t.

In this problem, θ(e^{−iω}) = 1 + θ e^{−iω}, thus

    |θ(e^{−iω})|² = (1 + θ e^{−iω})(1 + θ e^{iω}) = 1 + θ² + θ(e^{−iω} + e^{iω}).

Therefore,

    S_x(ω) = |θ(e^{−iω})|² S_ε(ω) = (1/2π) [1 + θ² + θ(e^{−iω} + e^{iω})] σ².

We can verify this result by using the spectrum to compute the autocovariance function, say γ_x(1). Using (5),

    γ_x(1) = ∫_{−π}^{π} e^{iω} S_x(ω) dω
           = (1/2π) σ² ∫_{−π}^{π} e^{iω} [1 + θ² + θ(e^{−iω} + e^{iω})] dω
           = (1/2π) σ² · 2πθ
           = θσ²,

which is the same as what we obtained working in the time domain. In the computation we use the fact that ∫_{−π}^{π} e^{iω} dω = 0, as the integral of sine or cosine functions all the way around a circle is zero.

Figure 1 plots the spectrum of MA(1) processes with positive and negative coefficients. When θ > 0, the spectrum is high at low frequencies and low at high frequencies; when θ < 0, we observe the opposite. This is because when θ is positive, we have positive one-lag correlation, which makes the series smooth, with only a small contribution from high frequency (say, day to day) components. When θ is negative, we have negative one-lag correlation, so the series fluctuates rapidly about its mean value.
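The verification of γ_x(1) in Example 2 can also be done numerically: build the MA(1) spectrum on a grid and apply the inverse transform (5). The parameter values θ = 0.5, σ² = 1 below are assumed for illustration:

```python
import numpy as np

theta, sigma2 = 0.5, 1.0                     # assumed MA(1) parameters
n = 200_000
dw = 2.0 * np.pi / n
omega = -np.pi + (np.arange(n) + 0.5) * dw   # midpoint grid on (-pi, pi)

# MA(1) spectrum: S_x(w) = |1 + theta e^{-iw}|^2 * sigma^2 / (2 pi)
Sx = np.abs(1.0 + theta * np.exp(-1j * omega)) ** 2 * sigma2 / (2.0 * np.pi)

def gamma(h):
    """Inverse transform (5), approximated as a Riemann sum."""
    return (np.sum(np.exp(1j * omega * h) * Sx) * dw).real

assert np.isclose(gamma(0), (1 + theta**2) * sigma2)  # gamma(0) = (1+theta^2) s2
assert np.isclose(gamma(1), theta * sigma2)           # gamma(1) = theta s2
assert np.isclose(gamma(2), 0.0, atol=1e-8)           # MA(1): zero beyond lag 1
```

The recovered values match the time-domain autocovariances of an MA(1): (1 + θ²)σ², θσ², and zero beyond lag one.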
[Figure 1: Plots of the spectrum of MA(1) processes (θ = 0.5 for the left panel and θ = −0.5 for the right panel); Frequency on the horizontal axis, Spectrum on the vertical axis.]

Above we considered moving average processes; the next proposition gives the result for ARMA models with white noise errors.

Proposition 2 Let {x_t} be an ARMA(p, q) process satisfying

    φ(L) x_t = θ(L) ε_t,

where ε_t ∼ WN(0, σ²) and all roots of φ(L) lie outside the unit circle. Then the spectrum of x_t is

    S_x(ω) = (|θ(e^{−iω})|² / |φ(e^{−iω})|²) S_ε(ω)
           = (1/2π) σ² |θ(e^{−iω})|² / |φ(e^{−iω})|².

Example 3 Consider an AR(1) process, x_t = φ x_{t−1} + ε_t. Using the above proposition,

    S_x(ω) = (σ²/2π) |1 − φ e^{−iω}|^{−2}
           = (σ²/2π) (1 + φ² − 2φ cos ω)^{−1}.                  (8)

Figure 2 plots the spectrum of AR(1) processes with positive and negative coefficients. We have similar observations here as for the MA processes. However, note that as φ → 1, S_x(0) → ∞, which means that a random walk process has an infinite spectrum at frequency zero. This is similar to what happens with summation and differencing. When we add up white noise (say, φ = 1, as in a random walk), the high frequencies are smoothed out (the spikes in the white noise disappear) and what is left is the long term stochastic trend. On the contrary, when we difference (say, take the first difference of a random walk, which returns us to the white noise series), we remove the long term trend, and what is left is the high frequencies (lots of spikes around mean zero).

Finally, we introduce a spectral representation theorem without proof. For a zero-mean stationary process with absolutely summable autocovariances, there exist random variables α(ω) and δ(ω) such that we can represent the series in the form

    x_t = ∫_0^π [α(ω) cos(ωt) + δ(ω) sin(ωt)] dω,
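A small check of Example 3: the two forms in (8) agree, since |1 − φe^{−iω}|² = (1 − φe^{−iω})(1 − φe^{iω}) = 1 + φ² − 2φ cos ω. The sketch below verifies this on a grid, with φ = 0.5, σ² = 1 as assumed illustrative values:

```python
import numpy as np

phi, sigma2 = 0.5, 1.0                       # assumed AR(1) parameters
omega = np.linspace(0.0, np.pi, 2001)

# Proposition 2 form: sigma^2 / (2 pi) * |1 - phi e^{-iw}|^{-2}
S_prop = sigma2 / (2.0 * np.pi) / np.abs(1.0 - phi * np.exp(-1j * omega)) ** 2

# Closed form (8): sigma^2 / (2 pi) * (1 + phi^2 - 2 phi cos w)^{-1}
S_closed = sigma2 / (2.0 * np.pi) / (1.0 + phi**2 - 2.0 * phi * np.cos(omega))

assert np.allclose(S_prop, S_closed)

# For phi > 0 the spectrum is highest at low frequencies (cf. Figure 2)
assert S_closed[0] > S_closed[-1]
```

The final assertion reflects the discussion above: with positive φ, low frequencies dominate the variance.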
where α(ω) and δ(ω) have zero mean and are mutually and serially uncorrelated.

[Figure 2: Plots of the spectrum of AR(1) processes (φ = 0.5 for the left panel and φ = −0.5 for the right panel); Frequency on the horizontal axis, Spectrum on the vertical axis.]
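To tie the spectrum back to data, one can check that the average periodogram of simulated AR(1) paths tracks the theoretical spectrum (8). Everything below (seed, sample size, number of replications, tolerance) is an assumed simulation design, not from the text:

```python
import numpy as np

rng = np.random.default_rng(0)
phi, sigma2, n, reps = 0.5, 1.0, 1024, 400   # assumed simulation settings

acc = np.zeros(n // 2)
for _ in range(reps):
    # Simulate an AR(1) path with a burn-in of 200 observations
    e = rng.normal(0.0, np.sqrt(sigma2), n + 200)
    x = np.empty(n + 200)
    x[0] = e[0]
    for t in range(1, n + 200):
        x[t] = phi * x[t - 1] + e[t]
    x = x[200:]
    # Periodogram ordinates I(w_j) = |sum_t x_t e^{-i t w_j}|^2 / (2 pi n)
    X = np.fft.rfft(x)[1 : n // 2 + 1]
    acc += np.abs(X) ** 2 / (2.0 * np.pi * n)
I_bar = acc / reps                           # averaged periodogram

# Theoretical AR(1) spectrum (8) at the Fourier frequencies
w = 2.0 * np.pi * np.arange(1, n // 2 + 1) / n
S = sigma2 / (2.0 * np.pi * (1.0 + phi**2 - 2.0 * phi * np.cos(w)))

# On average the periodogram is close to the spectrum
assert np.mean(np.abs(I_bar - S) / S) < 0.1
```

A single periodogram is a noisy estimate of S_x(ω) (it does not converge pointwise), which is why the check averages over many replications.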
