
NASA Reference Publication 1145
March 1986

Introduction to Time Series Analysis

Jay C. Hardin
Langley Research Center
Hampton, Virginia

National Aeronautics and Space Administration
Scientific and Technical Information Branch


Preface

This presentation of time series analysis techniques has been developed by the author in the process of teaching (since 1971) a graduate-level course on the subject to scientists, engineers, and computer analysts at NASA Langley Research Center. The intent is to develop, from the beginning, the basic understanding necessary to properly apply modern spectral analysis techniques. The subject rests on a firm foundation in the theory of probability, which will be reviewed in this monograph. Thus, the only prerequisites are an ordinary engineering knowledge of calculus and some acquaintance with linear system theory. However, familiarity with random process theory, as provided in Probability, Random Variables, and Stochastic Processes by Papoulis, and with Fourier analysis techniques, as provided in The Fourier Transform and Its Applications by Bracewell, would be helpful.

Although there are many textbooks on time series analysis, several of which the author has used in his courses, this monograph takes a different approach from most. First, the theory in this presentation has been developed, insofar as possible, for continuous data. This postpones the inevitable use of discrete mathematics, which the author believes tends to obscure physical understanding, until after the reader has gained some familiarity with the concepts. Only then are the computational details for digital data introduced. Second, the author assumes that most readers will have access to either standard computer software or hard-wired spectral analyzers to do the work of computation. One big danger of such standard analysis techniques, however, is that they will always yield an output, even if the input does not satisfy the assumptions on which the analysis is based. Thus, this monograph seeks to provide the theoretical overview necessary to correctly apply the full range of these powerful techniques. Finally, time series analysis is a vast and rapidly changing field. In an attempt to remain complete and current, the last chapter introduces the reader to many specialized techniques and areas where research is presently in progress.

The author would like to express his appreciation to William E. Zorumski and Stephen K. Park, who worked almost as hard in reviewing this manuscript as the author did in writing it.

Jay C. Hardin
NASA Langley Research Center
Hampton, VA 23665-5225
July 16, 1985


Table of Contents

Chapter I: Introduction
  1.1 Why Harmonic Analysis?
  1.2 Deterministic or Random?
Chapter II: Harmonic Analysis
  2.1 Fourier Transform Pair
  2.2 Examples
  2.3 Convolution Theorems
Chapter III: Random Process Theory
  3.1 The Concept of a Random Process
  3.2 Random Variables
  3.3 Jointly Distributed Random Processes
  3.4 Stationary Random Processes
Chapter IV: Power Spectral Analysis
  4.1 Properties of Power Spectral Densities
  4.2 Problems in Comparing Power Spectral Densities
  4.3 Interpretation of Power Spectral Densities
  4.4 Relation Between the Power Spectral Density and the Fourier Transform of a Random Process
  4.5 Cross Spectral Density
Chapter V: Random Processes in Linear Systems
  5.1 Description of the System
  5.2 Properties of the Output Random Process
  5.3 Determination of Frequency Response Functions
  5.4 The Coherence Function
Chapter VI: Estimation Theory
  6.1 Estimation of a Parameter by a Random Variable
  6.2 Estimation of Mean
  6.3 Estimation of Autocorrelation
  6.4 Estimation of Cross Correlation
  6.5 A Test for Stationarity
Chapter VII: Estimation of Power Spectral Densities
  7.1 The Blackman-Tukey Approach
  7.2 Windows
  7.3 The Finite Fourier Transform Approach
  7.4 Frequency Resolution
Chapter VIII: Uncertainty in Power Spectral Estimates
  8.1 Understanding of Uncertainty
  8.2 Application of the Chi-Square Random Variable to Spectral Estimation
  8.3 Block Average
  8.4 Uncertainty Analysis for the Blackman-Tukey Technique
Chapter IX: Digital Time Series Analysis
  9.1 Shannon's Sampling Theorem
  9.2 The Nyquist Frequency and Aliasing
  9.3 Effect of Aliasing on Power Spectral Density
  9.4 Gibbs' Phenomenon
  9.5 Relationship Between Continuous and Discrete Fourier Transforms
  9.6 Digital Blackman-Tukey Estimation
  9.7 Discrete Finite Fourier Transform Estimation
  9.8 Frequency Domain Window Insertion
  9.9 Autocorrelation Estimation Via Discrete Fourier Transformation
  9.10 Zero Insertion
  9.11 Digital Spectral Estimation Procedure
Chapter X: The Fast Fourier Transform
  10.1 Theory of the Fast Fourier Transform
  10.2 Properties of the Discrete Fourier Transform for Real-Valued Data
Chapter XI: Digital Filtering
  11.1 Linear Filters
  11.2 Recursive Filters
Chapter XII: Special Topics
  12.1 The Kendall Series - A Test Case
  12.2 AR, MA, and ARMA Models
  12.3 Data Adaptive Spectral Estimation Techniques
  12.4 Spectral Analysis of Randomly Sampled Signals
  12.5 Cepstrum Analysis
  12.6 Zoom FFT
  12.7 Digital Spectral Analysis of Periodic Signals
  12.8 Spectral Analysis of Nonstationary Random Processes


Symbols

a_n, b_n        series coefficients
d(t)            data window function, equal to 0 for t < 0 or t > T
E{ }            expectation operator
F(ω)            Fourier integral transform of f(t) (eq. (2.3))
f               cyclic frequency, Hz
f(t)            real function of independent variable t
f_X(x; t)       first-order density function of random process X(t) (eq. (3.1))
f_X(x_1, x_2, ..., x_n; t_1, t_2, ..., t_n)   nth-order density function of random process X(t)
f_XY            joint density function of random processes X(t) and Y(t)
H(ω)            frequency response function of linear, shift-invariant system (eq. (5.2))
h(t)            impulse response function of linear, shift-invariant system (eq. (5.3))
k               number of degrees of freedom of chi-square random variable
m_X(t)          mean value taken by random process X(t) at time t (eq. (3.2))
N               number of samples (or data points) taken of a random process
N_B             number of blocks of data
P{ }            probability of event { }
P               period of periodic signal f(t)
R_X(t_1, t_2)   autocorrelation of random process X(t) at times t_1 and t_2 (eq. (3.5))
R_XY(t_1, t_2)  cross correlation of random process X(t) at time t_1 and random process Y(t) at time t_2 (eq. (3.11))
S               chi-square random variable (eq. (3.9))
S_X(ω)          power spectral density of random process X(t) (eq. (4.1))
S_XY(ω)         cross power spectral density of random processes X(t) and Y(t) (eq. (4.12))
sinc(x)         sinc function, equal to (sin x)/x
T               length of data record
T_B             length of data block
T_m             half-length of lag window
T_r             response time of linear, shift-invariant system
t               independent variable, not necessarily time
W(ω)            Fourier transform of lag window function (eq. (7.8))
w(τ)            lag window function
W_R             window correction factor in autocorrelation estimate
W_S             window correction factor in spectral estimate
X(t), Y(t), Z(t)   random processes
X(ω), Y(ω)      Fourier transforms of random processes X(t) and Y(t) (eq. (4.10))
X_F(ω)          Fourier transform of random process X(t) through data window (eq. (7.16))
X_T(ω)          finite Fourier transform of random process X(t), calculated from sample function of length T (eq. (7.12))
Γ_X(t_1, t_2)   covariance of random process X(t) at times t_1 and t_2
Δf              bandwidth of spectral estimate
Δt              sampling interval
Δω              bandwidth of spectral estimate in rad/sec
δ_kn            Kronecker delta function
δ(t)            Dirac delta function
σ_X²(t)         variance of random process X(t) at time t (eq. (3.4))
φ_n             random phase angles of sinusoidal signals
τ               time lag, equal to t_2 - t_1
ω               frequency, units are radians per second if t is time
ω_N             Nyquist frequency, equal to π/Δt
ω_n             frequencies of periodic function, equal to 2nπ/P for -∞ < n < ∞; also, set of frequencies not necessarily related; also, set of frequencies at which spectral estimates are calculated
*               complex conjugate (superscript)
^               estimate (caret over symbol)


Chapter I: Introduction

Consider a record of length T of a real function f(t) as shown in figure 1. By convention, the independent variable is called "time," although it need not actually be time. Instead, the function may depend on distance or angle or any other variable of interest. The data record shown is of finite length, since that is all that is ever available in the real world, and need not be continuous but may, in fact, consist of discrete samples.
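As a concrete illustration, the short sketch below constructs such a finite-length, discretely sampled record. It is not drawn from the monograph: the choice of a 50 Hz sinusoid plus Gaussian noise, and the numerical values assumed for the record length T and sampling interval Δt, are arbitrary and serve only to show how the quantities T, Δt, and N defined in the symbols list fit together.

    import numpy as np

    # Illustrative finite-length data record: a sinusoid buried in noise.
    # The independent variable is called "time" by convention, but it could
    # equally be distance, angle, or any other variable of interest.
    T = 1.0                 # record length (symbol T), arbitrary choice
    dt = 0.001              # sampling interval (symbol Δt), arbitrary choice
    N = int(T / dt)         # number of samples (symbol N)

    t = np.arange(N) * dt   # discrete values of the independent variable
    f = np.sin(2 * np.pi * 50.0 * t) + 0.5 * np.random.randn(N)  # sampled record f(t)

    # The Nyquist frequency π/Δt (symbol ω_N) bounds the frequencies such a
    # sampled record can represent without aliasing (see Chapter IX).
    print(f"{N} samples over {T} s; Nyquist frequency = {np.pi / dt:.1f} rad/s")

Whether continuous or sampled in this way, any real record is necessarily of finite length, which is the starting point for the estimation questions taken up in later chapters.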