Journal of Statistical and Econometric Methods, vol. 4, no. 1, 2015, 63-71
ISSN: 2241-0384 (print), 2241-0376 (online)
Scienpress Ltd, 2015

Some basic properties of cross-correlation functions of n-dimensional vector time series

Anthony E. Usoro¹

Abstract

In this work, the cross-correlation function of multivariate time series was the focus of interest. The design of the cross-correlation function at different lags was presented. Here $\gamma_{X_{it+k}X_{jt+l}}$ is the matrix of cross-covariance functions, and $\gamma_{X_{it}}$ and $\gamma_{X_{jt}}$ are the variances of the $X_{it}$ and $X_{jt}$ vectors respectively. The vector cross-correlation function was derived as
$$\rho_{X_{it+k},X_{jt+l}} = \frac{\gamma_{X_{it+k}X_{jt+l}}}{\sqrt{\gamma_{X_{it}}\gamma_{X_{jt}}}}.$$
A statistical package was used to verify the vector cross-correlation functions, with a trivariate analysis as a special case. From the results, some properties of vector cross-correlation functions were established.

Keywords: Vector time series; cross-covariance function; cross-correlation function

¹ Department of Mathematics and Statistics, Akwa Ibom State University, Mkpat Enin, Akwa Ibom State. Email: [email protected]

Article Info: Received: January 17, 2015. Revised: February 28, 2015. Published online: March 31, 2015.

1 Introduction

In statistics, the term cross-covariance is sometimes used to refer to the covariance $\mathrm{cov}(X, Y)$ between two random vectors $X$ and $Y$, where $X = (X_1, X_2, \ldots, X_n)$ and $Y = (Y_1, Y_2, \ldots, Y_n)$. In signal processing, the cross-covariance is often called cross-correlation and is a measure of similarity of two signals, commonly used to find features in an unknown signal by comparing it to a known one. It is a function of the relative time between the signals, and it is sometimes called the sliding dot product.

In univariate time series, the autocorrelation of a random process describes the correlation between values of the process at different points in time, as a function of the two times or of the time difference. Let $X$ be some repeatable process, and $i$ be some point in time after the start of that process ($i$ may be an integer for a discrete-time process or a real number for a continuous-time process). Then $X_i$ is the value (or realization) produced by a given run of the process at time $i$. Suppose that the process is further known to have defined values for mean $\mu_i$ and variance $\sigma_i^2$ for all times $i$. Then the autocorrelation between times $s$ and $t$ is defined as
$$\rho(s, t) = \frac{E[(X_t - \mu_t)(X_s - \mu_s)]}{\sigma_t \sigma_s},$$
where $E$ is the expected value operator. Note that this expression is not well-defined for all time series or processes, because the variance may be zero. If the function $\rho$ is well-defined, its value must lie in the range $[-1, 1]$, with 1 indicating perfect correlation and $-1$ indicating perfect anti-correlation. If $X$ is a second-order stationary process, then the mean $\mu$ and the variance $\sigma^2$ are time-independent, and the autocorrelation depends only on the difference between $t$ and $s$: the correlation depends only on the time distance between the pair of values and not on their position in time. This further implies that the autocorrelation can be expressed as a function of the time lag, and that it is an even function of the lag $k = s - t$, which implies $s = t + k$. This gives the more familiar form
$$\rho_k = \frac{E[(X_t - \mu)(X_{t+k} - \mu)]}{\sqrt{E[(X_t - \mu)^2]\,E[(X_{t+k} - \mu)^2]}} = \frac{E[(X_t - \mu)(X_{t+k} - \mu)]}{\sigma^2},$$
where $X_t$ and $X_{t+k}$ are values of the process a lag-$k$ time difference apart.
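To make the stationary form of $\rho_k$ concrete, the following is a minimal sketch of the sample autocorrelation in Python/NumPy. This is not the statistical package referred to in the abstract; the function name, the divisor convention and the simulated AR(1)-type series are assumptions made purely for illustration.

```python
import numpy as np

def sample_autocorrelation(x, max_lag):
    """Sample autocorrelation rho_k for lags 0..max_lag.

    Mirrors the stationary form rho_k = E[(X_t - mu)(X_{t+k} - mu)] / sigma^2,
    with mu and sigma^2 replaced by their sample counterparts (an assumption
    for illustration, not the paper's own computation).
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    mu = x.mean()
    sigma2 = x.var()                       # divide-by-n sample variance
    rho = np.empty(max_lag + 1)
    for k in range(max_lag + 1):
        # average of (X_t - mu)(X_{t+k} - mu) over the n - k overlapping pairs
        rho[k] = np.sum((x[: n - k] - mu) * (x[k:] - mu)) / ((n - k) * sigma2)
    return rho

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # AR(1)-type series, so rho_k decays roughly geometrically with the lag
    e = rng.standard_normal(500)
    x = np.empty_like(e)
    x[0] = e[0]
    for t in range(1, len(e)):
        x[t] = 0.6 * x[t - 1] + e[t]
    print(np.round(sample_autocorrelation(x, 5), 3))
```

By construction the value at lag 0 is exactly 1, which matches the interpretation of $\rho_0$ as the variance divided by itself.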
The autocovariance coefficient $\gamma_k$ at lag $k$ therefore measures the covariance between two values $Z_t$ and $Z_{t+k}$, a distance $k$ apart. The autocorrelation coefficient $\rho_k$ is defined as the autocovariance $\gamma_k$ at lag $k$ divided by the variance $\gamma_0$ (the autocovariance at $k = 0$). The plot of $\gamma_k$ against lag $k$ is called the autocovariance function, while the plot of $\rho_k$ against lag $k$ is called the autocorrelation function (Box and Jenkins, 1976).

In multivariate time series, cross-correlation or cross-covariance involves more than one process. For instance, if $X_t$ and $Y_t$ are two processes, then $X_t$ could be cross-correlated with $Y_t$ at lag $k$. The lag-$k$ value returned by $ccf(X, Y)$ estimates the correlation between $X(t+k)$ and $Y(t)$, Venables and Ripley (2002). Storch and Zwiers (2001) described cross-correlation in signal processing and time series. In signal processing, cross-correlation is a measure of similarity of two waveforms as a function of a time lag applied to one of them. This is also known as a sliding dot product or sliding inner product. It is commonly used for searching a long-duration signal for a shorter known feature. It also has applications in pattern recognition, single particle analysis, electron tomographic averaging, cryptanalysis and neurophysiology. In autocorrelation, which is the cross-correlation of a signal with itself, there is always a peak at a lag of zero unless the signal is a trivial zero signal. In probability theory and statistics, correlation always includes a standardising factor, so that correlations take values between $-1$ and $1$.

Let $(X_t, Y_t)$ represent a pair of stochastic processes that are jointly wide-sense stationary. Then the cross-covariance given by Box et al. (1984) is
$$\gamma_{xy}(\tau) = E[(X_t - \mu_x)(Y_{t+\tau} - \mu_y)],$$
where $\mu_x$ and $\mu_y$ are the means of $X_t$ and $Y_t$ respectively. The cross-correlation function $\rho_{xy}$ is the normalized cross-covariance function. Therefore,
$$\rho_{xy}(\tau) = \frac{\gamma_{xy}(\tau)}{\sigma_x \sigma_y},$$
where $\sigma_x$ and $\sigma_y$ are the standard deviations of the processes $X_t$ and $Y_t$ respectively. If $X_t = Y_t$ for all $t$, then the cross-correlation function is simply the autocorrelation function. For a discrete process of length $n$ defined as $\{X_1, \ldots, X_n\}$ with known mean and variance, an estimate of the autocorrelation may be obtained as
$$\hat{R}(k) = \frac{1}{(n-k)\sigma^2} \sum_{t=1}^{n-k} (X_t - \mu)(X_{t+k} - \mu)$$
for any positive integer $k < n$, Patrick (2005). When the true mean $\mu$ and variance $\sigma^2$ are known, this estimate is unbiased. If the true mean and variance of the process are not known, there are several possibilities:

i. if $\mu$ and $\sigma^2$ are replaced by the standard formulas for sample mean and sample variance, then this is a biased estimate;

ii. if $n - k$ in the above formula is replaced with $n$, the estimate is biased, but it usually has a smaller mean square error, Priestly (1982) and Donald and Walden (1993);

iii. if $X_t$ is a stationary process, then the following hold:
$$\mu_t = \mu_s = \mu \quad \text{for all } t, s, \qquad C_{xx}(t, s) = C_{xx}(s - t) = C_{xx}(T),$$
where $T = s - t$ is the lag time, or the amount of time by which the signal has been shifted. As a result, the autocovariance becomes
$$C_{xx}(T) = E[(X_t - \mu)(X_{t+T} - \mu)] = E[X_t X_{t+T}] - \mu^2 = R_{xx}(T) - \mu^2,$$
where $R_{xx}$ represents the autocorrelation in the signal processing sense, and the normalized form is
$$\rho_{xx}(T) = \frac{C_{xx}(T)}{\sigma^2}, \quad \text{Hoel (1984)}.$$
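The normalized cross-covariance $\rho_{xy}(\tau) = \gamma_{xy}(\tau)/(\sigma_x\sigma_y)$ can be estimated directly from two observed series. The sketch below, again in Python/NumPy, is an illustration only: the function name and the simulated example (in which $Y$ lags $X$ by three steps) are assumptions, and the divisor $n$ rather than $n - |\tau|$ follows option (ii) above.

```python
import numpy as np

def cross_correlation(x, y, lag):
    """Sample cross-correlation rho_xy(tau) = gamma_xy(tau) / (sigma_x * sigma_y),
    with gamma_xy(tau) estimated from the overlapping pairs (X_t, Y_{t+tau}).
    A positive lag pairs X_t with the later value Y_{t+tau}; a negative lag
    pairs X_t with an earlier value of Y.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    n = min(len(x), len(y))
    x, y = x[:n], y[:n]
    mu_x, mu_y = x.mean(), y.mean()
    sd_x, sd_y = x.std(), y.std()
    if lag >= 0:
        pairs = (x[: n - lag] - mu_x) * (y[lag:] - mu_y)
    else:
        pairs = (x[-lag:] - mu_x) * (y[: n + lag] - mu_y)
    gamma = pairs.sum() / n        # divisor n, not n - |lag| (biased, smaller MSE)
    return gamma / (sd_x * sd_y)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    x = rng.standard_normal(400)
    # y lags x by 3 steps (circular shift; edge effects are negligible here)
    y = np.roll(x, 3) + 0.2 * rng.standard_normal(400)
    print([round(cross_correlation(x, y, k), 2) for k in range(-4, 5)])
```

In this simulated example the printed values peak near lag $+3$, which is the behaviour described above for $ccf$-type estimates when one series leads the other.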
For $X_t$ and $Y_t$, the following properties hold:

1. $|\rho_{xy}(h)| \le 1$;

2. $\rho_{xy}(h) = \rho_{yx}(-h)$;

3. $\rho_{xy}(0) \ne 1$ in general;

4. $\rho_{xy}(h) = \dfrac{\gamma_{xy}(h)}{\sqrt{\gamma_x(0)\,\gamma_y(0)}}$.

Mardia and Goodall (1993) defined a separable cross-correlation function as
$$C_{ij}(X_1, X_2) = \rho(X_1, X_2)\, a_{ij},$$
where $A = [a_{ij}]$ is a $p \times p$ positive definite matrix and $\rho(\cdot, \cdot)$ is a valid correlation function. Goulard and Voltz (1992), Wackernage (2003) and Ver Hoef and Barry (1998) implied that the cross-covariance function is
$$C_{ij}(X_1 - X_2) = \sum_{k=1}^{r} \rho_k(X_1 - X_2)\, a_{ik} a_{jk},$$
for an integer $1 \le r \le p$, where $A = [a_{ij}]$ is a $p \times r$ full rank matrix and the $\rho_k(\cdot)$ are valid stationary correlation functions. Apanasovich and Genton (2010) constructed valid parametric cross-covariance functions. They proposed a simple methodology, based on latent dimensions and existing covariance models for univariate covariance, to develop flexible, interpretable and computationally feasible classes of cross-covariance functions in closed form; they discussed estimation of the models and performed a small simulation study to demonstrate them. The interest in this work is to extend cross-correlation functions beyond the two-variable case, to present the multivariate design of vector cross-covariance and cross-correlation functions, and thereby to establish some basic properties of vector cross-correlation functions from the analysis.

2 The design of cross-covariance and cross-correlation functions

The matrix of cross-covariance functions is as shown below:
$$\Gamma =
\begin{bmatrix}
\gamma_{X_{1(t+k)},X_{1(t+l)}} & \gamma_{X_{1(t+k)},X_{2(t+l)}} & \gamma_{X_{1(t+k)},X_{3(t+l)}} & \cdots & \gamma_{X_{1(t+k)},X_{n(t+l)}} \\
\gamma_{X_{2(t+k)},X_{1(t+l)}} & \gamma_{X_{2(t+k)},X_{2(t+l)}} & \gamma_{X_{2(t+k)},X_{3(t+l)}} & \cdots & \gamma_{X_{2(t+k)},X_{n(t+l)}} \\
\gamma_{X_{3(t+k)},X_{1(t+l)}} & \gamma_{X_{3(t+k)},X_{2(t+l)}} & \gamma_{X_{3(t+k)},X_{3(t+l)}} & \cdots & \gamma_{X_{3(t+k)},X_{n(t+l)}} \\
\vdots & \vdots & \vdots & \ddots & \vdots \\
\gamma_{X_{m(t+k)},X_{1(t+l)}} & \gamma_{X_{m(t+k)},X_{2(t+l)}} & \gamma_{X_{m(t+k)},X_{3(t+l)}} & \cdots & \gamma_{X_{m(t+k)},X_{n(t+l)}}
\end{bmatrix},$$
where $k = 0, \ldots, a$ and $l = 0, \ldots, b$.

The above matrix is a square matrix, and can be reduced to the form
$$\gamma_{X_{i(t+k)},X_{j(t+l)}},$$
where $i = 1, \ldots, m$, $j = 1, \ldots, n$, $k = 0, \ldots, a$, $l = 0, \ldots, b$ and $n = m$. From the above cross-covariance matrix,
$$\rho_{X_{it+k},X_{jt+l}} = \frac{\gamma_{X_{it+k}X_{jt+l}}}{\sqrt{\gamma_{X_{it}}\gamma_{X_{jt}}}},$$
where $\gamma_{X_{it+k}X_{jt+l}}$ is the matrix of the cross-covariance functions, and $\gamma_{X_{it}}$ and $\gamma_{X_{jt}}$ are the variances of the $X_{it}$ and $X_{jt}$ vectors respectively. An illustrative computation of this matrix for a simulated trivariate series is sketched below.
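The following NumPy sketch shows one way the matrix of vector cross-correlations $\rho_{X_{it+k},X_{jt+l}}$ could be estimated from data, with a simulated trivariate series standing in for the special case analysed in the paper. The function name, the simulated series and the restriction to non-negative lags $k$ and $l$ are illustrative assumptions; the paper's own results were obtained with a statistical package.

```python
import numpy as np

def vector_cross_correlation(X, k, l):
    """Matrix of sample cross-correlations rho_{X_i(t+k), X_j(t+l)} for an
    n-dimensional vector series X (rows = time points, columns = components).

    gamma_{X_i(t+k) X_j(t+l)} is estimated from the overlapping pairs
    (X_i(t+k), X_j(t+l)) and scaled by the component standard deviations,
    mirroring rho = gamma_{X_i(t+k) X_j(t+l)} / sqrt(gamma_{X_i} gamma_{X_j}).
    Assumes non-negative lags, as in k = 0,...,a and l = 0,...,b above.
    """
    X = np.asarray(X, dtype=float)
    T, n = X.shape
    mu = X.mean(axis=0)
    sd = X.std(axis=0)
    m = max(k, l)
    rho = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            xi = X[k: T - m + k, i] - mu[i]   # component i shifted by k
            xj = X[l: T - m + l, j] - mu[j]   # component j shifted by l
            rho[i, j] = (xi * xj).mean() / (sd[i] * sd[j])
    return rho

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    # Simulated trivariate series: component 2 follows component 1 with a
    # one-step delay, so rho[0, 1] is large at (k, l) = (0, 1).
    e = rng.standard_normal((300, 3))
    X = e.copy()
    X[1:, 1] = 0.8 * X[:-1, 0] + 0.3 * e[1:, 1]
    print(np.round(vector_cross_correlation(X, k=0, l=1), 2))
```

Calling the function with $k = l = 0$ returns the ordinary contemporaneous correlation matrix, while unequal lags pick out the lead-lag structure captured by the cross-covariance matrix above.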