Cointegration and Antitrust: A Primer


Jonathan L. Rubin, J.D., Ph.D.*
American Antitrust Institute
Economics Committee Newsletter, Volume 4, Number 1, Spring 2004

Introduction

On October 8, 2003, Robert F. Engle and Clive W. J. Granger were awarded the Nobel Prize for their research on the statistical analysis of economic time series. Both made important contributions on their own, but their most influential work by far is contained in a short and elegant paper they published together in 1987.¹ Their paper influenced the way statisticians perform almost all regression analysis.

Their insight, known as cointegration, has been described as a method of uncovering long-run relationships between variables that are concealed by the noise of short-term fluctuations. An engineer might look at this as disentangling the "signal" from the "noise." An economist could consider it a way of distinguishing between a random fluctuation and a correction back to an equilibrium level. A statistician would regard it as a way of doing regression analysis on non-stationary (i.e., stochastically trending) variables that gives statistically valid results.

It is this last (and most technical) aspect of cointegration which accounts for its influence in the econometric world. Cointegration methods will inevitably make their way into the statistical analysis of antitrust issues and, ultimately, into the courtroom. The purpose of this article is to introduce the intuition behind cointegration in the context of antitrust econometrics. The focus will be the multivariate cointegration model pioneered by Johansen.²

The Nature of Time Series and the Problem of Spurious Regression

Econometric studies relevant to antitrust issues are often concerned with time series, i.e., a list of n sequential observations, Xt = {x1, x2, x3, ..., xn}, of a particular variable that varies over time. The graph of a typical price series is given in Fig. 1, which shows the price of a particular variety and grade of lumber over a 22-year period.

[Figure 1: Real Price of Lumber, 1975-1996]

This time series consists of 88 quarterly observations. The mean of the sample (i.e., the average price) is $1,096.75, which is indicated on the graph by the horizontal dotted line. What is most obvious about the data in Fig. 1 is its tendency to move from the lower left of the graph to the upper right, which is typical in any market in which prices tend to increase over time (the prices shown are real, that is, they have been corrected to eliminate the effect of inflation). The significance of this is that the sample mean summarizes the price quite poorly. Except for the periods around 1983 or mid-1993, the statement, "The average price over the sample is $1,096.75," is fairly uninformative, greatly overstating the price in the earlier part of the sample while understating it in the latter part. A strategy involving waiting a quarter or two for the price to revert to the mean would nearly always fail. Econometricians call this property "non-stationarity," and the price variable in this case is said to be "non-stationary."

An example of a stationary variable would be the time series defined as the difference of this price series, that is, the series consisting of the changes in the price from one observation to the next. The graph of the lumber prices in differences is shown in Fig. 2.

[Figure 2: Real Price of Lumber in Differences, 1975-1996]

Again, the mean, or average, difference, in this case $6.84, is indicated by the horizontal dotted line. While the price in levels in Fig. 1 crossed the mean three times, the price in differences crosses the mean 42 times. Clearly, although the difference from any given quarter to the next may differ widely from the mean, a quarter or two later the difference reverts to the average. Such a mean-reverting series is said to be "stationary."
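The levels-versus-differences contrast can be checked formally with a unit-root test. The sketch below is illustrative only: it uses a simulated quarterly series rather than the article's lumber data, and it assumes the NumPy and statsmodels libraries (augmented Dickey-Fuller test).

```python
# A minimal sketch, not the article's data: a simulated random walk with drift
# is non-stationary in levels but stationary in first differences.
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(0)
n = 88                                   # the lumber sample also has 88 quarterly observations
price = 1000 + np.cumsum(5 + rng.normal(scale=40, size=n))   # hypothetical price levels
diffs = np.diff(price)                   # quarter-to-quarter changes, as in Fig. 2

for name, series in (("levels", price), ("differences", diffs)):
    stat, pvalue = adfuller(series)[:2]  # augmented Dickey-Fuller unit-root test
    print(f"ADF test on {name}: statistic = {stat:.2f}, p-value = {pvalue:.3f}")
# Typically the levels fail to reject a unit root (large p-value) while the
# differences reject it, matching the Fig. 1 / Fig. 2 contrast described above.
```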
The distinction between stationary and non-stationary time series is important because these data have dramatically different statistical properties. Standard regression analysis, also known as "ordinary least squares," or "OLS," is said to give the Best Linear Unbiased Estimates, i.e., the estimates are "BLUE," provided that the assumptions underlying the regression model are fulfilled. Regression estimates that are not BLUE are more likely to be excluded under the Daubert standard, and regression studies involving non-stationary data are not BLUE because they do not fulfill the OLS assumptions. Econometric studies that do not take non-stationarity into account are flawed and have little probative value. It has been shown that regressing two non-stationary series leads to false positives, also known as "spurious regression."³ Econometric relationships that appear to be statistically significant in the presence of non-stationarity may not, in fact, reflect any meaningful relationship whatsoever.

Integration and Cointegration

The technical term for non-stationary time series is that they are "integrated." The cause of such integration can be traced to the accumulation of random influences on the variable. The simplest integrated process is known as a "random walk." The random walk process is said to be integrated of order one because, like the price series in Fig. 1, if it is differenced once it becomes stationary. More generally, if a variable can be made stationary by differencing it d times, it is said to be integrated of order d. The concept of an integrated time series has been extended not only to higher orders of d, but to fractional values of d as well.

Ordinarily, the sum of two non-stationary (integrated) time series is also non-stationary. On occasion, however, a unique combination of two integrated time series results in a stationary time series, in which case the data are said to be cointegrated.

Intuitively, two series that are cointegrated may be individually non-stationary, but they will not move too far apart over time. A common heuristic example of two cointegrated series is that of a drunk dog-owner walking in a desert. Assuming the owner is drunk enough to have no sense of direction (and does not double back), his path might resemble a (non-stationary) random walk. His dog's path might also look like a random walk. At any one time they may be close together, and at another time further apart, but over the long run they will move together, and never take off in opposite directions. Before any regression analysis can be considered valid, therefore, the econometrician must be satisfied either that a) the regressors are stationary, or b) the regressors are cointegrated.
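A short simulation makes the dog-owner picture concrete. The sketch below is a hedged illustration, not part of the original article: two series built from the same simulated random-walk "path" are each non-stationary, yet their spread is stationary, and the Engle-Granger test from statsmodels (used here in place of the Johansen procedure discussed next) detects the cointegrating relationship.

```python
# Illustrative only: two simulated series share one random-walk trend (the
# owner's path), so each is I(1) while their difference is stationary.
import numpy as np
from statsmodels.tsa.stattools import adfuller, coint

rng = np.random.default_rng(1)
n = 500
path = np.cumsum(rng.normal(size=n))              # the owner's random walk
owner = path + rng.normal(scale=0.5, size=n)      # hypothetical series 1
dog = path + rng.normal(scale=0.5, size=n)        # hypothetical series 2

print("ADF p-value, owner levels:", round(adfuller(owner)[1], 3))   # unit root not rejected
print("ADF p-value, dog levels:  ", round(adfuller(dog)[1], 3))     # unit root not rejected
print("ADF p-value, spread:      ", round(adfuller(dog - owner)[1], 3))  # stationary spread
t_stat, p_value, _ = coint(owner, dog)            # Engle-Granger two-step cointegration test
print(f"Engle-Granger test: t = {t_stat:.2f}, p-value = {p_value:.4f}")
```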
Multivariate Cointegration

Cointegration theory reaches far beyond explaining, and being able to correct for, spurious regression. It also permits a superior approach to multiple regression modeling which virtually eliminates simultaneity bias. Simultaneity bias in regression analysis results when causality runs not only from the explanatory variable to the dependent variable, but also "feeds back" from the dependent variable to the explanatory variable. This problem is discussed in a widely available reference on scientific evidence, in which Professor Rubinfeld states,

    The assumption of no feedback is especially important in litigation, because it is possible for the defendant (if responsible, for example, for price-fixing or discrimination) to affect the values of the explanatory variables and thus to bias the usual statistical tests that are used in multiple regression.⁴

The problem is illustrated by supposing that the defendant's expert wants to demonstrate that the price of a product, Pt, is determined by three variables: a demand variable, Dt, a cost variable, Ct, and advertising, At. Provided that the "no feedback" assumption is fulfilled, the researcher might estimate a multivariate model of the form

    Pt = α + β1 Dt + β2 Ct + β3 At + εt.

Setting aside for the time being the spurious regression problem, if the explanatory variables together account for all but residual random variations in the price, and the parameters βi are all statistically significant, estimates from this model may constitute sound statistical evidence. However, if demand also reacts to price, i.e., if Pt also causes variations in Dt, then simultaneity bias will invalidate the results.

To remedy this, Dr. Rubinfeld suggests dropping the questionable variable to determine whether its exclusion makes a difference, or expanding the model by adding one or more equations that explain the feedback effect.

The cointegration approach generally solves the problem by expanding the model into a system of equations in which each variable may influence every other variable. The statistical significance of the dependence of each variable on every other variable can then be tested. Instead of the researcher assuming that Pt should be considered the dependent variable and that Dt, Ct, and At should be the explanatory variables, the direction of causality among the variables can be tested within the model to arrive at a specification that does not suffer from simultaneity bias.

… provide probative evidence, the cointegrated VAR model represents a superior approach. But because there are numerous contexts in which the interpretation of a stationary cointegrating process can be theoretically meaningful, cointegration analysis can provide a wealth of other kinds of information to a fact-finder. Of particular interest to the antitrust practitioner is the case in which statistical evidence is needed to determine product or market delineation.

The use of statistical correlation between price series has been justifiably criticized (because of the spurious regression problem, inter alia) as a means of determining whether price realizations from potentially substitutable products or from different geographical areas belong to the same or separate markets.⁵ But the cointegration …
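To give a flavor of the system approach described in the preceding section, the sketch below runs the Johansen trace test and fits a vector error-correction model. It is only an illustration under stated assumptions: the series names P, D, C, and A echo the hypothetical model above, the data are simulated rather than real, and the statsmodels implementation is used as one readily available version of the Johansen machinery.

```python
# A sketch (simulated data) of the cointegrated-VAR idea: price, demand, cost,
# and advertising are modeled jointly, so no variable is assumed in advance to
# be the only dependent variable.
import numpy as np
import pandas as pd
from statsmodels.tsa.vector_ar.vecm import coint_johansen, VECM

rng = np.random.default_rng(2)
n = 300
trend = np.cumsum(rng.normal(size=n))            # one common stochastic trend
data = pd.DataFrame({
    "P": trend + rng.normal(scale=0.4, size=n),  # hypothetical price
    "D": trend + rng.normal(scale=0.4, size=n),  # hypothetical demand variable
    "C": trend + rng.normal(scale=0.4, size=n),  # hypothetical cost variable
    "A": trend + rng.normal(scale=0.4, size=n),  # hypothetical advertising
})

# Johansen trace test: compare each trace statistic with its 95% critical value
# to choose the cointegration rank r.
jres = coint_johansen(data, det_order=0, k_ar_diff=1)
for r, (stat, crit) in enumerate(zip(jres.lr1, jres.cvt[:, 1])):
    print(f"H0: rank <= {r}: trace = {stat:.1f}, 95% critical value = {crit:.1f}")

# Fit a vector error-correction model at the chosen rank (four series driven by
# one common trend leave rank 3). Every equation in the system lets every
# variable respond to the others, so no "no feedback" assumption is imposed.
vecm = VECM(data, k_ar_diff=1, coint_rank=3, deterministic="co").fit()
print(vecm.beta)   # estimated cointegrating vectors
```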