Journal of Economic Literature 2008, 46:3, 685–703 http://www.aeaweb.org/articles.php?doi=10.1257/jel.46.3.685

Horizons of Understanding: A Review of Ray Fair’s Estimating How the Macroeconomy Works

Jesús Fernández-Villaverde*

Ray Fair’s Estimating How the Macroeconomy Works is the latest in a series of books by Fair that build, estimate, and apply his macroeconometric model to study the U.S. economy. In this book, Fair updates the model to incorporate the most recent data and uses it to analyze several important empirical questions, such as whether the U.S. economy moved into a new age of high productivity in the last half of the 1990s and the dynamics of prices, output, and unemployment. This review places his work in the context of the historical evolution of aggregate econometric models, compares it with the current developments in the estimation of dynamic stochastic general equilibrium models, and discusses some salient aspects of Fair’s contributions.

* Fernández-Villaverde: University of Pennsylvania, NBER, and CEPR. The title of this essay is inspired by the work of Hans-Georg Gadamer. His emphasis on the historical interpretation of texts guides much of the review. I thank the NSF for financial support.

1. Introduction

James C. Scott, in his fascinating book Seeing Like a State, remarks that a pervasive high modernist ideology underlies much of modern social sciences. Scott defines high modernism as a strongly self-confident belief in scientific discovery and the subsequent rational design of a social order commensurate with that knowledge.

There is no purer example of the high modernism impulse in economics than the vision behind the design and estimation of macroeconometric models. For over seventy years, since the pioneering investigations of Jan Tinbergen (1937 and 1939), researchers have aspired to build a coherent representation of the aggregate economy. Such a model would be the most convenient tool for many tasks, which, somewhat crudely, we can classify into the trinity of analysis, policy evaluation, and forecasting. By analysis, I mean the exercises involved in studying the behavior and evolution of the economy, running counterfactual experiments, etc. Policy evaluation is the assessment of the economy’s responses in the short, middle, and long run to changes in policy variables such as taxes, government expenditure, or monetary aggregates. By forecasting, I mean the projections of future values of relevant variables. All those exercises are manifestations of high modernism in action: instances of formal understanding and deliberate control. Not surprisingly, the political compromises of many of the economists originally engaged in the construction of macroeconometric models revealed a common predisposition toward a deep reorganization of the economic life of western countries.

Since Tinbergen’s contributions, much effort has been invested in this intellectual bet. What do we have to show after all these years? Have we succeeded? Have all the costs been worthwhile? Or has the project been a failure, as with the other examples of high modernist hubris that Scott compiles?
Ray C. Fair’s Estimating How the Macroeconomy Works (Harvard University Press, 2004) starts from the unshakable premise that the task has been valuable, that in fact we have a working model of the U.S. economy or, at least, a good approximation to it. Moreover, he states that we have achieved the goal by following a concrete path: the one laid down by the Cowles Commission simultaneous equations framework. In the preface of a previous book, Fair (1994) characterized his research as a rallying cry for the Cowles Commission approach. His new monograph keeps the old loyalties by updating the multicountry (MC) model that he has been focusing on since 1974 (chapters 1 and 2) and by applying it to answer topical questions, such as the effects of nominal and real interest rates on consumption and investment, the dynamics of prices, wages, and unemployment in the United States, or whether the second half of the 1990s witnessed the appearance of a new economy (chapters 3 to 12).

Fair’s passionate defense of his practice and his steadfast application of the MC model to measure a range of phenomena by themselves justify acquiring Estimating How the Macroeconomy Works by all economists interested in aggregate models. It was no accident that, even before the editor of the journal contacted me with the suggestion of writing this review, I had already bought the book and placed it with its two older brothers, Specification, Estimation, and Analysis of Macroeconometric Models and Testing Macroeconometric Models, in my library.[1]

[1] I guess I am too young to have had a chance to obtain the very earliest volumes of the series, A Model of Macroeconomic Activity (1974 and 1976), which went out of print decades ago.

Reading the books as a trilogy is a rewarding endeavor that yields a comprehensive view of what the Cowles Commission approach is all about, how to implement it, what it can deliver, and how it compares with the alternative methodologies existing in the profession, in particular with the increasingly popular estimation of dynamic equilibrium models. But, perhaps to understand this comparison better, it is necessary to glance at the history of macroeconometric models.

2. Great Expectations

It is difficult to review Lawrence R. Klein and Arthur S. Goldberger’s 1955 monograph, An Econometric Model of the United States, 1929–1952, without sharing the enthusiasm of the authors about the possibility of building a successful representation of the U.S. economy. In barely 160 pages, Klein and Goldberger developed an operating model that captured what was then considered the frontier of macroeconomics; estimated it using limited-information maximum likelihood; and described its different applications for analysis, policy evaluation, and forecasting. Premonitory proposals, such as Tinbergen’s Statistical Testing of Business-Cycle Theories or Klein’s Economic Fluctuations in the United States, 1921–1941, although promising, did not really deliver a fully working model estimated with the modern formal statistical techniques associated with the Cowles Commission.[2] In sharp contrast, Klein and Goldberger had that model: elegant, concise with twenty endogenous variables and fifteen structural equations, relatively easy to grasp in structure (although not in its dynamics), and with an auspicious empirical performance.

[2] Much of the material in this section is described in considerable detail in Ronald G. Bodkin, Klein, and Kanta Marwah (1991).
The excitement of the research agenda was increased with the companion monograph by Goldberger, Impact Multipliers and Dynamic Properties of the Klein–Goldberger Model, published in 1959, and the paper by Irma Adelman and Frank L. Adelman (1959). Goldberger’s book computed the multipliers of the model for endogenous and exogenous changes in tax policy, both at impact and over time. Even more pointedly, Goldberger demonstrated the model’s superior forecasting ability for the years 1953–55 in comparison with a battery of naive models (random walks in levels and in differences) proposed by researchers less convinced of the potentialities of the model. Therefore, Goldberger’s contribution compellingly illustrated how the Klein–Goldberger model was a productive tool for policy evaluation and forecasting.

The remaining leg of the trinity of analysis, policy evaluation, and forecasting was covered by Adelman and Adelman. In an innovative study, the Adelmans reported that the Klein–Goldberger model, when hit by exogenous shocks, generated fluctuations closely resembling the cycles in the U.S. economy after the Second World War. The match was not only qualitative but also quantitative in terms of the duration of the fluctuations, the relative length of the expansion and contractions, and the degree of clustering of peaks and troughs.

Spurred by the success of Goldberger and Klein, many groups of researchers (several of them led by Klein himself at the University of Pennsylvania) built increasingly sophisticated models during the 1960s and early 1970s that embodied modern versions of the IS–LM paradigm. Examples include the postwar quarterly model; the Brookings model, which inaugurated the era of team efforts in the formulation of macroeconometric models and generated important technology in terms of construction, solution, and implementation of models; the Wharton and DRI models (which opened macroeconometric models to commercial exploitation); the Bureau of Economic Analysis (BEA) model; or the MPS model, which we will discuss below. A vibrant future seemed to lie ahead.

3. Death . . .

Contrary to expectations, the decades of the 1970s and 1980s were not kind to large-scale aggregate models. Blows against them came from many directions. To keep the exposition concise, I will concentrate on three groups of challenges: those complaining about the forecasting properties of the models, those expressing discomfort with the econometric justifications of the models, and those coming explicitly from economic theory.

In an early display of skepticism, Charles R. Nelson (1972) used U.S. data to estimate ARIMA representations of several key macroeconomic variables, such as output, inflation, unemployment, or interest rates. Nelson documented that the out-of-sample forecast errors of the ARIMA equations were systematically smaller than the errors generated by the MPS model of the U.S. economy (originally known as the FRB–MIT–PENN model, Bodkin, Klein, and Marwah 1991).

Nelson’s findings were most disappointing. Simple ARIMA models were beating the MPS model, a high mark of the Cowles Commission approach. To make things worse, the MPS was not a merely academic experiment. In its evolving incarnations, the MPS model was operationally employed by staff economists at the Federal Reserve System from the early 1970s to the mid 1990s (see Flint Brayton et al. 1997). Here we had a state-of-the-art MPS model, with all the structure incorporated in its behavioral equations (around sixty in the early 1970s) and its extra variables, and yet it could not hold its own against an ARIMA. What hope could we have then that the intellectual bet of building an aggregate macro model would pay off? The message of Nelson’s paper was strongly reinforced by the apparent inability of the existing econometric models to account for the economic maladies of the 1970s (although, to be fair, the news of the empirical failures of macroeconometric models during the 1970s were grossly exaggerated).
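Nelson’s horse race is easy to reproduce in miniature. The sketch below runs on simulated data standing in for a quarterly macro series, with an illustrative ARIMA order; it compares rolling one-step-ahead ARIMA forecasts with a naive random walk, and forecasts from a structural model would enter the comparison simply as a third set of numbers.

```python
# Sketch of a Nelson (1972)-style forecast comparison: out-of-sample RMSE
# of a univariate ARIMA benchmark versus a naive random walk. The series
# `y` is simulated and stands in for a quarterly macro aggregate.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(0)
y = np.cumsum(rng.normal(0.5, 1.0, 200))        # stand-in for log output

split = 160
arima_fc, naive_fc, actual = [], [], []
for t in range(split, len(y)):
    fit = ARIMA(y[:t], order=(1, 1, 1)).fit()   # rolling re-estimation
    arima_fc.append(fit.forecast(steps=1)[0])
    naive_fc.append(y[t - 1])                   # random-walk benchmark
    actual.append(y[t])

def rmse(f):
    return np.sqrt(np.mean((np.array(actual) - np.array(f)) ** 2))

print("ARIMA RMSE:", rmse(arima_fc), " Random-walk RMSE:", rmse(naive_fc))
```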
A second strike arose directly from within econometric theory. During the late 1970s, there was a growing dissatisfaction with the casual treatment of trends and nonstationarities that had been unfortunately widespread among large-scale Keynesian models. The explosion of research on unit roots and cointegration following the papers of Clive W. J. Granger (1981), Nelson and Charles I. Plosser (1982), and Peter C. B. Phillips (1987) highlighted how much of the underpinnings of time series analysis had to be rethought.

But the econometric problems did not stop at this point. Christopher A. Sims (1980), in his famous article on Macroeconometrics and Reality, asserted that the troubles in the econometric foundation of the conventional models ran even deeper. Elaborating on a classic article by Ta-Chung Liu (1960), Sims forcefully argued that the style in which economists connected models and data was so unfitting that it invalidated their venerable identification claims. Instead, he proposed the estimation of vector autoregressions (VARs) as flexible time series summaries of the dynamics of the economy.

Finally, a third and, for many researchers, fatal blow was the Lucas critique. Robert E. Lucas (1976) famously criticized the lack of explicit microeconomic foundations of the macroeconometric models, not because it disconnected the models from standard economic theory (which was certainly a concern in itself), but mainly because it implied that the models were not adequate for policy analysis. Relations estimated under one policy regime were not invariant to changes in the regime. Consequently, the models were unsuited to the type of exercises that economists and policymakers were actually interested in. Lucas’s argument was nothing but a particular example of a much more general reexamination of the relation between models and data brought about by the rational expectations (RE) revolution.

Some economists have contended that the empirical relevance of the Lucas critique has been overstated. For instance, Sims (1980) claimed that regime changes were rare and that most policy interventions were done within the context of a given regime (see, also, the formalization of this idea by Eric N. Leeper and Tao Zha 2003). Thus, they were suitable for analysis with a reduced-form empirical model. However, those counterarguments missed the true strength of Lucas’s insight within the professional discourse. By uncovering the flaws of large-scale structural models, Lucas destroyed the meta-narrative of the Cowles Commission tradition of model building. Suddenly, econometric models were not the linchpin of modern macroeconomics, the theoretical high ground that ambitious researchers dreamt of conquering and subjugating. No, macroeconometric models had become one additional tool, justified not by emphatic assertions of superiority, but by modest appeals to empirical performance and reasonability. But these were moderate qualities and moderation rarely rallies young researchers.

4. . . . And Transfiguration

Within this widespread sense of disappointment with the existing situation, Finn E. Kydland and Edward C. Prescott’s (1982) prototype of a real business cycle (RBC) model held an irresistible allure. Kydland and Prescott outlined a complete rework of how to build and evaluate dynamic models. Most of the elements in their paper had been present in articles written during the 1970s. However, nobody before had been able to bring them together in such a forceful recipe. In just one article, the reader could see a dynamic model, built from first principles with explicit descriptions of technology, preferences, and equilibrium conditions, a novel solution approach, based more on the computation of the model than on analytic derivations, and what appeared to be a radical alternative to econometrics: calibration. Kydland and Prescott presented calibration as a procedure for determining parameter values by matching moments of the data and by borrowing from microeconomic evidence. The model’s empirical performance was evaluated by assessing the model’s ability to reproduce moments of the data, mainly variances and covariances not employed in the calibration.
After Kydland and Prescott, dozens of researchers jumped into building RBC models, embracing computation and calibration as the hallmarks of modern macroeconomics. The basic model was explored, extended, and modified. Instead of the original productivity innovation, a multitude of shocks began to appear: shocks to preferences, depreciation, household productions, etc. Instead of a Pareto-efficient economy, economists built models with plenty of imperfections and room for an activist fiscal and monetary policy. Instead of a single agent, they built models with heterogeneity of preferences, endowments, and information sets. Even the name by which the models were known changed and, by the late 1990s, they started to be known by the tongue twister dynamic stochastic general equilibrium (DSGE) models.

Among all of these changes, the one that concerns us the most here was the slow, nearly imperceptible, but ultimately unstoppable return of econometrics as a methodology to take DSGE models to the data. Calibration was always the shakiest of all the components of the Kydland and Prescott agenda. By rejecting statistical tools, their approach attempted to push economics back to an era before Haavelmo. Matching moments of the data without an explicit formalism is, once you dig beyond rhetoric, nothing except a method of moments carelessly implemented. Borrowing parameter values from microeconomics forgets that parameters do not have a life of their own as some kind of platonic entity. Instead, parameters have meaning only within the context of a particular model. Worst of all, there was no metric to assess the model’s performance or to compare it with that of another competing model, beyond a collection of ad hoc comments on the simulated second moments of the model.
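The complaint that calibration is an informal method of moments can be made concrete. The sketch below picks the two parameters of an AR(1) so that simulated moments hit two assumed data targets, with an identity weighting matrix and no standard errors; every number in it is illustrative.

```python
# Calibration as an informal method of moments: choose (rho, sigma) of an
# AR(1) so that simulated moments match two assumed data targets. The equal
# weighting and absence of standard errors are the 'careless' parts.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
e = rng.normal(size=5000)                 # fixed shocks -> smooth objective
target = np.array([1.8, 0.85])            # assumed data moments: std, autocorr

def simulated_moments(theta):
    rho, sigma = theta
    y = np.zeros(e.size)
    for t in range(1, e.size):
        y[t] = rho * y[t - 1] + sigma * e[t]
    return np.array([y.std(), np.corrcoef(y[1:], y[:-1])[0, 1]])

def gap(theta):
    rho, sigma = theta
    if not (-1 < rho < 1) or sigma <= 0:
        return 1e6                        # keep the search in the sane region
    return np.sum((simulated_moments(theta) - target) ** 2)

res = minimize(gap, x0=[0.5, 1.0], method="Nelder-Mead")
print("calibrated (rho, sigma):", res.x)
```

A formal method of moments would replace the identity weighting with an efficient weighting matrix and deliver standard errors and an overidentification test; dropping those steps is exactly what the text means by a careless implementation.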
Nevertheless, calibration enjoyed a period of prosperity and for good reasons. First, as recalled by George W. Evans and Seppo Honkapohja (2005), the early results on the estimation of RE models were heartbreaking. Model after model was decisively rejected by likelihood ratio tests, despite the fact that many of these models seemed highly promising. Calibration solved this conundrum by changing the rules of the game. Since it was no longer important to fit the data but only a small subset of moments, researchers had a much lower hurdle to overcome and they could continue their work of improving the theory. Second, in the 1980s, macroeconomists had only primitive solution algorithms to approximate the dynamics of their models. Furthermore, those algorithms ran on slow and expensive computers. Without efficient and fast algorithms, it was impossible to undertake likelihood-based inference (although this difficulty did not affect the generalized method of moments of Lars Peter Hansen 1982). Finally, even if economists had been able to compute RBC models efficiently, most of the techniques required for estimating them employing a likelihood approach did not exist or were unfamiliar to most economists.

But the inner logic of the profession, with its constant drive toward technical positioning, would reassert itself over time and, during the 1990s, the landscape changed dramatically. There were developments along three fronts. First, economists learned to efficiently compute equilibrium models with rich dynamics. There is not much point in estimating stylized models that do not even have a remote chance of fitting the data well. Second, statisticians developed simulation techniques like Markov chain Monte Carlo (McMc), which we require to estimate DSGE models. Third, and perhaps most important, computers became so cheap and readily available that the numerical constraints were radically eased. By the late 1990s, the combination of these three developments had placed macroeconomists in a position not only where it was feasible to estimate large DSGE models with dozens of equations, but where it was increasingly difficult to defend any reasonable alternative to formal inference.

And indeed DSGE models began to be estimated. A class of DSGE models that has enjoyed a booming popularity both in terms of estimation and policy applications is the one that incorporates real and nominal rigidities. The structure of this type of model, as elaborated, for example, by Michael Woodford (2003), is as follows. A continuum of households maximizes their utility by consuming, saving, holding real money balances, supplying labor, and setting their own wages subject to a demand curve and Calvo’s pricing (e.g., the households can reoptimize their wages in the current period with an exogenous probability; otherwise they index them to past inflation). A competitive producer manufactures the final good by aggregating a continuum of intermediate inputs. The intermediate inputs are produced by monopolistic competitors with capital and labor rented from the households. In the same way workers do, the intermediate good producers face the constraint that they can only change prices following a Calvo’s rule. Finally, the monetary authority fixes the one-period nominal interest rate through open market operations with public debt. Other standard features of this model include varying capacity utilization of capital, investment adjustment costs, taxes on consumption, labor, and capital income, monetary policy shocks, and quite often, an international economy sector (as described in Stephanie Schmitt-Grohé and Martín Uribe 2003).
Given their growing complexity, these models with real and nominal rigidities are also sometimes known as medium-scale DSGE models, although some of them, running up to eighty or ninety stochastic equations (as many as early versions of the MPS model, for example), hardly deserve the title. Since these models are particularly congenial for monetary analysis, numerous central banks worldwide have sponsored their development and are actively employing them to help in the formulation of policy. Examples include the models by the Federal Reserve Board (Rochelle M. Edge, Michael T. Kiley, and Jean-Philippe Laforte 2007), the European Central Bank (Frank Smets and Raf Wouters 2003 and Kai Christoffel, Günter Coenen, and Anders Warne 2006), the Bank of Canada (Stephen Murchison and Andrew Rennison 2006), the Bank of England (Richard Harrison et al. 2005), the Bank of Finland (Juha Kilponen and Antti Ripatti 2006 and Mika Kortelainen 2002), the Bank of Sweden (Malin Adolfson et al. 2005), and the Bank of Spain (Javier Andrés, Pablo Burriel, and Ángel Estrada 2006), among several others.

With medium-scale models, macroeconomists have responded to the three challenges listed in section 3. With respect to forecasting, and despite not having been designed for this purpose, DSGE models do a reasonable job of forecasting at short run horizons and a remarkable one at medium horizons. And they do so even when compared with flexible reduced-form representations like VARs (as reported for the Euro Area-Wide model of the European Central Bank by Christoffel, Coenen, and Warne 2007). Second, many of the models explicitly incorporate unit roots and the resulting cointegrating relations implied by the balanced growth properties of the neoclassical growth model at the core of the economy. These aspects receive an especially commendable treatment in Edge, Kiley, and Laforte (2007). Third, modern DSGE models are estimated using the equilibrium conditions of the economy, eliminating the requirement of specifying the exclusionary restrictions that Sims objected to (although, unfortunately, not all identification problems are eliminated, as documented by Nikolay Iskrev 2007). Fourth, since we estimate the parameters describing the preferences and technology of the model, the Lucas critique is completely avoided. Thus, it looks like we are back where we thought we were at the late 1960s: with good and operative aggregate models ready for prime time. But are we really there?

5. Estimating How the Macroeconomy Works in Context
Fair’s book takes a more skeptical view of the ability of modern DSGE models—not directly, as it does not discuss those models in detail, but implicitly, through its adherence to the traditional Cowles Commission approach. Fair is to be applauded for his position: first, and foremost, because there is much of value in the Cowles Commission framework that is at risk of being forgotten. His books may play a decisive role in the transmission of that heritage to newer generations of researchers. Second, because we work in a profession where there is a strong tendency to embrace the latest fad and to ignore existing accomplishments. And third, because intellectual diversity is key to maintaining a vibrant econometric community.

Fair’s MC model is composed of two sections (or submodels): the U.S. model and the rest of the world model. The U.S. model has six sectors: households, firms, financial, foreign, federal government, and state and local governments. Their behavior is captured through thirty stochastic equations (nine for the household sector, twelve for firms, five for the financial sector, one for imports, and three for government). In addition, there are 101 accounting identities. This makes the U.S. model one of moderate size, especially if we compare it with other models in the Cowles Commission tradition. For instance, the Washington University macro model marketed by Macroeconomic Advisers for commercial applications has around 442 equations.[3]

[3] See http://www.macroadvisers.com/content/WUMM2000_overview.pdf, accessed on January 7, 2008.

The rest of the world model is composed of thirty-eight countries for which structural equations are estimated (thirteen with quarterly data and twenty-five with annual data) and twenty countries for which only trade share equations are estimated. For the thirty-eight countries with structural equations, Fair estimates up to fifteen stochastic equations, the same for all of the countries but with possibly different coefficients. Then, the United States and the rest of the world components are linked through a set of equations that pin down trade shares and world prices (which generates an additional equation for the U.S. model). When we put together the two sections of the MC model, we obtain a total of 362 stochastic equations, 1,646 estimated coefficients, and 1,111 trade share equations.

The basic estimation technique of the model’s equations is two-stage least squares (2SLS). The data for the United States begin in 1954 and as soon as possible for other countries, and end between 1998 and 2002. Fair also performs an extended battery of tests for each equation, including tests for additional variables, for time trends, for serial correlations of the error term, and for lead variables, or Donald W. K. Andrews and Werner Ploberger (AP) (1994) stability tests, among others. Finally, there is some testing of the complete model, basically by examining the root mean squared error between the predicted and actual values of the variables, and exercises in optimal control using certainty equivalence.
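As a reminder of the mechanics of 2SLS, here is a self-contained sketch on simulated data (all variable names are made up): one regressor is correlated with the structural error, so OLS is inconsistent, while the two explicit OLS stages recover the true coefficients.

```python
# Two-stage least squares written as two explicit OLS steps, on simulated
# data. X contains an endogenous regressor; Z collects the constant and
# the instruments. Everything here is illustrative.
import numpy as np

rng = np.random.default_rng(2)
n = 500
z = rng.normal(size=(n, 2))                    # instruments
u = rng.normal(size=n)                         # structural error
x_endog = z @ np.array([1.0, -0.5]) + 0.8 * u + rng.normal(size=n)
X = np.column_stack([np.ones(n), x_endog])     # constant + endogenous regressor
Z = np.column_stack([np.ones(n), z])           # constant + instruments
y = X @ np.array([0.3, 1.5]) + u               # true beta = (0.3, 1.5)

# Stage 1: project X on Z. Stage 2: OLS of y on the fitted values.
X_hat = Z @ np.linalg.lstsq(Z, X, rcond=None)[0]
beta_2sls = np.linalg.lstsq(X_hat, y, rcond=None)[0]
beta_ols = np.linalg.lstsq(X, y, rcond=None)[0]
print("2SLS:", beta_2sls, " OLS (biased):", beta_ols)
```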
The spirit of the estimated equations is firmly rooted in the Keynesian tradition and allows for the possibility of markets not clearing. For example, consumption depends on income, an interest rate, and wealth. Investment depends on output and an interest rate. Prices and wages are determined based on labor productivity, import prices, unemployment, and lagged prices and wages. Besides, the model accounts for all balance-sheet and flow-of-funds constraints, as well as for demographic structures.

Instead of spending time on a detailed explanation of the model and the different applications, I will outline several aspects of Fair’s work that I find of interest, particularly in relation to the practice of other researchers. Nevertheless, the reader is invited to consult Fair’s book and his web page (http://fairmodel.econ.yale.edu/main2.htm) for the details of the model. One of the most outstanding features of Fair’s work has always been his disclosure policy, a true gold standard for the profession. The book provides an extremely detailed description of the model, with all of the equations and data sources and definitions. On Fair’s web page, the visitor can obtain the latest updates of the model[4] and the accompanying documentation, memos, and papers. The interested researcher can also download the Fair–Parke program used for computations of the model: point estimation, bootstrap, forecast, simulation, and optimal control. The software is available not only in an executable form, but also as the source code in FORTRAN 77, which is a treat for those economists concerned with numerical issues.

[4] All operational macroeconometric models are living entities. They require constant care and incorporation of new information.

5.1 National Income and Product Accounts

The first example of the important lessons from the Cowles Commission approach I want to discuss is the need for a thoughtful treatment of the relation between National Income and Product Accounts (NIPA) and the definitions of variables in the model. Jacob Marschak, when he organized the work of the Cowles Commission at the University of Chicago in 1943–44, considered that data preparation was such an important part of the process that he put it on the same level as model specification and statistical inference.

NIPA are a complicated and messy affair. The task in front of statisticians is intimidating: to measure the production, income, and inputs of a large, complex economy, with thousands and thousands of ever-changing products and with active trading with the rest of the world. To make things worse, the budget allocated to national statistics institutes like the BEA is awfully limited (the president’s budget request for the BEA for fiscal year 2008 was $81.4 million). Consequently, the BEA needs to make numerous simplifying assumptions, implement various shortcuts, and rely upon surveys with sample sizes that are too small for the task at hand.

But even if we gloss over the data collection difficulties, there is the open question of the mapping between theory and measurement. The objects defined in our models are not necessarily the objects reported by the BEA. For instance, the BEA computes real output by deflating nominal output by a deflator that weights consumption, investment, government consumption, and net exports. However, following the work of Jeremy Greenwood, Zvi Hercowitz, and Per Krusell (1997 and 2000), investment-specific technological change has become a central source of long run growth and aggregate fluctuations in many estimated DSGE models. Investment-specific technological change induces a relative price for capital in terms of consumption that is not trivially equal to one as in the standard neoclassical growth model. This implies that in these models, to be consistent with the theory, we need to define real output as nominal output divided by the consumption deflator. In this and many other cases, the econometrician has to readjust the data to suit her requirements. Pages 180 to 188 of the book and the 101 identities in the model (equations 31 to 101) show how, over the years, Fair has done an incredibly heedful job of mapping theory and data.

In a sad comparison, sometimes DSGE macroeconometric models are not nearly as thorough in their treatment of NIPA data. Moreover, and most unfortunately, extended discussions of NIPA data are often frowned upon by editors and referees, who ask either that they be cut or sent to remote appendices. It is not clear to me what the value is of employing a state-of-the-art econometric technique if we end up feeding it with inferior measurement.

5.2 Number of Macro Variables

The Cowles Commission simultaneous equations models employ many more data series than the new DSGE models. The difference is due to the problem of stochastic singularity. A linearized, invertible DSGE model with N shocks can only nontrivially explain N observables. All the other observables are linear combinations of the first N variables. Consequently, any observation of a variable not included in the first N observables that is different from the one predicted by the model implies a likelihood of −∞.

This point is seen more clearly with a simple example. Imagine that we have a linearized benchmark RBC model with a productivity and a demand shock. Since we have two shocks, we can account for two observables. Let us suppose that the econometrician chooses instead as her observables three variables: consumption, hours, and investment. Then, it is easy to show that investment in this model has to be equal to a linear combination of consumption and hours that does not have a random component (more generally, any of the three observables is a deterministic linear combination of the other two). Hence, any observation for investment that is different from the previous linear combination cannot be predicted by the model. Since, in reality, investment data do not neatly fall into the linear function of consumption and hours imposed by the model, we will compute a likelihood of −∞ with probability 1.
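The likelihood of −∞ is not a figure of speech. In the sketch below, with arbitrary loadings standing in for a linearized two-shock model, the implied covariance matrix of three observables has rank two, and the third observable is an exact linear combination of the other two.

```python
# Stochastic singularity in miniature: a linear model with two shocks makes
# any third observable an exact linear combination of the other two, so the
# Gaussian density of the three observables is degenerate. The loadings are
# arbitrary stand-ins for a linearized RBC solution.
import numpy as np

rng = np.random.default_rng(3)
Lambda = np.array([[1.0, 0.2],    # consumption loadings on the two shocks
                   [0.5, 1.0],    # hours
                   [1.5, 1.2]])   # investment
shocks = rng.normal(size=(200, 2))
c, h, i = (shocks @ Lambda.T).T   # model-implied observables

Sigma = Lambda @ Lambda.T         # implied covariance of (c, h, i)
print("rank of Sigma:", np.linalg.matrix_rank(Sigma))   # 2, not 3

w = np.linalg.solve(Lambda[:2].T, Lambda[2])  # weights with i = w0*c + w1*h
print("max |i - (w0*c + w1*h)|:", np.abs(i - (w[0] * c + w[1] * h)).max())
# Actual investment data violate this identity, hence the -inf likelihood.
```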
Even if the number of shocks in medium-scale DSGE models has grown to a number between ten and twenty, we are still quite far away from the 248 variables of the U.S. model in Fair’s book. Since it seems plausible that additional data will bring extra information to the estimation, the use of a large number of macro variables may be an important advantage of the MC model.

How can we bridge the distance between the Cowles Commission models and the DSGE models in terms of variables used for the estimation? One simple possibility would be to increase the number of shocks in the model. However, even with around twenty structural shocks, as in the richest of the contemporary DSGE models, we are already straining our ability to identify the innovations in the data. Worse still, the economic histories that we tell to justify them are becoming dangerously thin.

A second approach is to introduce measurement error in the observed variables. As we argued before, there are plenty of sources of noise in the data that can be plausibly modeled as measurement error (see Thomas J. Sargent 1989 for an exposition of this argument). However, introducing noise in the observables through measurement error is dangerous. We risk falling into a situation where the measurement error is explaining the dynamics of the data and the equilibrium model is a nuisance in the background of the estimation. In addition, measurement errors complicate identification.
that are not necessarily model consistent, Instead, Boivin and Giannoni exploit the rel- two phenomena that are hard to separate evant information from a data-rich environ- using macro data. ment by assuming that the variable implied by Fair recognizes that his choice puts him the model is not directly observed. Instead, in the minority in the profession, but that he it is a hidden common factor for a number sees several advantages in it. First, RE are of observables that are merely imperfect cumbersome to work with. Without them, one indicators of the exact variables. Boivin and can safely stay within the traditional Cowles Giannoni show that their idea delivers an Commission framework, since the Lucas cri- accurate estimation of the model’s theoreti- tique is unlikely to hold. Second, Fair does cal variables and shocks. Furthermore, the not consider it plausible that enough peo- estimates imply novel conclusions about key ple are so sophisticated that RE are a good structural parameters and about the sources approximation of actual behavior. Third, and of economic fluctuations. related to the previous point, he finds in his Boivin and Giannoni’s work also helps to estimates a considerable amount of evidence overcome another difficulty of DSGE mod- against RE and little in its favor. els. We mentioned before that because of sto- The first objection is intimately linked with chastic singularity, we can use only a reduced the fact that dynamic equilibrium models number of variables in the estimation. This with RE rarely imply closed-form solutions. raises the question of which variables to We need to resort to the computer to obtain select for the estimation. Imagine that we numerical approximations, which have prob- have a model with three shocks. Thus, we can lems and limitations of their own. For exam- employ at most three variables. But which ple, we can rarely evaluate the exact likelihood ones? Should we pick output, consumption, of the model but only the likelihood of the and hours? Or investment, consumption, and numerically approximated model. This causes Fernádez-Villaverde: Horizons of Understanding 695 problems for estimation (Jesús Fernández- ­structural change or from stochastic volatil- Villaverde, Juan F. Rubio-Ramírez, and ity in the innovations that drive the dynamics Manuel S. Santos 2006). Also, RE impose a of the economy, as emphasized by Thomas A. tight set of cross-equation restrictions that Lubik and Paolo Surico 2006). are difficult to characterize and that limit our Fair’s second objection to RE—that a intuitive understanding of the model. majority of people are not sophisticated However, computational complexity is enough to have model consistent expecta- less of an issue nowadays. Recently, impor- tions—is more appealing and intuitive. It also tant progress has been made in the solu- opens important questions for macroecono- tion and estimation of models with RE. metricians. One is the role of heterogeneity Macroeconomists have learned much of beliefs in macro models. If expectations about projection and perturbation meth- are not rational, there is no strong rea- ods (Kenneth L. Judd 1992, Judd and son why we should expect that beliefs are Sy-ming Guu 1992 and S. Borag�an Aruoba, homogeneous across agents. But, then, we Fernández-Villaverde, and Rubio-Ramírez need to think about how this heterogeneity 2006). Problems that looked intractable a few aggregates into allocations and prices. 
5.3 Rational Expectations

Fair does not assume that expectations are rational. In fact, it is one of the very first things he tells the reader (as early as page 4): it is assumed that expectations are not rational. Instead, he includes lagged dependent variables as explanatory variables in many of the equations and finds that they are significant even after controlling for any autoregressive properties of the error term. These lagged variables may capture either partial adjustment effects or expectational effects that are not necessarily model consistent, two phenomena that are hard to separate using macro data.

Fair recognizes that his choice puts him in the minority in the profession, but that he sees several advantages in it. First, RE are cumbersome to work with. Without them, one can safely stay within the traditional Cowles Commission framework, since the Lucas critique is unlikely to hold. Second, Fair does not consider it plausible that enough people are so sophisticated that RE are a good approximation of actual behavior. Third, and related to the previous point, he finds in his estimates a considerable amount of evidence against RE and little in its favor.

The first objection is intimately linked with the fact that dynamic equilibrium models with RE rarely imply closed-form solutions. We need to resort to the computer to obtain numerical approximations, which have problems and limitations of their own. For example, we can rarely evaluate the exact likelihood of the model but only the likelihood of the numerically approximated model. This causes problems for estimation (Jesús Fernández-Villaverde, Juan F. Rubio-Ramírez, and Manuel S. Santos 2006). Also, RE impose a tight set of cross-equation restrictions that are difficult to characterize and that limit our intuitive understanding of the model.

However, computational complexity is less of an issue nowadays. Recently, important progress has been made in the solution and estimation of models with RE. Macroeconomists have learned much about projection and perturbation methods (Kenneth L. Judd 1992, Judd and Sy-ming Guu 1992, and S. Borağan Aruoba, Fernández-Villaverde, and Rubio-Ramírez 2006). Problems that looked intractable a few years ago are now within the realm of feasibility. In particular, perturbation allows us to handle DSGE models with dozens of state variables and obtain nonlinear solutions of the desired accuracy. Moreover, also thanks to perturbation, we can easily solve nonlinear optimal control problems in these models without imposing certainty equivalence (see, for example, the Ramsey policy design in Schmitt-Grohé and Uribe 2006).

Estimation methods have also advanced quickly, and we have powerful new tools such as McMc and sequential Monte Carlo (SMC) algorithms that allow us to handle, through simulation methods, extremely challenging estimation exercises (see Sungbae An and Frank Schorfheide 2007 for McMc and Fernández-Villaverde and Rubio-Ramírez 2005 and 2007 for an introduction to SMC algorithms).
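For readers who have not seen McMc in action, a random-walk Metropolis sampler fits in a dozen lines. This sketch targets the posterior of an AR(1) coefficient under a flat prior; in a DSGE application the log posterior would instead be evaluated with a Kalman or particle filter.

```python
# Random-walk Metropolis, the workhorse McMc step behind most Bayesian
# estimation, shown on the posterior of an AR(1) coefficient rho with a
# flat prior on (-1, 1). The data are simulated and sigma is set to 1.
import numpy as np

rng = np.random.default_rng(5)
y = np.zeros(400)
for t in range(1, 400):
    y[t] = 0.7 * y[t - 1] + rng.normal()      # true rho = 0.7

def log_post(rho):
    if not -1 < rho < 1:
        return -np.inf                        # flat prior on (-1, 1)
    resid = y[1:] - rho * y[:-1]
    return -0.5 * np.sum(resid ** 2)          # Gaussian conditional likelihood

draws, rho = [], 0.0
for _ in range(20000):
    prop = rho + 0.05 * rng.normal()          # random-walk proposal
    if np.log(rng.uniform()) < log_post(prop) - log_post(rho):
        rho = prop                            # accept; otherwise keep rho
    draws.append(rho)

print("posterior mean:", np.mean(draws[2000:]))  # burn-in discarded
```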

Therefore, one could argue that it is more fruitful to solve directly for the equilibrium of the model and derive the estimating equation implied by it, instead of searching for empirical structures. In that way, we avoid encounters with the Lucas critique independently of whether or not it is empirically relevant (a tough question in itself because of the difficulty in separating regime changes in the estimation from other sources of structural change or from stochastic volatility in the innovations that drive the dynamics of the economy, as emphasized by Thomas A. Lubik and Paolo Surico 2006).

Fair’s second objection to RE—that a majority of people are not sophisticated enough to have model consistent expectations—is more appealing and intuitive. It also opens important questions for macroeconometricians. One is the role of heterogeneity of beliefs in macro models. If expectations are not rational, there is no strong reason why we should expect that beliefs are homogeneous across agents. But, then, we need to think about how this heterogeneity aggregates into allocations and prices. For instance, we could have a situation where the only beliefs that matter for macro variables are the ones of the marginal agent. This situation resembles the results in finance where the pricing of securities is done by the marginal agent. If this agent is unconstrained, the existence of widespread borrowing constraints or other market imperfections has a negligible quantitative impact. Equally, with belief heterogeneity, it could be the case that the only expectations that matter are those of a small subset of agents that have RE or something close to it.

This intuition is explored in many papers. Lawrence Blume and David Easley (2006) present a particularly striking finding. They show that in any Pareto-optimal allocation with complete markets, all consumers will see their wealth vanish except the one whose beliefs are closest to the truth. Of course, there are many caveats. We require Pareto optimality, the theorem deals only with wealth distributions (and not, for instance, with labor supply), and the findings characterize the limit behavior of these economies.[5]

[5] And the long run may be far away. Think of the case where the divergent beliefs are about a low frequency feature of the data. It may take an inordinately long time before the beliefs have an impact on wealth distributions.

Regardless of one’s view of the empirical relevance of Blume and Easley’s result, departing from RE leads us quite naturally to heterogeneity of beliefs and to the need to develop econometric techniques to incorporate this heterogeneity.

A second question arising from the departure from RE is the modeling of learning. A classical defense of RE is the idea that relatively simple learning algorithms such as least-squares converge to the RE equilibrium (see Albert Marcet and Thomas Sargent 1989). Perhaps not surprisingly, this classical defense is easily broken in several contexts where we have self-confirming equilibria (SCE) in which the agents have the wrong beliefs about the world but there is nothing in the data that forces them to change their subjective distributions. One of the most remarkable facts about SCE is how easy it is to build examples where SCE exist. Disappointingly, as in the case of heterogeneous beliefs, it seems that macroeconometricians have not investigated this issue in detail (see, however, the pioneering work of Giorgio E. Primiceri 2006 and Sargent, Noah Williams, and Zha 2006).

All of which brings us back to the question of how to assess the importance of the departure from RE, an issue not terribly well understood. Krusell and Anthony A. Smith (1998) demonstrated, in the context of computing models with heterogeneous agents, that complicated forecasting rules can be approximated with fantastic accuracy by simple OLS regressions. Thus, even if the agents do not have RE, it is conceivable that their behavior in certain models may be well described by RE. Since assuming RE often makes our life easier by eliminating free parameters, we may as well do so.

Fair’s final point deals with the empirical evidence on RE. This argument requires a nuanced assessment. Testing for RE is a challenging task. Commonly, the proposed tests are really joint tests of RE and other assumptions in the model. Imagine, for example, that we have a model like Robert E. Hall’s (1978) where RE imply that consumption follows a random walk. If the data reject this prediction, it may be either because RE fail, because households face borrowing constraints, or because the parametric form of the utility function is misspecified. Disentangling all these possibilities requires, at best, considerable ingenuity, and at worst, it may not be feasible. Finally, RE appear in the model through complicated cross-equation restrictions. Because of this, it is not immediately obvious why it is possible in general to test for RE by adding lead values of variables in the equations of the model, as argued in the book. On many occasions, the solution of the DSGE model implies that only state variables, and not any lead variable, appear in the relevant equation for an observable. Testing for the presence of that lead variable then tells us that this variable is not significant despite RE being present in the data.
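The Hall example can be written out in a few lines. In this sketch the data are generated so that the random-walk hypothesis holds by construction; on actual data, a significant coefficient on lagged income would reject the joint hypothesis without telling us which ingredient (RE, the utility function, or frictionless borrowing) failed.

```python
# Sketch of a Hall (1978)-style orthogonality test: under RE plus the
# auxiliary assumptions, lagged income should not predict consumption
# changes. Data are simulated so that the null holds by construction.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(6)
T = 300
income = np.cumsum(rng.normal(size=T))
cons = np.cumsum(rng.normal(size=T))          # random walk: the null holds

dc = np.diff(cons)                            # consumption changes
X = sm.add_constant(income[:-1])              # lagged information set
res = sm.OLS(dc, X).fit()
print("coef on lagged income:", res.params[1], " p-value:", res.pvalues[1])
```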
5.4 Treatment of Nonstationarities

A second point of departure of Fair’s MC model from much of the literature is the treatment of nonstationarities. Fair observes that it is exceedingly difficult to separate stochastic from deterministic trends and that, consequently, it seems logical to follow the easiest route. Moreover, the point estimates will not be seriously affected, and the standard errors, whose asymptotic approximation breaks down in the nonstationary case, can always be approximated by the bootstrap. Hence, the working hypothesis of the book is that variables are stationary around a deterministic trend.

It is easy to be sympathetic to Fair’s pragmatism. Much work went into the study of nonstationarity during the 1980s and 1990s. At the end of the day, many researchers felt (paraphrasing Lawrence J. Christiano and Martin Eichenbaum’s 1990 article) that we do not know and probably we do not care that much. It is even harder to see why nature should have been so kind to econometricians as to have neatly divided time series into those with deterministic trends and those with unit roots. More complex specifications, such as fractional integration and other long memory processes, are most likely a better description of the data (see Peter M. Robinson 2003).[6]

[6] This observation raises the issue of why more structural models are not estimated using more flexible stochastic processes for shocks.

I have only two small arguments to add to Fair’s well-constructed argument. First, simplicity sometimes runs in favor of unit roots. Often, the presence of a unit root allows us to rewrite the model in a more convenient parameterization. This is the case, for example, in DSGE models with long run growth, where the presence of a stochastic trend simplifies derivations. Pragmatism can now be employed to justify the use of unit roots. Second, one may want to be careful when designing and evaluating a bootstrap with respect to the way in which nonstationarity appears in the model, since it will have an impact on the simulation efficiency.
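Fair’s bootstrap argument is easy to illustrate: estimate an autoregression around a deterministic trend, then obtain standard errors by resampling residuals and rebuilding the series recursively. The sketch below uses simulated data; a serious application would mimic the model’s actual specification when resampling.

```python
# Residual bootstrap for an AR(1) around a deterministic trend, the kind of
# approximation Fair relies on for standard errors. Data are simulated.
import numpy as np

rng = np.random.default_rng(7)
T, a, b, rho = 200, 0.5, 0.02, 0.6
e = rng.normal(size=T)
dev = np.zeros(T)
for t in range(1, T):
    dev[t] = rho * dev[t - 1] + e[t]
y = a + b * np.arange(T) + dev                   # trend-stationary data

def estimate(series):
    X = np.column_stack([np.ones(T - 1), np.arange(1, T), series[:-1]])
    return np.linalg.lstsq(X, series[1:], rcond=None)[0]

beta = estimate(y)
X = np.column_stack([np.ones(T - 1), np.arange(1, T), y[:-1]])
resid = y[1:] - X @ beta

boots = []
for _ in range(500):                             # recursive residual bootstrap
    yb = np.zeros(T)
    yb[0] = y[0]
    eb = rng.choice(resid, size=T)
    for t in range(1, T):
        yb[t] = beta[0] + beta[1] * t + beta[2] * yb[t - 1] + eb[t]
    boots.append(estimate(yb))

print("bootstrap s.e. of (const, trend, lag):", np.std(boots, axis=0))
```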
5.5 Stability of Estimates

Nonstationarities are not the only source of change over time that researchers should be concerned about. One aspect that I always found salient in the data is parameter drifting. Fair’s results seem to uncover strong evidence of this variation in parameters over time. To illustrate this, let us examine equation 1 in the model, which explains the log of consumption of services (CS) in real per capita terms (divided by POP). Both in the version in Testing Macroeconometric Models (1994) and in the version in Estimating How the Macroeconomy Works (2004), the right-hand-side (RHS) variables are the same except for total net wealth per capita (AA/POP) and a time trend (T) in the second version: a constant, the percent of the population over sixteen that is between twenty-six and fifty-five years old minus the percent between sixteen and twenty-five (AG1), the percent of the population over sixteen that is between fifty-six and sixty-five years old minus the percent between sixteen and twenty-five (AG2), the percent of the population over sixteen that is older than sixty-six minus the percent between sixteen and twenty-five (AG3), the lagged log of per capita real services consumption, real per capita disposable income (YD/(POP*PH)), and the after-tax bill rate (RSA). The point estimates (with the t-statistics in parentheses) are reproduced in table 1.

Table 1
Equation 1, log (CS/POP)

RHS Variable          Estimate (1954:1–1993:2)    Estimate (1954:1–2002:3)
constant                0.087    (2.17)             0.0572   (1.48)
AG1                    −0.2347  (−2.86)            −0.3269  (−4.40)
AG2                     0.2293   (0.99)            −0.3907  (−2.91)
AG3                     0.2242   (1.14)             0.7687   (4.89)
log (CS/POP)−1          0.9425  (29.58)             0.7873  (19.31)
log (YD/(POP*PH))       0.057    (1.88)             0.1058   (5.75)
RSA                    −0.0009  (−3.93)            −0.0012  (−5.75)
log (AA/POP)−1            —                         0.0172   (3.50)
T                         —                         0.0004   (4.42)

We see, for instance, how the coefficient on AG2 has gone from being 0.2293, a positive (but not significant) value that was interpreted as supportive of the life cycle theory of consumption, to −0.3907, a negative and significant value that seems more difficult to reconcile with standard consumption theories. Maybe more relevant, the coefficient on lagged services consumption has fallen from 0.9425 to 0.7873, a considerable reduction in the persistence of the variable over time.

Fair fully recognizes this feature of the equation by using an AP stability test to detect a break in the late 1970s. The break (perhaps a manifestation of a smoother change in the parameters over time) may be interpreted as an indication of changes in the underlying structure of the economy, as the reaction to policy regime changes, or as a sign of misspecification. In fact, detecting changes in coefficients is not a problem of the model, but a proof of the usefulness of employing econometrics to organize our thinking about the data. Interestingly, there is also considerable evidence of parameter drifting when we estimate DSGE models (Fernández-Villaverde and Rubio-Ramírez 2008).

Another possibility behind the changes in parameters over time may be the presence of stochastic volatility. There is growing evidence that time-varying volatility is key to understanding the evolution of the U.S. economy (Margaret M. McConnell and Gabriel Pérez-Quirós 2000 and Sims and Zha 2006). The most important phenomenon discovered by this literature has been the big drop in observed volatility of macroeconomic aggregates in the United States over the last several decades. Stock and Watson (2003) have coined the term the great moderation to refer to this observation. Even if least-squares type estimates will still be consistent in the presence of heteroskedasticity of the shocks, the standard errors will be misleading. Moreover, the small sample distributions of the estimates will cause an undue instability of the parameters. The only way to test whether we face parameter drifting or stochastic volatility of the shocks (or a combination of both) is to estimate models with both characteristics and hope that the data are sufficiently informative to tell us about the relative importance of each channel.
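Short of estimating a full model with both drifting parameters and stochastic volatility, rolling-window regressions give a quick, informal look at the kind of instability table 1 displays. The sketch below uses simulated data with a deliberate break; the window width is arbitrary.

```python
# Rolling-window least squares, a quick way to eyeball parameter drift of
# the kind Table 1 suggests. Simulated data with a mid-sample break in the
# autoregressive coefficient; the window width is arbitrary.
import numpy as np

rng = np.random.default_rng(8)
T, window = 400, 80
rho_path = np.where(np.arange(T) < 200, 0.9, 0.7)    # break at t = 200
y = np.zeros(T)
for t in range(1, T):
    y[t] = rho_path[t] * y[t - 1] + rng.normal()

for start in range(0, T - window, window):
    seg = y[start:start + window]
    rho_hat = np.linalg.lstsq(seg[:-1, None], seg[1:], rcond=None)[0][0]
    print(f"window {start:3d}-{start + window:3d}: rho = {rho_hat:.2f}")
```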

5.6 Classical Estimates versus the Bayesian Approach

Fair’s work is firmly positioned within the classical framework for inference. Most econometricians will find his approach familiar and intuitive. This classical heritage contrasts with much of the estimation of DSGE models, which has been done predominantly from a Bayesian perspective.

This growing popularity of Bayesian methods is one of the newest aspects of the revival of macroeconometrics that we described in section 4. Researchers have been attracted to the Bayesian paradigm for several reasons. First, Bayesian analysis is a comprehensive framework for inference based on decision theory. This foundation allows a smooth transition from the results of the model to policy advice. Second, the Bayesian approach is well suited to deal with misspecification and lack of identification, both important problems in DSGE modeling (Fabio Canova and Luca Sala 2006; Iskrev 2007). Furthermore, Bayesian estimators perform well in small samples and have desirable asymptotic properties, even when evaluated by classical criteria (Fernández-Villaverde and Rubio-Ramírez 2004). Third, priors allow the introduction of presample information, which is often available and potentially informative. Finally, Bayesian methods are often numerically more robust than classical ones such as maximum likelihood.

Of course, there are also problems with the Bayesian approach. Many economists express discomfort about the use of priors and about the cost of eliciting them in highly parameterized models. Extensive prior robustness analysis is rarely conducted. Hence, it is often hard to assess the influence of priors on final estimates. Second, even the most robust numerical methods may hide subtle convergence problems. Finally, Bayesian thinking is still unfamiliar to many economists, making it difficult to transmit substantive results in an efficient way.

Given the different approaches to inference, a natural exercise would be to reestimate Fair’s model with Bayesian priors (although we would need to take care to not be too influenced in the choosing of the priors by what we already know from the 2SLS estimates). Then, we could assess the extent to which the estimates are different and whether the model’s empirical performance increases or decreases. Such an experiment contrasting methodologies could be most instructive for both classical and Bayesian researchers and for policy makers interested in the use of macroeconometric models.

5.7 Microeconomic Data

An important feature of the MC model is the attempt to incorporate microeconomic data. For example, the demographic structure of the population appears in the consumption equations. Also, the flow-of-funds data are employed to link the different sectors in the model.

This is another aspect on which the large macroeconometric models of the 1960s and 1970s focused much attention and is an area where DSGE models still have much to learn. Most dynamic equilibrium models use a representative agent framework. In comparison, microeconometricians have emphasized again and again that individual heterogeneity is the defining feature of micro data (see Martin Browning, Lars Hansen, and James J. Heckman 1999 for the empirical importance of individual heterogeneity and its relevance for macroeconomists). Thus, DSGE models need to move away from the basic representative agent paradigm and include richer configurations. Of course, this raises the difficult challenge of how to effectively estimate these economies. Right now, DSGE models are estimated without individual heterogeneity just because we do not know how to do it otherwise. Hopefully, the next few years will bring improvements along this front.
5.8 The Forecasting Record of the MC Model versus Alternatives

One of the most staggering aspects of Fair’s work over time has been his unrelenting determination to test his model, both in absolute terms and in comparison with existing empirical alternatives.

A natural benchmark test is the model’s forecasting record. We can go back in time and see how well the model accounts for the observations, both in sample and outside sample. However, before doing that, and since we are dealing with aggregate data, the researcher needs to take a stand in relation to the revisions of the data and methodological changes. Statistical agencies are constantly revising data, both to incorporate further information (preliminary data release, first data release, second release, the first major revision, and so on) and to update their definitions to reflect advances in economic theory and measurement. The issue faced by all macro modelers is how to incorporate those changes in a consistent way. One possibility, followed by Fair (and that, personally, I think is the most reasonable), is to always use the most recent vintage of data. This amounts to asking the model to account for what we currently think actually happened.

This position, however, faces the difficulty that agents may be basing their decisions on real-time data. Hence, if we try to understand their behavior using final data, we may discover that we cannot properly explain their actions. We do not know much about the size of this bias. Perhaps, at the individual level, the effect is not very big because the agents, to a first approximation, need to know only a few prices (like the wages and the interest rate they face) that are directly observable. On the other hand, the government may find that using real-time data complicates their task substantially. Athanasios Orphanides (2001) has shown that real-time policy recommendations differed considerably from those obtained with ex post revised data during the 1970s.

Chapter 14 of the book compares the U.S. model with two alternatives—a seven variable VAR and an autoregressive components (AC) model where each component of output is regressed against itself and summed to generate forecasted GDP.[7] Fair finds that the U.S. model does better than the VAR and nearly always better than the AC model. Nevertheless, there is evidence that the VAR and the AC contain some independent information not included in the U.S. model (although this additional information seems small).

[7] There is a third model, Fair’s U.S. model extended with eighty-five auxiliary autoregressive equations for each of the exogenous variables. I will not discuss this model because of space constraints.

Chapter 14 motivates several additional exercises. First, we could include a Bayesian VAR in the set of comparing models. A common problem of classical VARs is that they are copiously parameterized given the existing data, overfitting within sample and performing poorly outside sample. By introducing presample information, we can substantially improve the forecasting record of a VAR. A second possibility is to impose cointegrating relations and estimate a vector error correction model (VECM). The long-run restrictions keep the model’s behavior tightly controlled and enhance its forecasting capabilities in the middle run. Third, we could estimate a VAR or a VECM with parameter drifting or stochastic volatility. Fourth, it would also be valuable to assess the performance of the rest of the world model. Finally, and in my opinion the most interesting exercise, would be to compare the forecasting record of the MC model with that of a modern DSGE model. Given the increased prominence of estimated DSGE models in policy making institutions, this evaluation will help us to gauge the empirical success of each approach.[8]

[8] A first but incomplete step in this comparison is done in Fair (2007).
Chapter 14 of the book compares the U.S. model with two alternatives—a seven-variable VAR and an autoregressive components (AC) model, where each component of output is regressed on its own lagged values and the component forecasts are summed to generate forecasted GDP.7 Fair finds that the U.S. model does better than the VAR and nearly always better than the AC model. Nevertheless, there is evidence that the VAR and the AC contain some independent information not included in the U.S. model (although this additional information seems small).

7 There is a third model, Fair's U.S. model extended with eighty-five auxiliary autoregressive equations, one for each of the exogenous variables. I will not discuss this model because of space constraints.
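The AC benchmark is easy to prototype. The sketch below, on simulated series, follows the description in the text—fit an autoregression to each output component and sum the component forecasts—but the lag length, the components, and the variable names are illustrative assumptions, not Fair's specification.

```python
# Illustrative autoregressive components (AC) forecast: a separate AR(p) for
# each output component, with the component forecasts summed into GDP.
import numpy as np

def ar_forecast(x, p=4, horizon=1):
    """Fit an AR(p) by OLS and iterate it forward `horizon` steps."""
    X = np.column_stack([np.ones(len(x) - p)] +
                        [x[p - j - 1:len(x) - j - 1] for j in range(p)])
    coef = np.linalg.lstsq(X, x[p:], rcond=None)[0]
    hist = list(x[-p:])
    for _ in range(horizon):
        # coef[1:] is ordered lag 1 first; reverse the last p values to match.
        hist.append(coef[0] + coef[1:] @ np.array(hist[-1:-p - 1:-1]))
    return hist[-1]

# Stand-ins for consumption, investment, government spending, net exports.
rng = np.random.default_rng(2)
components = [np.cumsum(rng.normal(size=120)) for _ in range(4)]
gdp_forecast = sum(ar_forecast(c, p=4, horizon=1) for c in components)
print("AC one-quarter-ahead GDP forecast:", gdp_forecast)
```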
Chapter 14 motivates several additional exercises. First, we could include a Bayesian VAR in the set of comparison models (a minimal sketch of the idea follows this paragraph). A common problem of classical VARs is that they are copiously parameterized given the existing data, overfitting within sample and performing poorly outside sample. By introducing presample information, we can substantially improve the forecasting record of a VAR. A second possibility is to impose cointegrating relations and estimate a vector error correction model (VECM). The long-run restrictions keep the model's behavior tightly controlled and enhance its forecasting capabilities in the middle run. Third, we could estimate a VAR or a VECM with parameter drifting or stochastic volatility. Fourth, it would also be valuable to assess the performance of the rest-of-the-world model. Finally, the most interesting exercise, in my opinion, would be to compare the forecasting record of the MC model with that of a modern DSGE model. Given the increased prominence of estimated DSGE models in policy-making institutions, this evaluation will help us to gauge the empirical success of each approach.8

8 A first but incomplete step in this comparison is done in Fair (2007).
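For the first exercise, the gist of a Bayesian VAR can be conveyed in a few lines: shrink the VAR coefficients toward a parsimonious prior so that the heavily parameterized system stops overfitting. The sketch below uses a Minnesota-style prior centered on univariate random walks, treats the error variance as fixed, and omits the usual cross-variable scale adjustments; it is an illustration of the shrinkage idea, not a production BVAR.

```python
# Minnesota-flavored shrinkage for a VAR(p), simulated data. The posterior
# mean has the familiar ridge form because the prior is conjugate and the
# error variance is treated as known (an assumption made for brevity).
import numpy as np

def bvar_posterior_mean(Y, p=2, lam=0.2):
    T, n = Y.shape
    # Regressors: [1, Y_{t-1}, ..., Y_{t-p}]; targets: Y_t.
    X = np.column_stack([np.ones(T - p)] +
                        [Y[p - j - 1:T - j - 1] for j in range(p)])
    y, k = Y[p:], 1 + n * p
    # Prior mean: each variable follows a random walk (own first lag = 1).
    B0 = np.zeros((k, n))
    B0[1:n + 1] = np.eye(n)
    # Diagonal prior precision: loose on the intercept, tighter on longer lags.
    prec = [1e-4] + [(j + 1) ** 2 / lam ** 2 for j in range(p) for _ in range(n)]
    Omega_inv = np.diag(prec)
    # Ridge-style posterior mean: (X'X + Omega^-1)^-1 (X'y + Omega^-1 B0).
    return np.linalg.solve(X.T @ X + Omega_inv, X.T @ y + Omega_inv @ B0)

rng = np.random.default_rng(3)
Y = np.cumsum(rng.normal(size=(200, 3)), axis=0)    # three random-walk series
B = bvar_posterior_mean(Y)
print("first own-lag coefficients (shrunk toward 1):", np.diag(B[1:4]))
```

Tightening lam pulls the estimates further toward the random-walk prior; this shrinkage is the presample information referred to in the text.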

6 Concluding Remarks

Economists are inordinately fond of a Whiggish interpretation of our own history as a profession. Overfacile opinion in the humanities notwithstanding, the temptation of that teleological reading of our development is firmly anchored in the experience of decades of continuous increase in our horizons of understanding. Macroeconometrics is no exception to this progress. Despite some retreats in the 1970s and 1980s, we now have sounder models: better grounded in microeconomics, explicit about equilibrium dynamics, and flexible enough to incorporate a wide array of phenomena of interest. Better still, despite important remaining points of contention, the last few years have been marked by a growing consensus among macroeconomists about some fundamental outlines of the behavior of the economy.

Fair does not fully share such an optimistic outlook on the history and status of modern macroeconomics. He thinks that a return to some of the programmatic principles of the Cowles Commission approach to econometrics is worthwhile and perhaps imperative. Some economists will find much to agree with in this book, while others will be less convinced. But Fair's position does not make his work any less valuable for this second group of researchers. As I argued above, even those economists more interested in the estimation of DSGE models have much to learn from the Cowles Commission tradition. Fair's book is an excellent place to do so. After all, both supporters of more traditional models and fans of DSGE models agree about a fundamental point: that building and estimating dynamic models of the aggregate economy is possible and important for society's welfare. Coming back to the beginning of the review: high modernism still reigns undisputed in economics, and we are all much better off because of it.

References

Adelman, Irma, and Frank L. Adelman. 1959. “The Dynamic Properties of the Klein–Goldberger Model.” Econometrica, 27(4): 596–625.
Adolfson, Malin, Stefan Laséen, Jesper Lindé, and Mattias Villani. 2005. “Bayesian Estimation of an Open Economy DSGE Model with Incomplete Pass-Through.” Sveriges Riksbank Working Paper Series 179.
An, Sungbae, and Frank Schorfheide. 2007. “Bayesian Analysis of DSGE Models.” Econometric Reviews, 26(2–4): 113–72.
Andrés, Javier, Pablo Burriel, and Ángel Estrada. 2006. “BEMOD: A DSGE Model for the Spanish Economy and the Rest of the Euro Area.” Banco de España Working Paper 0631.
Andrews, Donald W. K., and Werner Ploberger. 1994. “Optimal Tests When a Nuisance Parameter Is Present Only under the Alternative.” Econometrica, 62(6): 1383–1414.
Aruoba, S. Borağan, Jesús Fernández-Villaverde, and Juan F. Rubio-Ramírez. 2006. “Comparing Solution Methods for Dynamic Equilibrium Economies.” Journal of Economic Dynamics and Control, 30(12): 2477–2508.
Blume, Lawrence, and David Easley. 2006. “If You’re So Smart, Why Aren’t You Rich? Belief Selection in Complete and Incomplete Markets.” Econometrica, 74(4): 929–66.
Bodkin, Ronald G., Lawrence R. Klein, and Kanta Marwah. 1991. A History of Macroeconometric Model-Building. Aldershot, U.K. and Brookfield, Vt.: Elgar.
Boivin, Jean, and Marc Giannoni. 2006. “DSGE Models in a Data-Rich Environment.” National Bureau of Economic Research Working Paper 12772.
Brayton, Flint, Andrew Levin, Ralph Lyon, and John C. Williams. 1997. “The Evolution of Macro Models at the Federal Reserve Board.” Carnegie–Rochester Conference Series on Public Policy, 47: 43–81.
Browning, Martin, Lars Peter Hansen, and James J. Heckman. 1999. “Micro Data and General Equilibrium Models.” In Handbook of Macroeconomics, Volume 1a, ed. John B. Taylor and Michael Woodford, 543–633. Amsterdam; New York and Oxford: Elsevier Science, North-Holland.
Canova, Fabio, and Luca Sala. 2006. “Back to Square One: Identification Issues in DSGE Models.” European Central Bank Working Paper 583.
Christiano, Lawrence J., and Martin Eichenbaum. 1990. “Unit Roots in Real GNP: Do We Know, and Do We Care?” Carnegie–Rochester Conference Series on Public Policy, 32: 7–61.
Christoffel, Kai, Günter Coenen, and Anders Warne. 2006. “The New Area-Wide Model of the Euro Area: Specification, Estimation Results and Properties.” Unpublished.
Coenen, Günter, Kai Christoffel, and Anders Warne. 2007. “Conditional versus Unconditional Forecasting with the New Area-Wide Model of the Euro Area.” Unpublished.
Edge, Rochelle M., Michael T. Kiley, and Jean-Philippe Laforte. 2007. “Documentation of the Research and Statistics Division’s Estimated DSGE Model of the U.S. Economy: 2006 Version.” Board of Governors of the Federal Reserve System Finance and Economics Discussion Series 2007-53.
Evans, George W., and Seppo Honkapohja. 2005. “An Interview with Thomas J. Sargent.” Macroeconomic Dynamics, 9(4): 561–83.
Fair, Ray C. 1974. A Model of Macroeconomic Activity, Volume I: The Theoretical Model. Cambridge, Mass.: Ballinger Publishing.
Fair, Ray C. 1976. A Model of Macroeconomic Activity, Volume II: The Empirical Model. Cambridge, Mass.: Ballinger Publishing.
Fair, Ray C. 1984. Specification, Estimation, and Analysis of Macroeconometric Models. Cambridge and London: Harvard University Press.
Fair, Ray C. 1994. Testing Macroeconometric Models. Cambridge and London: Harvard University Press.
Fair, Ray C. 2007. “Evaluating Inflation Targeting Using a Macroeconometric Model.” Economics: The Open-Access, Open-Assessment E-Journal, 1, 2007-8. http://www.economics-ejournal.org/economics/journalarticles/2007-8.
Fernández-Villaverde, Jesús, and Juan F. Rubio-Ramírez. 2004. “Comparing Dynamic Equilibrium Models to Data: A Bayesian Approach.” Journal of Econometrics, 123(1): 153–87.
Fernández-Villaverde, Jesús, and Juan F. Rubio-Ramírez. 2005. “Estimating Dynamic Equilibrium Economies: Linear versus Nonlinear Likelihood.” Journal of Applied Econometrics, 20(7): 891–910.
Fernández-Villaverde, Jesús, and Juan F. Rubio-Ramírez. 2007. “Estimating Macroeconomic Models: A Likelihood Approach.” Review of Economic Studies, 74(4): 1059–87.
Fernández-Villaverde, Jesús, and Juan F. Rubio-Ramírez. 2008. “How Structural Are Structural Parameter Values?” In NBER Macroeconomics Annual 2007, ed. Daron Acemoglu, Kenneth Rogoff, and Michael Woodford, 83–138. Chicago and London: University of Chicago Press.
Fernández-Villaverde, Jesús, Juan F. Rubio-Ramírez, and Manuel S. Santos. 2006. “Convergence Properties of the Likelihood of Computed Dynamic Models.” Econometrica, 74(1): 93–119.
Goldberger, Arthur S. 1959. Impact Multipliers and Dynamic Properties of the Klein–Goldberger Model. Amsterdam: North-Holland.
Granger, Clive W. J. 1981. “Some Properties of Time Series Data and Their Use in Econometric Model Specification.” Journal of Econometrics, 16(1): 121–30.
Greenwood, Jeremy, Zvi Hercowitz, and Per Krusell. 1997. “Long-Run Implications of Investment-Specific Technological Change.” American Economic Review, 87(3): 342–62.
Greenwood, Jeremy, Zvi Hercowitz, and Per Krusell. 2000. “The Role of Investment-Specific Technological Change in the Business Cycle.” European Economic Review, 44(1): 91–115.
Guerrón, Pablo. 2007. “What You Match Does Matter: The Effects of Data on DSGE Estimation.” North Carolina State University Department of Economics Working Paper Series 012.
Hall, Robert E. 1978. “Stochastic Implications of the Life Cycle–Permanent Income Hypothesis: Theory and Evidence.” Journal of Political Economy, 86(6): 971–87.
Hansen, Lars Peter. 1982. “Large Sample Properties of Generalized Method of Moments Estimators.” Econometrica, 50(4): 1029–54.
Harrison, Richard, Kalin Nikolov, Meghan Quinn, Gareth Ramsay, Alasdair Scott, and Ryland Thomas. 2005. The Bank of England Quarterly Model. London: Bank of England. http://www.bankofengland.co.uk/publications/other/beqm/beqmfull.pdf
Iskrev, Nikolay. 2007. “How Much Do We Learn from the Estimation of DSGE Models? A Case Study of Identification Issues in a New Keynesian Business Cycle Model.” Unpublished.
Judd, Kenneth L. 1992. “Projection Methods for Solving Aggregate Growth Models.” Journal of Economic Theory, 58(2): 410–52.
Judd, Kenneth L., and Sy-ming Guu. 1992. “Perturbation Solution Methods for Economic Growth Models.” In Economic and Financial Modelling with Mathematica, ed. Hal Varian, 80–103. New York: Springer-Verlag.
Kilponen, Juha, and Antti Ripatti. 2006. “Introduction to AINO.” Unpublished.
Klein, Lawrence R. 1950. Economic Fluctuations in the United States, 1921–1941. New York: John Wiley.
Klein, Lawrence R., and Arthur S. Goldberger. 1955. An Econometric Model of the United States, 1929–52. Amsterdam: North-Holland.
Kortelainen, Mika. 2002. “EDGE: A Model of the Euro Area with Applications to Monetary Policy.” Bank of Finland Studies E23.
Krusell, Per, and Anthony A. Smith. 1998. “Income and Wealth Heterogeneity in the Macroeconomy.” Journal of Political Economy, 106(5): 867–96.
Kydland, Finn E., and Edward C. Prescott. 1982. “Time to Build and Aggregate Fluctuations.” Econometrica, 50(6): 1345–70.
Leeper, Eric M., and Tao Zha. 2003. “Modest Policy Interventions.” Journal of Monetary Economics, 50(8): 1673–1700.
Liu, Ta-Chung. 1960. “Underidentification, Structural Estimation, and Forecasting.” Econometrica, 28(4): 855–65.
Lubik, Thomas A., and Paolo Surico. 2006. “The Lucas Critique and the Stability of Empirical Models.” Federal Reserve Bank of Richmond Working Paper 06-05.
Lucas, Robert E., Jr. 1976. “Econometric Policy Evaluation: A Critique.” Carnegie–Rochester Conference Series on Public Policy, 1: 19–46.
Marcet, Albert, and Thomas J. Sargent. 1989. “Convergence of Least Squares Learning Mechanisms in Self-Referential Linear Stochastic Models.” Journal of Economic Theory, 48(2): 337–68.
McConnell, Margaret M., and Gabriel Pérez-Quirós. 2000. “Output Fluctuations in the United States: What Has Changed since the Early 1980’s?” American Economic Review, 90(5): 1464–76.
Murchison, Stephen, and Andrew Rennison. 2006. “ToTEM: The Bank of Canada’s New Quarterly Projection Model.” Bank of Canada Technical Report 97.
Nelson, Charles R. 1972. “The Prediction Performance of the FRB–MIT–PENN Model of the U.S. Economy.” American Economic Review, 62(5): 902–17.
Nelson, Charles R., and Charles I. Plosser. 1982. “Trends and Random Walks in Macroeconomic Time Series: Some Evidence and Implications.” Journal of Monetary Economics, 10(2): 139–62.
Orphanides, Athanasios. 2001. “Monetary Policy Rules Based on Real-Time Data.” American Economic Review, 91(4): 964–85.
Phillips, Peter C. B. 1987. “Time Series Regression with a Unit Root.” Econometrica, 55(2): 277–301.
Primiceri, Giorgio E. 2006. “Why Inflation Rose and Fell: Policy-Makers’ Beliefs and U.S. Postwar Stabilization Policy.” Quarterly Journal of Economics, 121(3): 867–901.
Robinson, Peter M., ed. 2003. Time Series with Long Memory. Oxford and New York: Oxford University Press.
Sargent, Thomas J. 1989. “Two Models of Measurement and the Investment Accelerator.” Journal of Political Economy, 97(2): 251–87.
Sargent, Thomas J., Noah Williams, and Tao Zha. 2006. “Shocks and Government Beliefs: The Rise and Fall of American Inflation.” American Economic Review, 96(4): 1193–1224.
Schmitt-Grohé, Stephanie, and Martín Uribe. 2003. “Closing Small Open Economy Models.” Journal of International Economics, 61(1): 163–85.
Schmitt-Grohé, Stephanie, and Martín Uribe. 2006. “Optimal Fiscal and Monetary Policy in a Medium-Scale Macroeconomic Model.” In NBER Macroeconomics Annual 2005, ed. Mark Gertler and Kenneth Rogoff, 383–425. Cambridge and London: MIT Press.
Scott, James C. 1998. Seeing Like a State: How Certain Schemes to Improve the Human Condition Have Failed. New Haven and London: Yale University Press.
Sims, Christopher A. 1980. “Macroeconomics and Reality.” Econometrica, 48(1): 1–48.

Sims, Christopher A., and Tao Zha. 2006. “Were There Regime Switches in U.S. Monetary Policy?” American Economic Review, 96(1): 54–81.
Smets, Frank, and Raf Wouters. 2003. “An Estimated Dynamic Stochastic General Equilibrium Model of the Euro Area.” Journal of the European Economic Association, 1(5): 1123–75.
Stock, James H., and Mark W. Watson. 1999. “Forecasting Inflation.” Journal of Monetary Economics, 44(2): 293–335.
Stock, James H., and Mark W. Watson. 2002. “Macroeconomic Forecasting Using Diffusion Indexes.” Journal of Business and Economic Statistics, 20(2): 147–62.
Stock, James H., and Mark W. Watson. 2003. “Has the Business Cycle Changed and Why?” In NBER Macroeconomics Annual 2002, ed. Mark Gertler and Kenneth S. Rogoff, 159–218. Cambridge and London: MIT Press.
Tinbergen, Jan. 1937. An Econometric Approach to Business Cycle Problems. Paris: Herman et Compagnie.
Tinbergen, Jan. 1939. Statistical Testing of Business-Cycle Theories, Volume 2, Business Cycles in the United States of America, 1919–1932. Geneva: League of Nations.
Woodford, Michael. 2003. Interest and Prices: Foundations of a Theory of Monetary Policy. Princeton and Oxford: Princeton University Press.