<<

InterviewWith Lars PeterHansen

1. and Alastair Hall (Editors): How did you aspredictors of future spot rates. When the forward cometo be interested in estimation based on moment contractinterval exceeds the sampling interval, tem- conditions? poraldependence is induced.Given the serial correla- tionin thedisturbance term, many applied researchers LarsPeter Hansen (L.P .H.): As agraduatestu- atthe time thought the right thing to do was touse a dentat the , I hadthe opportu- standardgeneralized least squares (GLS)-type correc- nityto take classes from ChrisSims andT omSargent. tion.In fact, while least squares remains consistent, Bothemphasized the idea that dynamic econometric thelack of strictexogeneity of theregressors prevents modelsshould be viewed as restrictions on stochas- GLSfrom beingconsistent. This tripped up several ticprocesses. Sims’ s classroomdevelopment of large- researchersand was theimpetus for myoriginal least sampleeconometrics broke with the more conventional squarespaper. In addition to the Brown and Maital viewof thetime of focusingon modelswith exogenous paper,which with Sims’ s inuence proceeded cor- regressorsor instrumentalvariables (IVs). His workon rectly,Hodrick quickly informed authors of thepapers causalityhad emphasized the testable restrictions on submittedto the Journalof PoliticalEconomy that the stochasticprocesses of thenotion of strictexogeneity. GLSstylecorrection that they had used was invalid. Theidea of embedding estimation and testing prob- Myleast squares paper was rejectedat Economet- lemswith a stochasticprocess framework provedto be rica becauseit failed to be ambitious enough. This averyuseful starting point. Although this is common irritatedme and made me restructure the arguments ineconometrics today, it was notat the time I was a inmuch greater generality. I hadseen recent lecture graduatestudent. notesof Sims thatdescribed a familyof generalized Iwas exposedto nonlinear estimation and testing methodof moments (GMM) estimators,indexed by a problemsin Sims’ s graduatecourse. Sims loved to matrixthat selected which moment conditions to use experimentwith new material and proofs in class, and inestimation. Sims treatedtime series applications, Ilearnedmuch from Žllingin details and some holes butwith a focuson martingaledifference disturbances. inlecture notes. His nonconventionalteaching style Iadoptedthis formulation to present a moregeneral was valuablefor me. centrallimit approximation for estimatorswith Sims’ s WhileI was workingon my dissertation on encouragement.I learnedsubsequently from areferee exhaustibleresources, I becameinterested in central thatthe selection matrix idea had been used by Sargan limittheory for temporallydependent processes. This (1958,1959) in studies of linear and nonlinear instru- interestpredated the excellent book by Halland Heyde mentalvariables estimators. I was abitembarrassed (1980).Sims hadmade reference to work by Scott thatI hadnot cited the very nice Sargan (1958) paper (1973),but the central paper for mewas thatof Gordin inmy original submission. This paper provided the (1969).(Scott indeed cites Gordin, which is how I impetusfor myanalysis of the limiting distribution becamefamiliar with that work.) Gordin showed how ofsample moment conditions evaluated at parameter toconstruct a martingaleapproximation to a wide estimators. classof stationary ergodic processes. It thus extended Sinceprevious referees complainedabout my casual theapplicability of earliermartingale central limit the- treatmentof consistency, I addedsome speciŽ city by oryby Billingsley (1961) and others to processeswith usinga quadraticform constructionas in the statis- arichtemporal dependence structure. ticsliterature on minimumchi-squared estimation and 2. Editors: Withhindsight, it can be seen that your paper inAmemiya’ s (1974,1977) treatments of nonlinear (Hansen1982) has had considerable in uence on the two-and three-stage least squares. While developing practiceof . What was your perspective on consistencyresults, I becameinterested in the role of thework at thetime? Can you also tell us aboutthe initial compactnessin consistency arguments and explored reactionof others(including referees!) to yourpaper? alternativeways of justifying consistency based on tail characterizationsof the objective functions. L.P.H.: Ioriginallywrote a paperon least squares estimationwith temporally dependent disturbances. 3. Editors: Yourpapers with Ken Singleton in Economet- Themotivation for thispaper was simple.As an rica(Hansen and Singleton 1982) and Bob Hodrick in editor of Econometrica ,Simswas handlinga paper theJournal of P oliticalEconomy (Hansen and Hodrick 1980)were very in uential in demonstrating the poten- byBrown and Maital (1981) on assessing the efŽ - tialpower of GMM inapplications. Can you tell us how ciencyof multiperiod forecasts. The multiperiod thesecollaborations came about? natureinduced temporal dependence in the distur- banceterm. Sims urgedme to proceed quickly to get somethingwritten up as a paperthat could be cited © 2002 American StatisticalAssociation andused. Hodrick (my colleague at Carnegie Mellon Journal of Business &Economic Statistics atthe time) made me aware ofsimilar problems in October2002, Vol.20, No.4 theliterature on the study for forward exchangerates DOI10.1198/ 073500102288618577

442 InterviewWith Lars PeterHansen 443

L.P.H.: BobHodrick and I beganour discussion 5. Editors: Inthe mid-1980s, you worked on the problem ofeconometric issues when I describedto him the ofcharacterizing the optimal instrument in generalized pitfallsin applying GLS ineconometric equations IV estimation.What led you to work on thisproblem and thatcome from multiperiodforecasting problems. He whatis your perspective on this literature today? immediatelyshowed me working papers in the liter- L.P.H.: Iworkedon this problem because many atureon forward exchangerates in which GLS was applicationsof GMM estimationwere motivatedby appliedas aremedyfor serialcorrelation. At thesame conditionalmoment restrictions or had the following timehe was educatingpeople in international Ž nance, timeseries structure. Any time an IV existed,lags of heand I beganworking on our own analysis of for- theIVs were alsovalid instruments. As inmy other ward exchangerates by applying least squares and work,I was particularlyinterested in time series prob- adjustingthe standard errors. lems.Strict exogeneity of IVs alsoallows for leading BobHodrick was atCarnegie Mellon when I arrived valuesof IVs tobe valid instruments, but this orthog- there.Ken Singleton came to Carnegie Mellon after onalitywas notin my econometric formulation. Prior me,and after I hadwritten my GMM paper.Both of tomyGMMpaper,Robert Shiller (1972) had empha- usknew Hall’ s workon consumption(Hall 1978), and sizedthat in rational expectations models, omission of Kencame back from aconferenceafter hearing an conditioninginformation gave rise to orthogonal dis- underappreciatedpaper by Grossman and Shiller .A turbance.In a timeseries problem, this Shiller notion stripped-downversion of this paper was subsequently ofan error termoften had the feature that this error publishedin the AmericanEconomic Review (Gross- termwas orthogonalto arbitrary past values of vari- manand Shiller 1981). Singleton suggested that the ablesin a conditioninginformation set used by an consumptionEuler equation could be fruitfully posed econometrician.This same argument will not work for asa conditionalmoment restriction and our collabora- arbitraryleading values of the conditioning informa- tionbegan. After Iarrivedat Chicago, Jim Heckman tion,because economic agents will not have seen these showedme MaCurdy’ s microeconomiccounterpart to values. Hall’s paper(MaCurdy 1978). Ken’ s andmy second Myinitial work in this area (Hansen 1985) used (andempirically more interesting) paper published in martingaleapproximations to give lower bounds on the Journalof PoliticalEconomy (Hansenand Single- theefŽ ciency of a classof feasible GMM problems. ton1983) came about by our construction of a maxi- Informulating this problem, I hadto extend the selec- mumlikelihood counterpart to the GMM estimationin tionapproach of Sargan, Sims, and others to cases in our Econometrica paper.Given the more transparent whichthe number of underlying moment conditions natureof howpredictability had implications measur- isinŽ nite. This is a naturalframework withinwhich ingand assessing risk aversion, this project became tothink about estimation problems in which nonlin- moreambitious and took on a lifeof its own. earfunctions of IVs arevalid IVs andparticularly for 4. Editors: Whatis yourperspective on howGMM Žtsinto timeseries problems in which lagged values of IVs thewider literature on statistical estimation? remainvalid instruments. This gave rise to an explic- itlyinŽ nite-dimensional family of GMM estimators. L.P.H.: Ispentsome time on this question in Attainingthis efŽ ciency, especially in a timeseries myrecent contribution to the Encyclopediaof the problem,is moreproblematic. There is aneedasymp- Socialand Behavioral Sciences (Hansen2002). The toticallyto useinformation arbitrarily far intothe past. quadratic-formcriterion that I usedto formulate con- WhileI was workingin this area, I becameaware sistencycertainly has its origins in the minimum chi- ofclosely related work in engineering aiming to con- squaredestimators used to produce estimators that structonline estimators that are easy to compute (see, arestatistically efŽ cient and computationally tractable. e.g.,Soderstrum and Stoica 1983). Some of theircal- Aninteresting difference is that GMM estimatorsare culationswere similarto mine. Also, Gary Chamber- oftenused to studymodels that are only partially spec- lainwrote his very nice paper (Chamberlain 1987) on iŽed, whereas the earlier statistics literature provided semiparametricefŽ ciency bounds based on conditional computationallyattractive estimators for fullyspeci- momentrestrictions parameterized in terms of aŽnite- Žedmodels. Implicit in many GMM applicationsis dimensionalvector and posed in an i.i.d. context. asemiparametricnotion of estimation and efŽ ciency, After myinitial characterizations, Singleton and I incontrast to minimum chi-squared estimation. There tooka stabat implementation in linear models using isa morerecent related statistics literature on esti- KalmanŽ ltering(see Hansenand Singleton 1991; matingequations, with time series contributions by 1996).In many of ourexamples, we oftenfound only Godambeand Heyde (1987) and others. This litera- verymodest gains in even asymptotic efŽ ciency to turewas developedlargely independently of the cor- usingthe more fancy procedure over ad hoc choices respondingliterature in econometrics, but its focus ofvariables.While a co-editorat Econometrica , I han- hasbeen primarily on martingale estimating equations. dledone of Whitney Newey’ s papers(Newey 1990) Myaim was inpartto useGordin (1969)’ s martingale that,in an i.i.d. context, attained the efŽ ciency bound. approximationdevice to adopt a moregeneral starting Inthe sampling experiments that he studied, the efŽ - point. ciencygains were oftensmall, and sometimes the 444 Journal of Business& EconomicStatistics, October 2002

actualconstruction of anonparametricestimator of an 7. Editors: Inyour paper written with Heaton and Y aron efŽcient IV underminedthe efŽ ciency gains in Ž nite inJournal of Business & EconomicStatistics (Hansen, samples.Of course,characterization of the efŽ ciency Heaton,and Y aron1996), you proposed the continuous- boundwas neededto reach this conclusion. updatingGMM estimator.Canyou tell us something Itis interestingthat subsequently, W estand Wilcox aboutthe background of this paper? (1996)reached a ratherdifferent conclusion and pro- ducedinteresting examples in the study of economic L.P.H.: Initialimplementation of GMM estima- timeseries where the use of the efŽ ciency bounds torsin terms of quadraticform minimizationinvolved were ofpractical value, and they devised some esti- two-stepapproaches or iterated approaches. An initial mationmethods that appear to work quite well. consistentestimator was producedand used to esti- Therehas been a varietyof other interesting and matean efŽ cient weighting matrix designed so that relatedresearch on choice of moment conditions to theobjective has the minimum chi-squared property. usein actual estimation. Complementary work of Inthe Ž rst-orderasymptotic theory, weighting matrix Andrews (1999)and Hall and Inoue (2001) consid- estimationis placed in the background, because all eredthe selection of which moment conditions to thatis neededis aconsistentestimator .Stoppingafter usein parameter estimation. These papers addressed oneiteration or continuing are both options. theimportant practical problem of how to limit the Itwas interesting,though, that when Sargan sought numberof moment conditions used in estimation, by tocompareIVs estimatorsto limitedinformation max- excludingirrelevant and invalid ones using adaptive imumlikelihood (LIML) estimators,he produced a methods. depictionof the LIML estimator as an IV -typeesti- matorin which the variance for thestructural distur- 6. Editors: Methodof moments is commonly used now banceterm is estimated at the same time the under- inthe estimation of diffusions. Do you expect that this lyingparameters are estimated (Sargan 1958). He lineof research will continue, or do you think that the essentiallyconcentrated out this variance in terms varioussimulation-based maximum likelihood estimators ofthe underlying parameters, and produced what willbecome more widely used? canbe thought of as a continuous-updatedweight- L.P.H.: Myinterest in GMM estimationhas been ingmatrix estimator. Under conditional homoscedas- primarilyto achievepartial identiŽ cation. That is, sup- ticityand a martingaledifference structure for thedis- posethe aim of theeconometric exercise is to extract turbanceterm, the optimal weighting matrix is the apieceof saya fullyspeciŽ ed dynamic general equi- productof the disturbance term variance and the sec- libriummodel. Thus the semiparametric language that ondmoment of the IVs. Byusing the sample second- Chamberlainemphasized in his work is appropriate. momentmatrix of the IV andthe sample variance of Mywork with Scheinkman on estimation of nonlin- thedisturbance term expressed as a functionof the earcontinuous-time models (Hansen and Scheinkman unknownparameter vector, in a effecta continuous- 1995)and the subsequent work with Conley and updatingweighting matrix is produced. Luttmer(Conley, Hansen, Luttmer, and Scheinkman Averysimilar approach is one of theways that the 1997)is best viewed in this light. The idea is thatthe minimumchi-squared estimator was implementedfor modelis not designed to Ž teverything,but that there multinomialmodels. The counterpart to the weight- arecertain data features such as the long-run station- ingmatrix could be fully parameterized in terms of arydistribution that are the appropriate targets. Both themultinomial probabilities. In this case the weight- ofthesepapers also explore misspeciŽ cation that takes ingmatrix can be produced without reference to the theform ofanexogenoussubordination process in the data,but only a functionof the unknown parameters. spiritof Clark’ s fundamentalwork (Clark 1973). Thisis because the multinomial model is afullmodel Of course,partial identiŽ cation while it may be ofthe data, while Sargan considered only a partially morerealistic, is not a fullyambitious research goal. speciŽed model. For manypurposes, more is needed. But once a Inthe more general GMM contextwith a quadratic dynamicmodel is fully speciŽ ed in a parametric form–type objective, it turned out to be quite easy way,maximum likelihood methods and their Bayesian toimplement Sargan’ approach, except that the sim- counterpartbecome attractive. It may be that for pleseparation between the disturbance term and the numericalreasons, as in Pearson’ s originalwork and vectorof IVs canno longer be exploited. Instead, thesubsequent extensive literature on minimum chi- oneconstructs the long-run covariance matrices of the squaredestimation, method of moments methods are functionof thedata and parameter vector used to pro- attractivealternatives to likelihood-type estimators. It ducethe parameterized moment condition. hasbeen interesting, however, to watch the develop- Ibecameinterested in this estimator in part to mentof numerical Bayesian methods, methods that sidestepthe issue of whetherto iterate on the weight- makeprior sensitivity analyses feasible. The Bayesian ingmatrix or not and in part because the sensitivity problembased on integrating or averaging is numeri- ofboth the two-step and iterated estimator to what callysimpler than hill-climbing to Ž ndamaximumin seemedlike arbitrary normalizations. For instance,if manycircumstances. onetakes the original function of the data xt and a InterviewWith Lars PeterHansen 445

hypotheticalparameter vector ‚ andscales it by some Supposethat, strictly speaking, the model is misspec- arbitraryfunction of ‚,themoment conditions are pre- iŽed. How mightyou pick a goodmodel among the served.The two-step estimator will be sensitiveto this parameterizedfamily of models? How doesthe per- transformation,but the continuous-updatin gestimator formanceof a parameterizedfamily compare to other typicallywill not be. Moreover, in GMM estimation models?This leads to a ratherdifferent view .Under problemsthat Marty Eichenbaum and I encountered misspeciŽcation, statistical efŽ ciency is put to theside. (Eichenbaumand Hansen 1990), we foundthat in one Itbecomes a “higher-order”issue. W eexplorethe depictionof the moment conditions, there might be questionof weighting matrix selection in an asset- degeneratesolutions at the boundary of the parameter pricingenvironment and pose the problem as one in space.The continuous-updatin gestimatorwould not whichthe aim is to keep pricing errors small.The tar- reward theseparameter values, because the degener- getof the estimation problems of necessity comes to acywould cause the weighting matrix to diverge. In theforefront in this exercise. contrast,a two-stepestimator could easily end up at Consideran idealized Bayesian decision problem, theseoften-uninteresting parameter conŽ gurations. butsuppose that the likelihood function is misspec- Inour original paper on continuous-updated GMM iŽed. One cannot separate this problem into the estimation,Heaton, Y aronand I foundin our Monte followingtwo problems. Find the correct posterior Carloexperiments that we tendedto replicate what distributionfor theparameter vector and then solve the was knownin the simultaneous equations literature. decisionproblem by minimizing the loss function aver- Thecontinuous-updating estimator was closeto being agingover the posterior. In the presence of misspeciŽ- medianunbiased, but it had fat tails. The weighting cation,we cannotseparate the statistician’ s problemof matrixadjustment often leads to problems in which Žndingthe posterior from thatof the decision maker theobjective function itself can be  atin the tails. whowishes to use the parameter vector in practice. Wefoundthat inference methods based on studying Considernow the Ž ctionof an inŽ nite sample. In theshape and degradation of the objective function aGMMcontext,the weighting matrix that you refer were morereliable than computing quadratic approx- todictates how important each of the pricing error imations.One formal reason for whythis can be true equationsis in pinning down the parameter estimate. isprovided in the paper by Stock and Wright (2000), However,under correct speciŽ cation, this choice does whichstudies moment conditions with weak informa- notalter the parameter vector that, say, minimizes the tion. populationquadratic criterion. Under misspeciŽ cation, Ihavefound the recent work by Newey and thechoice of weightingmatrix changes parameter val- Smith(2000) and Bonnal and Renault (2000) to be uesas you change the weighting matrix, but this also veryinteresting. These papers show how to nest the changesthe performance criterion. In an asset-pricing continuous-updatedestimator into a classof estima- context,different weighting matrices alter the impor- torsthat includes empirical likelihood. Newey and tanceof pricingerrors comingfrom alternativesecuri- Smith(2000), in particular, use second-order asymp- ties.Parameter choices that lead to moment conditions totictheory to study this class of estimators. They becauseof a largecovariance matrix in a centrallimit characterizethe advantages in using empirical likeli- approximationare rewarded for reasonsthat may not be hoodto produce parameter estimates when the data veryappealing. In its most simple form, Raviand I jus- arei.i.d. My own interest in the continuous-updatin g tiŽed an estimationexercise based on minimizingpric- GMMestimator is not so much as a methodfor pro- ingerrors thatled to aŽxedchoice of weightingmatrix ducingpoint estimates, but more as a methodof mak- independentof the parametervalue being considered. ingapproximate inference. SincemisspeciŽ cation can destroy the simple sepa- rationbetween estimation and decision, the aim of the 8. Editors: Alotof work has focused on estimation of parameterchoice comes to the forefront. Ravi and I theoptimal weighting matrix, which lead to the so- canbe criticized for ourad hoc choice of loss func- calledHAC (heteroskedastic and autocorrelation consis- tion,but it is also sometimes unappealing to study tent)estimators. In joint work with Ravi Jagannathan, momentrelations in which the weighting matrix can youalso advocate the use of weighting matrices that changewith the underlying parameter vector .Model aresuboptimal from a statisticalpoint of view, but have misspeciŽcation is an example. desirableproperties in Ž nancialapplications. What are 9. Editors: GMMhasproven particularly valuable for the yourthoughts on the issue of weighting matrices? estimationof rational expectation models, because it L.P.H.: Theoptimal weighting matrix that you facilitatesestimation based on Euler equations without refer tois obtained by asking a statisticalquestion. theneed to impose strong explicit distribution assump- Giventhat a Žniteset of moment conditions are satis- tions.In recent years, you have turned your attention Žed,what is the most efŽ cient linear combination to tojoint research with T omSargent to robust decision usein estimating a parametervector of interest. Ravi problems.In this framework, agents are assumed to take andI (Hansenand Jagannathan 1997) and also John modelmisspeciŽ cation into account while making deci- Heaton,Erzo Luttmer, and I (Hansen,Heaton, and sions.This comes at a costof very explicit assump- Luttmer1995) were interestedin a differentquestion. tionsabout the probability laws governing the economic 446 Journal of Business& EconomicStatistics, October 2002

environmentand takes us away from the “ distribution- theory,but this is far from amatureliterature. Our free”approach of GMM-based Euler equation estima- workto datehas been very controversial, and some of tion.What are your thoughts on this development? thiscontroversy is well justiŽ ed. W earestill sorting outstrengths and weakness of alternative approaches L.P.H.: GMMapproachesbased on Euler equa- inour own minds. tionswere designedto deliver only part of an eco- Our robustnessapproach is related more closely to nomicmodel. Their virtue and liability is that they Sargent’s andmy work (Hansen and Sargent 1993) arebased on partial speciŽ cation of an econometric andrelated work by Sims (1993) on estimation of model.They allow an applied researcher to study a fullyspeciŽ ed dynamic rational expectations models pieceof a fulldynamic model, without getting hung usinga misspeciŽed likelihood function than it is to upon the details of the remainder of the model. GMMestimationof partially speciŽ ed models. Since For manypurposes, including policy analysis and the we arecompelled to work with well-posed decision conductof comparative dynamic exercises, the entire problems,we needthe entire model. modelhas to be Ž lledout. W ecouldrequire this full speciŽcation of all econometric exercises in applied 10.Editors: Over thelast 20 years, there has been much timeseries, but I thinkthis would be counterproduc- progressin developinga frameworkfor inference based tive.On the other hand, to solve decision problems onthe GMM estimator.Whatdo youperceive to be the bypolicy makers or to conduct comparative dynamic strengthsand weaknesses of theframework as itstands exercisesa fulldecision problem, and a complete today? dynamicevolution must be speciŽ ed. A GMMesti- mationof aportionof themodel can be aninput into L.P.H.: GMMestimationis often best suited this,but more is required. for modelsthat are partially speciŽ ed. This is both Rationalexpectations compels a researcherto think astrengthand a weakness.Partial speciŽ cation is aboutmodel speciŽ cation both from thestandpoint of convenientfor avarietyof reasons. It allows an theapplied researcher and from thestandpoint of indi- econometricianto learn about something without vidualdecision makers. W ecouldpresume that when needingto learn about everything.Anappeal to par- we Žllout the entire model including the dynamic tialspeciŽ cation, however, limits the questions that evolutionof state variables, this is done correctly. canbe answered by an empirical investigation. For Rationalexpectations becomes an equilibrium con- instance,the analysis of hypothetical interventions structthat compels the researcher to check for correct orpolicy changes typically requires a fullyspeci- speciŽcation from allangles, econometrician and eco- Žeddynamic, stochastic general equilibrium model. nomicagents. It is a powerfuland tremendously pro- Appliedresearchers in and Ž nance ductiveequilibrium concept. usehighly stylized, and hence misspeciŽ ed, ver- Thequestion is what do you do if you suspect sionsof such models. Such models are often stud- misspeciŽcation. Interestingly, if you look at applied iedthrough numerical characterization for alternative researchin economic dynamics that imposes rational parameterconŽ gurations. It remains an important expectations,one of the reasons people give for not challengeto econometricsto give empirical credibil- gettingdistracted by econometrics is that the model ityto theestimation and testing of suchmodels. isobviously misspeciŽ ed. Once you go down this road,however, you naturally ask from whoseper- spective.The consistency requirement of the equi- REFERENCES libriumconcept forces this misspeciŽ cation on the Amemiya, T.(1974), “ TheNonlinear Two-Stage Least SquaresEstimator,” economicagents, and the description “ rationalexpec- Journalof Econometrics, 2, 105–110. tations”becomes a bitof a misnomer.Thereis noobvi- (1977),“ TheMaximum Likelihood and Nonlinear Three-Stage Least SquaresEstimator in theGeneral NonlinearSimultaneous Equations ouslysimple Ž xtothis problem. W ecouldremove Model,” Econometrica, 45,955– 968. misspeciŽcation from thetable as rational expectations Andrews,D. W.K.(1999),“ ConsistentMoment Selection Procedures for doesin such a cleverway, but this seems incompati- Generalized Methodof MomentsEstimation,” Econometrica, 67,543– 564. blewithmuch applied research and arguably individual Billingsley,P .(1961),“ TheLindeberg– Levy Theorem for Martingales,” Proceedingsof theAmerican Mathematical Society, 12,250– 268. decisionmaking. Bonnal,H., and Renault, E. (2000),“ AComparisonof Alternative Moment Thework you describe (with a varietyof co- MatchingEstimators,” CREST -INSEE. authors)is one stab at confronting model misspeciŽ - Brown,B. W.,andMaital, S. (1981),“ WhatDo Economists Know? An EmpiricalStudy of Experts’Expectations,” Econometrica, 49,491– 504. cation.Since we arecompelled to put a richclass of Chamberlain,G. (1987),“ AsymptoticEfŽ ciency in Estimation With Condi- modelsin play, typically in the form ofan approxi- tionalMoment Restrictions,” Journalof Econometrics, 34,305– 334. matingor benchmark model and a familyof pertur- Clark,P .K.(1973),“ ASubordinatedStochastic Process ModelWith Finite Variance forSpeculative Prices,” Econometrica, 41,135– 155. bations,there are hard questions about which models Conley,T .G.,Hansen,L. P.,Luttmer,E. G.J.,and Scheinkman, J. A.(1997), shouldbe putinto contention and how they should be “Short-TermInterest Rates as SubordinatedDiffusions,” Review ofFinan- confrontedby decisionmakers. W ehavebeen working cialStudies, 10,525– 577. Eichenbaum,M., and Hansen, L. P.(1990),“ EstimatingModels With onsometractable ways to approachmodel misspeciŽ - IntertemporalSubstitution Using Aggregate Time Series Data,” Journal of cation,building from arichliterature in robust control Businessand Economic Statistics, 8, 53–69. InterviewWith Lars PeterHansen 447

Godambe,V .P.,andHeyde, C. C.(1987),“ Quasi-Likelihoodand Optimal Hansen,L. P.,andSingleton, K. J.(1982),“ Generalized Instrumental Estimation,” InternationalStatistical Review , 55,231– 244. Variables ofNonlinear Rational Expectations Models,” Econometrica, 50, Gordin,M. I.(1969), “ TheCentral Limit Theorem for Stationary Processes,” 1269–1286. SovietMath. Dokl., 10,1174– 1176. (1983),“ StochasticConsumption, Risk A version,and the Temporal Grossman,S. J.,and Shiller, R. J.(1981),“ TheDeterminants oftheV ariability Behaviorof Asset Returns,” Journalof PoliticalEconomy, 91,249– 265. ofStock Prices,” AmericanEconomic Review Papersand Proceedings, 71, (1991),“ ComputingSemiparametric EfŽciency Bounds for Linear 222–227. Time Series Models,”in Proceedingsof theFifth International Symposium Hall,A. R.,andInoue, A. (2001),“ ACanonicalCorrelations Interpretation of inEconomic Theory and Econometrics ,eds.W .A.Barnett,J. Powell,and Generalized Methodof Moments Estimation With Applications to Moment G.E.Tauchen,Cambridge, U.K.: Cambridge University Press, chap.15, Selection,”North Carolina State University,unpublished manuscript. pp.388– 411. Hall,P .,andHeyde, C. C.(1980), MartingaleLimit Theory and Its Applica- (1996),“ EfŽcient Estimation of Linear Asset PricingModels With tion, Boston:Academic Press. Moving-AverageErrors,” Journalof Business and Economic Statistics, 14, Hall,R. E.(1978),“ StochasticImplications of the Life Cycle Permanent 53–68. Income Hypothesis:Theory and Evidence,” Journalof Political , MaCurdy,T. E. (1978),“ TwoEssays onthe Life Cycle,”unpublished doctoral 86,339– 357. dissertation,,Department ofEconomics. Hansen,L. P.(1982),“ Large Sample Propertiesof Generalized Methodof Newey, W.K.(1990),“ EfŽcient Instrumental V ariables Estimationof Non- MomentsEstimators,” Econometrica, 50,1029– 1054. linearModels,” Econometrica, 58,809– 837. (1985),“ AMethodfor Calculating Bounds on Asymptotic Covari- Newey, W.K., and Smith, R. J.(2000),“ AsymptoticBias andEquivalence of ance Matrices ofGeneralized Methodof MomentsEstimators,” Journal of GMMand GEL,” unpublished manuscript. Econometrics, 30,203– 238. Sargan,J. D.(1958),“ TheEstimation of Economic Relationships Using (2002),“ Methodof Moments,” in InternationalEncylopedia of the Socialand Behavior Sciences, Oxford,U.K.: Pergamon. InstrumentalV ariables,” Econometrica, 26,393– 415. Hansen,L. P.,Heaton,J., andLuttmer, E. G.J.(1995),“ Econometric (1959),“ TheEstimation of RelationshipsWith Autocorrelated Resid- Evaluationof Asset PricingModels,” Review ofFinancial Studies, 8, ualsby theUse ofInstrumentalV ariables,” Journalof theRoyal Statistical 237–274. Society, 21, 91–105. Hansen,L. P.,Heaton,J. C.,and Y aron,A. (1996),“ FiniteSample Properties Scott,D. J. (1973),“ CentralLimit Theorems forMartingales and for Pro- ofSome Alternative GMM Estimators,” Journalof Business and Economic cesses WithStationary Increments Usinga SkorokhodRepresentation Statistics, 14,262– 280. Approach,” Advancesin Applied Probability, 5, 119–137. Hansen,L. P.,andHodrick, R. J.(1980),“ ForwardExchange Rates as Opti- Shiller,R. (1972),“ RationalExpectations and the Structure of Interest Rates,” mal Predictorsof Future Spot Rates,” Journalof P oliticalEconomy , 88, unpublisheddoctoral dissertation, Massachusetts Instituteof Technology. 829–853. Sims,C. A.(1993),“ RationalExpectations With Seasonally Adjusted Data,” Hansen,L. P.,andJagannathan, R. (1997),“ Assessing SpeciŽcation Errors in Journalof Econometrics, 55, 9–19. StochasticDiscount Factor Models,” Journalof , 52,557– 590. Soderstrum,T., and Stoica, P .G.(1983), InstrumentalV ariableMethods for Hansen,L. P.,andSargent, T .J.(1993),“ Seasonalityand Approximation System IdentiŽcation, Berlin:Springer-V erlag. Errorsin Rational Expectations Models,” Journalof Econometrics, 55, Stock,J. H.,and Wright, J. H.(2000),“ GMMWithWeak IdentiŽcation,” 21–55. Econometrica, 68,1055– 1096. Hansen,L. P.,andScheinkman, J. A.(1995),“ Back tothe Future: Gen- West,K. D.,and Wilcox, D. W.(1996),“ AComparisonof Alternative Ins- eratingMoment Implications for Continuous-Time Markov Processes,” tumentalV ariables Estimatorsof a Dynamic LinearModel,” Journal of Econometrica, 63,767– 804. Businessand Economic Statistics, 14,281– 293.