Policy Analysis with Econometric Models
Recommended publications
-
A Monetary History of the United States, 1867-1960
Journal of Monetary Economics 34 (1994) 5-16. Review of Milton Friedman and Anna J. Schwartz's 'A Monetary History of the United States, 1867-1960'. Robert E. Lucas, Jr., Department of Economics, University of Chicago, Chicago, IL 60637, USA (Received October 1993; final version received December 1993). Key words: Monetary history; Monetary policy. JEL classification: E5; B22. A contribution to monetary economics reviewed again after 30 years - quite an occasion! Keynes's General Theory has certainly had reappraisals on many anniversaries, and perhaps Patinkin's Money, Interest and Prices. I cannot think of any others. Milton Friedman and Anna Schwartz's A Monetary History of the United States has become a classic. People are even beginning to quote from it out of context in support of views entirely different from any advanced in the book, echoing the compliment - if that is what it is - so often paid to Keynes. Why do people still read and cite A Monetary History? One reason, certainly, is its beautiful time series on the money supply and its components, extended back to 1867, painstakingly documented and conveniently presented. Such a gift to the profession merits a long life, perhaps even immortality. But I think it is clear that A Monetary History is much more than a collection of useful time series. The book played an important - perhaps even decisive - role in the 1960s' debates over stabilization policy between Keynesians and monetarists. It organized nearly a century of U.S. macroeconomic evidence in a way that has had great influence on subsequent statistical and theoretical research. -
Sudhir Anand and Amartya Sen "Tell Me What You Eat,"
CONSUMPTION AND HUMAN DEVELOPMENT: CONCEPTS AND ISSUES. Sudhir Anand and Amartya Sen. Background paper, HDR 1998, UNDP. 1. Introduction. "Tell me what you eat," remarked Anthelme Brillat-Savarin nearly two hundred years ago, "and I will tell you what you are." The idea that people can be known from their consumption behaviour has some plausibility. Eating enough and well nourishes us; over-eating renders us obese and unfit; education can make us wiser (or at least turn us into learned fools); reading poetry can make us sensitive; keeping up with the Joneses can overstretch our resources; and an obsession with fast cars may make us both "quick and dead." There are few things more central than consumption to the lives that people variously lead. But consumption is not the ultimate end of our lives. We seek consumption for a purpose, or for various purposes that may be simultaneously entertained. The role of consumption in human lives cannot be really comprehended without some understanding of the ends that are pursued through consumption activities. Our ends are enormously diverse, varying from nourishment to amusement, from living long to living well, from isolated self-fulfilment to interactive socialization. The priorities of human development, with which the Human Development Reports are concerned, relate to some basic human ends, and there is scope for scrutinizing how the prevailing consumption activities serve those ends. [Footnote 1: For a general introduction to the contemporary literature on consumption, see Angus Deaton and John Muellbauer, Economics and Consumer Behavior (Cambridge: Cambridge University Press, 1980).] -
Implications of Rational Inattention
IMPLICATIONS OF RATIONAL INATTENTION. CHRISTOPHER A. SIMS. Abstract. A constraint that actions can depend on observations only through a communication channel with finite Shannon capacity is shown to be able to play a role very similar to that of a signal extraction problem or an adjustment cost in standard control problems. The resulting theory looks enough like familiar dynamic rational expectations theories to suggest that it might be useful and practical, while the implications for policy are different enough to be interesting. I. Introduction. Keynes's seminal idea was to trace out the equilibrium implications of the hypothesis that markets did not function the way a seamless model of continuously optimizing agents, interacting in continuously clearing markets, would suggest. His formal device, price "stickiness", is still controversial, but those critics of it who fault it for being inconsistent with the assumption of continuously optimizing agents interacting in continuously clearing markets miss the point. This is its appeal, not its weakness. The influential competitors to Keynes's idea are those that provide us with some other description of the nature of the deviations from the seamless model that might account for important aspects of macroeconomic fluctuations. Lucas's 1973 classic "International Evidence..." paper uses the idea that agents may face a signal-extraction problem in distinguishing movements in the aggregate level of prices and wages from movements in the specific prices they encounter in transactions. Much of subsequent rational expectations macroeconomic modeling has relied on the more tractable device of assuming an "information delay", so that some kinds of aggregate data are observable to some agents only with a delay, though without error after the delay. -
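Sims's constraint is stated in terms of Shannon channel capacity. As a minimal illustration of that quantity (a textbook binary symmetric channel, not anything from Sims's model; the function names here are my own), the capacity bound on how much an agent can learn per observation can be computed directly:

```python
import math

def entropy_bits(probs):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def bsc_capacity(eps):
    """Capacity in bits per use of a binary symmetric channel that
    flips its input with probability eps: C = 1 - H(eps)."""
    return 1.0 - entropy_bits([eps, 1.0 - eps])

# An agent whose observations pass through such a channel can extract
# at most C bits per period about the underlying state; at eps = 0.5
# the observation carries no information at all.
for eps in (0.0, 0.1, 0.5):
    print(f"flip prob {eps:.1f}: capacity {bsc_capacity(eps):.3f} bits")
```

A rational-inattention constraint caps the mutual information between state and action in exactly these units, whatever the channel's physical form.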
Economic Thinking in an Age of Shared Prosperity
BOOK REVIEW: Economic Thinking in an Age of Shared Prosperity. GRAND PURSUIT: THE STORY OF ECONOMIC GENIUS, BY SYLVIA NASAR. NEW YORK: SIMON & SCHUSTER, 2011, 558 PAGES. REVIEWED BY THOMAS M. HUMPHREY. A distinctive feature of the modern capitalist economy is its capacity to deliver sustainable, ever-rising living standards to all social classes, not just to a fortunate few. How does it do it, especially in the face of occasional panics, bubbles, booms, busts, inflations, deflations, wars, and other shocks that threaten to derail shared rising prosperity? What are the mechanisms involved? Can they be improved by policy intervention? Has the process any limits? The history of economic thought is replete with attempts to answer these questions. First came the pessimists Thomas Malthus, David Ricardo, James Mill, and his son John Stuart Mill who, on grounds that for millennia wages had flatlined at near-starvation levels, denied that universally shared progress was possible. [...] workingman's living standards signaled the demise of the iron law and forced economists to recognize and explain the phenomenon. Britain's Alfred Marshall, popularizer of the microeconomic demand and supply curves still used today, was among the first to do so. He argued that competition among firms, together with their need to match their rivals' cost cuts to survive, incessantly drives them to improve productivity and to bid for now more productive and efficient workers. Such bidding raises real wages, allowing labor to share with management and capital in the productivity gains. Marshall interpreted productivity gains as the accumulation over time of relentless and continuous innumerable small improvements to final products and production processes. Joseph Schumpeter, who never saw an economy that couldn't be energized through unregulated credit-financed entrepreneurship, saw productivity gains as emanating from radical, dramatic, transformative, discontinuous innovations that precipitate business cycles and destroy old technologies, firms, and markets even as they [...] -
The Physics of Optimal Decision Making: a Formal Analysis of Models of Performance in Two-Alternative Forced-Choice Tasks
Psychological Review, 2006, Vol. 113, No. 4, 700-765. Copyright 2006 by the American Psychological Association. 0033-295X/06/$12.00 DOI: 10.1037/0033-295X.113.4.700. The Physics of Optimal Decision Making: A Formal Analysis of Models of Performance in Two-Alternative Forced-Choice Tasks. Rafal Bogacz, Eric Brown, Jeff Moehlis, Philip Holmes, and Jonathan D. Cohen, Princeton University. In this article, the authors consider optimal decision making in two-alternative forced-choice (TAFC) tasks. They begin by analyzing 6 models of TAFC decision making and show that all but one can be reduced to the drift diffusion model, implementing the statistically optimal algorithm (most accurate for a given speed or fastest for a given accuracy). They prove further that there is always an optimal trade-off between speed and accuracy that maximizes various reward functions, including reward rate (percentage of correct responses per unit time), as well as several other objective functions, including ones weighted for accuracy. They use these findings to address empirical data and make novel predictions about performance under optimality. Keywords: drift diffusion model, reward rate, optimal performance, speed-accuracy trade-off, perceptual choice. This article concerns optimal strategies for decision making in the two-alternative forced-choice (TAFC) task. We present and compare several decision-making models, briefly discuss their neural implementations, and relate them to one that is optimal in [...] It has been known since Herrnstein's (1961, 1997) work that animals do not achieve optimality under all conditions, and in behavioral economics, humans often fail to choose optimally (e.g., Kahneman & Tversky, 1984; Loewenstein & Thaler, 1989). -
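The drift diffusion model that the abstract reduces the other models to can be simulated in a few lines. A hypothetical sketch (parameter values and function names are my own, not from the paper): evidence accumulates with a constant drift plus Gaussian noise until it hits one of two symmetric bounds, and widening the bounds buys accuracy at the cost of time — the speed-accuracy trade-off the authors analyze:

```python
import random

def simulate_ddm(drift, threshold, noise=1.0, dt=0.005, rng=None):
    """One trial of the drift diffusion model: integrate noisy evidence
    x until it crosses +threshold ("correct") or -threshold ("error").
    Returns (correct, decision_time)."""
    rng = rng or random.Random(0)
    x, t = 0.0, 0.0
    while abs(x) < threshold:
        x += drift * dt + noise * (dt ** 0.5) * rng.gauss(0.0, 1.0)
        t += dt
    return x >= threshold, t

def accuracy_and_mean_rt(drift, threshold, trials=1500, seed=42):
    """Monte Carlo accuracy and mean decision time across trials."""
    rng = random.Random(seed)
    results = [simulate_ddm(drift, threshold, rng=rng) for _ in range(trials)]
    accuracy = sum(correct for correct, _ in results) / trials
    mean_rt = sum(t for _, t in results) / trials
    return accuracy, mean_rt

# Raising the decision threshold trades speed for accuracy.
for z in (0.5, 1.0):
    acc, rt = accuracy_and_mean_rt(drift=1.0, threshold=z)
    print(f"threshold {z}: accuracy {acc:.2f}, mean decision time {rt:.2f}")
```

Optimizing reward rate then amounts to picking the threshold whose accuracy/time pair maximizes correct responses per unit time.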
Statistical Decision Theory Bayesian and Quasi-Bayesian Estimators
Statistical Decision Theory: Bayesian and Quasi-Bayesian Estimators. Giselle Montamat, Harvard University, Spring 2020.

Statistical decision theory: a framework to make a decision based on data (e.g., find the "best" estimator under some criteria for what "best" means; decide whether to retain/reward a teacher based on observed teacher value-added estimates), together with criteria to decide what a good decision (e.g., a good estimator; whether to retain/reward a teacher) is. Ingredients:
Data: X
Statistical decision: a
Decision function: δ(X)
State of the world: θ
Loss function: L(a, θ)
Statistical model (likelihood): f(X|θ)
Risk function (aka expected loss):

R(δ, θ) = E_{f(X|θ)}[L(δ(X), θ)] = ∫ L(δ(X), θ) f(X|θ) dX

Objective: estimate μ(θ) (could be μ(θ) = θ) using data X via δ(X). (Note: here the decision is to choose an estimator; we'll see another example where the decision is a binary choice.) Loss function L(a, θ): describes the loss that we incur if we take action a when the true parameter value is θ. Note that estimation (the "decision") will be based on data via δ(X) = a, so loss is a function of the data and the true parameter, i.e., L(δ(X), θ). Criterion for what makes a "good" δ(X), for a given θ: the expected loss (aka the risk) has to be small, where the expectation is taken over X given model f(X|θ) for a given θ. -
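The risk function above can be approximated by simulation. A minimal sketch under assumed ingredients (squared-error loss, X₁,…,Xₙ i.i.d. N(θ, 1), and two candidate decision rules; none of these specifics come from the slides):

```python
import random

def risk(estimator, theta, n=20, reps=4000, seed=0):
    """Monte Carlo approximation of R(delta, theta) = E[L(delta(X), theta)]
    under squared-error loss, with X_1..X_n i.i.d. N(theta, 1)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(reps):
        x = [rng.gauss(theta, 1.0) for _ in range(n)]
        total += (estimator(x) - theta) ** 2
    return total / reps

sample_mean = lambda x: sum(x) / len(x)
shrink_half = lambda x: 0.5 * sum(x) / len(x)   # shrinks toward zero

# The sample mean has risk ~ 1/n at every theta; the shrinkage rule
# does better near theta = 0 and much worse far from it, illustrating
# that risk is a function of the unknown theta.
for theta in (0.0, 2.0):
    print(f"theta={theta}: mean {risk(sample_mean, theta):.3f}, "
          f"shrunk {risk(shrink_half, theta):.3f}")
```

No rule dominates at every θ, which is exactly why the slides go on to Bayesian (average-risk) criteria for ranking decision rules.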
Prisoners of Reason Game Theory and Neoliberal Political Economy
Prisoners of Reason: Game Theory and Neoliberal Political Economy. S. M. AMADAE, Massachusetts Institute of Technology. 32 Avenue of the Americas, New York, NY 10013-2473, USA. Cambridge University Press is part of the University of Cambridge. It furthers the University's mission by disseminating knowledge in the pursuit of education, learning, and research at the highest international levels of excellence. www.cambridge.org Information on this title: www.cambridge.org/9781107671195 © S. M. Amadae 2015. This publication is in copyright. Subject to statutory exception and to the provisions of relevant collective licensing agreements, no reproduction of any part may take place without the written permission of Cambridge University Press. First published 2015. Printed in the United States of America. A catalog record for this publication is available from the British Library. Library of Congress Cataloging in Publication Data: Amadae, S. M., author. Prisoners of reason : game theory and neoliberal political economy / S.M. Amadae. pages cm. Includes bibliographical references and index. ISBN 978-1-107-06403-4 (hbk. : alk. paper) – ISBN 978-1-107-67119-5 (pbk. : alk. paper) 1. Game theory – Political aspects. 2. International relations. 3. Neoliberalism. 4. Social choice – Political aspects. 5. Political science – Philosophy. I. Title. HB144.A43 2015 320.01′5193 – dc23 2015020954. ISBN 978-1-107-06403-4 Hardback. ISBN 978-1-107-67119-5 Paperback. Cambridge University Press has no responsibility for the persistence or accuracy of URLs for external or third-party Internet Web sites referred to in this publication and does not guarantee that any content on such Web sites is, or will remain, accurate or appropriate. -
Statistical Decision Theory: Concepts, Methods and Applications
Statistical Decision Theory: Concepts, Methods and Applications (Special topics in Probabilistic Graphical Models). FIRST COMPLETE DRAFT, November 30, 2003. Supervisor: Professor J. Rosenthal. STA4000Y. Anjali Mazumder. Part I: DECISION THEORY - Concepts and Methods. Decision theory, as the name would imply, is concerned with the process of making decisions. The extension to statistical decision theory includes decision making in the presence of statistical knowledge, which provides some information where there is uncertainty. The elements of decision theory are quite logical and perhaps even intuitive. The classical approach to decision theory facilitates the use of sample information in making inferences about the unknown quantities. Other relevant information includes that of the possible consequences, which is quantified by loss, and the prior information which arises from statistical investigation. The use of Bayesian analysis in statistical decision theory is natural. Their unification provides a foundational framework for building and solving decision problems. The basic ideas of decision theory and of decision theoretic methods lend themselves to a variety of applications and computational and analytic advances. This initial part of the report introduces the basic elements in (statistical) decision theory and reviews some of the basic concepts of both frequentist statistics and Bayesian analysis. This provides a foundational framework for developing the structure of decision problems. The second section presents the main concepts and key methods involved in decision theory. The last section of Part I extends this to statistical decision theory - that is, decision problems with some statistical knowledge about the unknown quantities. This provides a comprehensive overview of the decision theoretic framework. -
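As a concrete instance of the Bayesian unification described above: under squared-error loss, the Bayes decision rule is the posterior mean. A minimal sketch using the standard conjugate normal-normal model (the function name and the illustrative numbers are mine, not from the report):

```python
def posterior_normal(prior_mean, prior_var, x_bar, n, data_var=1.0):
    """Conjugate normal-normal update: posterior for theta given
    X_1..X_n i.i.d. N(theta, data_var) and prior theta ~ N(prior_mean,
    prior_var). Precisions (inverse variances) add; the posterior mean
    is a precision-weighted average of prior mean and sample mean."""
    precision = 1.0 / prior_var + n / data_var
    post_var = 1.0 / precision
    post_mean = post_var * (prior_mean / prior_var + n * x_bar / data_var)
    return post_mean, post_var

# Under squared-error loss, the Bayes rule delta(X) is this posterior
# mean -- it minimizes posterior expected loss by construction.
post_mean, post_var = posterior_normal(prior_mean=0.0, prior_var=1.0,
                                       x_bar=1.2, n=10)
print(f"Bayes estimate {post_mean:.3f}, posterior variance {post_var:.4f}")
```

With no data (n = 0) the rule returns the prior mean; as n grows it converges to the sample mean, illustrating how sample and prior information combine.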
After the Phillips Curve: Persistence of High Inflation and High Unemployment
Conference Series No. 19. BAILY, BOSWORTH, FAIR, FRIEDMAN, HELLIWELL, KLEIN, LUCAS-SARGENT, McNEES, MODIGLIANI, MOORE, MORRIS, POOLE, SOLOW, WACHTER-WACHTER. FEDERAL RESERVE BANK OF BOSTON. AFTER THE PHILLIPS CURVE: PERSISTENCE OF HIGH INFLATION AND HIGH UNEMPLOYMENT. Proceedings of a Conference Held at Edgartown, Massachusetts, June 1978. Sponsored by THE FEDERAL RESERVE BANK OF BOSTON.
THE FEDERAL RESERVE BANK OF BOSTON CONFERENCE SERIES:
NO. 1 CONTROLLING MONETARY AGGREGATES, JUNE 1969
NO. 2 THE INTERNATIONAL ADJUSTMENT MECHANISM, OCTOBER 1969
NO. 3 FINANCING STATE and LOCAL GOVERNMENTS in the SEVENTIES, JUNE 1970
NO. 4 HOUSING and MONETARY POLICY, OCTOBER 1970
NO. 5 CONSUMER SPENDING and MONETARY POLICY: THE LINKAGES, JUNE 1971
NO. 6 CANADIAN-UNITED STATES FINANCIAL RELATIONSHIPS, SEPTEMBER 1971
NO. 7 FINANCING PUBLIC SCHOOLS, JANUARY 1972
NO. 8 POLICIES for a MORE COMPETITIVE FINANCIAL SYSTEM, JUNE 1972
NO. 9 CONTROLLING MONETARY AGGREGATES II: the IMPLEMENTATION, SEPTEMBER 1972
NO. 10 ISSUES in FEDERAL DEBT MANAGEMENT, JUNE 1973
NO. 11 CREDIT ALLOCATION TECHNIQUES and MONETARY POLICY, SEPTEMBER 1973
NO. 12 INTERNATIONAL ASPECTS of STABILIZATION POLICIES, JUNE 1974
NO. 13 THE ECONOMICS of a NATIONAL ELECTRONIC FUNDS TRANSFER SYSTEM, OCTOBER 1974
NO. 14 NEW MORTGAGE DESIGNS for an INFLATIONARY ENVIRONMENT, JANUARY 1975
NO. 15 NEW ENGLAND and the ENERGY CRISIS, OCTOBER 1975
NO. 16 FUNDING PENSIONS: ISSUES and IMPLICATIONS for FINANCIAL MARKETS, OCTOBER 1976
NO. 17 MINORITY BUSINESS DEVELOPMENT, NOVEMBER 1976
NO. 18 KEY ISSUES in INTERNATIONAL BANKING, OCTOBER 1977
CONTENTS: Opening Remarks, FRANK E. MORRIS, 7. I. Documenting the Problem, 9. Diagnosing the Problem of Inflation and Unemployment in the Western World, GEOFFREY H. -
The Functional False Discovery Rate with Applications to Genomics
bioRxiv preprint doi: https://doi.org/10.1101/241133; this version posted December 30, 2017. The copyright holder for this preprint (which was not certified by peer review) is the author/funder, who has granted bioRxiv a license to display the preprint in perpetuity. It is made available under a CC-BY-ND 4.0 International license. The Functional False Discovery Rate with Applications to Genomics. Xiongzhi Chen, David G. Robinson, and John D. Storey. December 29, 2017. Abstract. The false discovery rate measures the proportion of false discoveries among a set of hypothesis tests called significant. This quantity is typically estimated based on p-values or test statistics. In some scenarios, there is additional information available that may be used to more accurately estimate the false discovery rate. We develop a new framework for formulating and estimating false discovery rates and q-values when an additional piece of information, which we call an "informative variable", is available. For a given test, the informative variable provides information about the prior probability a null hypothesis is true or the power of that particular test. The false discovery rate is then treated as a function of this informative variable. We consider two applications in genomics. Our first is a genetics of gene expression (eQTL) experiment in yeast where every genetic marker and gene expression trait pair are tested for associations. The informative variable in this case is the distance between each genetic marker and gene. Our second application is to detect differentially expressed genes in an RNA-seq study carried out in mice. -
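For readers unfamiliar with FDR control, the classic Benjamini-Hochberg step-up procedure is a useful baseline against which the paper's functional estimator can be understood (this sketch is the textbook procedure, not the authors' method; the p-values below are made up):

```python
def benjamini_hochberg(pvals, alpha=0.10):
    """Classic Benjamini-Hochberg step-up procedure: returns the
    indices of hypotheses rejected at FDR level alpha. Sort p-values,
    find the largest rank k with p_(k) <= alpha * k / m, and reject
    the k smallest."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    k = 0
    for rank, i in enumerate(order, start=1):
        if pvals[i] <= alpha * rank / m:
            k = rank
    return sorted(order[:k])

# Stratifying tests by an informative variable (e.g., marker-gene
# distance) and running the procedure within each stratum is one crude
# way to let the discovery threshold vary with that variable.
pvals = [0.001, 0.008, 0.039, 0.041, 0.27, 0.60, 0.74, 0.95]
print(benjamini_hochberg(pvals, alpha=0.10))
```

The functional FDR framework replaces such ad hoc stratification with an estimate of the FDR as a smooth function of the informative variable.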
Richard Bradley, Decision Theory with a Human Face
Œconomia: History, Methodology, Philosophy, 9-1 | 2019, Varia. Richard Bradley, Decision Theory with a Human Face. Nicolas Gravel. Electronic version URL: http://journals.openedition.org/oeconomia/5273 DOI: 10.4000/oeconomia.5273 ISSN: 2269-8450. Publisher: Association Œconomia. Printed version: date of publication 1 March 2019; pages 149-160; ISSN: 2113-5207. Electronic reference: Nicolas Gravel, « Richard Bradley, Decision Theory with a Human Face », Œconomia [Online], 9-1 | 2019, online since 01 March 2019, connection on 29 December 2020. URL: http://journals.openedition.org/oeconomia/5273 ; DOI: https://doi.org/10.4000/oeconomia.5273. Œconomia's contents are made available under the terms of the Creative Commons Attribution - NonCommercial - NoDerivatives 4.0 International License. Revue des livres / Book review. Richard Bradley, Decision Theory with a Human Face. Cambridge: Cambridge University Press, 2017, 335 pages, ISBN 978-110700321-7. Nicolas Gravel. The very title of this book, borrowed from Richard Jeffrey's "Bayesianism with a human face" (Jeffrey, 1983a), is a clear indication of its content. Just like its spiritual cousin The Foundations of Causal Decision Theory by James M. Joyce (2000), Decision Theory with a Human Face provides a thoughtful description of the current state of development of the Bolker-Jeffrey (BJ) approach to decision-making. While a full-fledged presentation of the BJ approach is beyond the scope of this review, it is difficult to appraise the content of Decision Theory with a Human Face without some acquaintance with both the basics of the BJ approach to decision-making and its fitting in the large corpus of "conventional" decision theories that have developed in economics, mathematics and psychology since at least the publication of von Neumann and Morgenstern (1947). -
springer.com/booksellers. Springer News 11/12/2007, Statistics, 49.
D. R. Anderson, Colorado State University, Fort Collins, CO, USA. Model Based Inference in the Life Sciences: A Primer on Evidence. The abstract concept of "information" can be quantified and this has led to many important advances in the analysis of data in the empirical sciences. This text focuses on a science philosophy based on "multiple working hypotheses" and statistical models to represent them. The fundamental science question relates to the empirical evidence for hypotheses in this set - a formal strength of evidence. Kullback-Leibler information is the information lost when a model is used to approxi- [...]
J. Franke, W. Härdle, C. M. Hafner. Statistics of Financial Markets: An Introduction. Statistics of Financial Markets offers a vivid yet concise introduction to the growing field of statistical applications in finance. The reader will learn the basic methods to evaluate option contracts, to analyse financial time series, to select portfolios and manage risks making realistic assumptions of the market behaviour. The focus is both on fundamentals of mathematical finance and financial time series analysis and on applications to given problems of financial markets, making the book the ideal basis for lectures, seminars and crash courses on the topic.
U. B. Kjaerulff, Aalborg University, Aalborg, Denmark; A. L. Madsen, HUGIN Expert A/S, Aalborg, Denmark. Bayesian Networks and Influence Diagrams: A Guide to Construction and Analysis. Probabilistic networks, also known as Bayesian networks and influence diagrams, have become one of the most promising technologies in the area of applied artificial intelligence, offering intuitive, efficient, and reliable methods for diagnosis, prediction, decision making, classification, troubleshooting, and data mining under uncertainty. Bayesian Networks and Influence Diagrams: A Guide to Construction and Analysis provides a [...]
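The Kullback-Leibler information mentioned in the Anderson blurb has a one-line definition for discrete distributions. A minimal sketch with made-up distributions (the variable names and numbers are illustrative only):

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) in nats: the information
    lost when distribution q is used to approximate distribution p."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

truth = [0.5, 0.3, 0.2]
model_a = [0.4, 0.35, 0.25]   # closer to the truth
model_b = [0.1, 0.1, 0.8]     # farther from the truth

# The better-fitting model loses less information.
print(kl_divergence(truth, model_a) < kl_divergence(truth, model_b))
```

Model selection criteria such as AIC, central to Anderson's "strength of evidence" framework, are built on estimating exactly this lost information.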