Statistical Decision Theory: Concepts, Methods and Applications


(Special Topics in Probabilistic Graphical Models)

FIRST COMPLETE DRAFT
November 30, 2003

Supervisor: Professor J. Rosenthal
STA4000Y
Anjali Mazumder
950116380

Part I: DECISION THEORY – Concepts and Methods

Decision theory, as the name implies, is concerned with the process of making decisions. The extension to statistical decision theory includes decision making in the presence of statistical knowledge, which provides some information where there is uncertainty. The elements of decision theory are quite logical and perhaps even intuitive. The classical approach to decision theory facilitates the use of sample information in making inferences about the unknown quantities. Other relevant information includes the possible consequences, which are quantified by a loss, and the prior information, which arises from sources other than the statistical investigation. The use of Bayesian analysis in statistical decision theory is natural; their unification provides a foundational framework for building and solving decision problems. The basic ideas of decision theory and of decision-theoretic methods lend themselves to a variety of applications and to computational and analytic advances.

This initial part of the report introduces the basic elements of (statistical) decision theory and reviews some of the basic concepts of both frequentist statistics and Bayesian analysis. This provides a foundational framework for developing the structure of decision problems. The second section presents the main concepts and key methods involved in decision theory. The last section of Part I extends this to statistical decision theory – that is, decision problems with some statistical knowledge about the unknown quantities. Together, these sections provide a comprehensive overview of the decision-theoretic framework.

Section 1: An Overview of the Decision Framework: Concepts & Preliminaries

Decision theory is concerned with the problem of making decisions. The term statistical decision theory pertains to decision making in the presence of statistical knowledge, which sheds light on some of the uncertainties involved in the problem. For most of this report, unless otherwise stated, it may be assumed that these uncertainties can be treated as unknown numerical quantities, denoted by θ. Decision making under uncertainty draws on probability theory and graphical models. This report, and this Part in particular, focuses on the methodology and the mathematical and statistical concepts pertinent to statistical decision theory. This initial section presents the decision framework and introduces the notation used to model decision problems.

Section 1.1: Rationale

A decision problem in itself is not complicated to comprehend or describe and can be summarized with a few basic elements. Before proceeding any further, however, it is important to note that this report focuses on rational decision (or choice) models based upon individual rationality. Models of strategic rationality (small-group behavior) or competitive rationality (market behavior) branch into game theory and asset pricing theory, respectively. For the purposes of this report, these latter models are set aside, as the interest of study is statistical decision theory based on individual rationality.
“In a conventional rational choice model, individuals strive to satisfy their preferences for the consequences of their actions given their beliefs about events, which are represented by utility functions and probability distributions, and interactions among individuals are governed by equilibrium conditions” (Nau, 2002 [1]).

Decision models lend themselves to a decision-making process which involves the consideration of the set of possible actions from which one must choose, the circumstances that prevail, and the consequences that result from taking any given action. The optimal decision is the choice that makes the consequences as favorable as possible. As mentioned above, the uncertainty in decision making, defined as an unknown quantity θ describing the combination of “prevailing circumstances and governing laws”, is referred to as the state of nature (Lindgren, 1971). If this state is known, it is simple to select the action according to how favorable the consequences of the various actions are under that state. However, in many real problems, and in those most pertinent to decision theory, the state of nature is not completely known. Since these situations create ambiguity and uncertainty, the consequences and subsequent results become complicated.

Decision problems under uncertainty involve “many diverse ingredients” – loss or gain of money, security, satisfaction, etc. (Lindgren, 1971). Some of these “ingredients” can be assessed while others may be unknown. Nevertheless, in order to construct a mathematical framework in which to model decision problems, while providing a rational basis for making decisions, a numerical scale is assumed to measure consequences. Because monetary gain is often neither an adequate nor an accurate measure of consequences, the notion of utility is introduced to quantify preferences among the various prospects a decision maker may face.

Usually something is known about the state of nature, allowing a set of states to be considered admissible (or at least theoretically so), and thereby ruling out many that are not. It is sometimes possible to take measurements or conduct experiments in order to gain more information about the state. A decision process is referred to as “statistical” when experiments of chance related to the state of nature are performed. The results of such experiments are called data or observations. These provide a basis for the selection of an action, defined as a statistical decision rule.

To summarize, the “ingredients” of a decision problem include (a) a set of available actions, (b) a set of admissible states of nature, and (c) a loss associated with each combination of a state of nature and an action. When only these elements make up a decision problem, it is referred to as the “no-data” or “without experimentation” decision problem. However, if (d) observations from an experiment whose distribution depends on the state of nature are included with (a) to (c), then the decision problem is known as a statistical decision problem. This initial overview of the decision framework allows for a clear presentation of the mathematical and statistical concepts, notation and structure involved in decision modeling.
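For instance, a “no-data” problem with two admissible states and three available actions is fully specified by a loss table over ingredients (a)–(c). The following table is a hypothetical illustration, not drawn from the report, with purely illustrative numbers:

```latex
% Hypothetical "no-data" decision problem (illustrative numbers only):
% two admissible states of nature and three available actions, with the
% loss l(a, theta) attached to every state-action combination.
\[
\begin{array}{c|ccc}
l(a,\theta) & a_1 & a_2 & a_3 \\ \hline
\theta_1 & 0 & 1 & 4 \\
\theta_2 & 5 & 2 & 1
\end{array}
\]
```

If the first state were known to hold, the first action would clearly be chosen, since it incurs no loss; when the state is unknown, a criterion for weighing the two rows against each other is needed, which is exactly what the decision rules and minimization criteria of the later sections supply.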
Section 1.2: The Basic Elements

The previous section summarized the basic elements of decision problems. For brevity, this section does not repeat the description of the two types of decision models and simply states the mathematical structure associated with each element. It is assumed that a decision maker can specify the following basic elements of a decision problem.

1. Action Space: A = {a}. A single action is denoted by a, while the set of all possible actions is denoted by A. Note that the term action is used in the decision literature rather than decision, although the two can be used somewhat interchangeably. Thus, a decision maker is to select a single action a ∈ A from the space of all possible actions.

2. State Space: Θ = {θ} (or Parameter Space). The decision process is affected by the unknown quantity θ ∈ Θ, which signifies the state of nature. The set of all possible states of nature is denoted by Θ. Thus, the consequence to the decision maker of a particular action a depends on the prevailing state θ.

3. Consequence: C = {c}. The consequence of choosing a possible action under a given state of nature may be multi-dimensional and can be written mathematically as c(a,θ) ∈ C.

4. Loss Function: l(a,θ), defined on A × Θ. The objectives of a decision maker are described by a real-valued loss function l(a,θ), which measures the loss (or negative utility) of the consequence c(a,θ).

5. Family of Experiments: E = {e}. Typically, experiments are performed to obtain further information about θ ∈ Θ. A single experiment is denoted by e, while the set of all possible experiments is denoted by E. Thus, a decision maker may select a single experiment e from a family of potential experiments, which can assist in determining the importance of possible actions or decisions.

6. Sample Space: X = {x}. An outcome of a potential experiment e ∈ E is denoted by x ∈ X. The importance of this outcome was explained in (3) and hence is not repeated here. However, it should be noted that when a statistical investigation (such as an experiment) is performed to obtain information about θ, the subsequently observed outcome x is a realization of a random variable X. The set of all possible outcomes is the sample space, while a particular realization of X is denoted by x. Notably, X is a subset of ℜ^n.

7. Decision Rule: δ(x) ∈ A. If a decision maker is to observe an outcome X = x and then choose a suitable action δ(x) ∈ A, the aim is to use the data to minimize the loss l(δ(x),θ). Sections 2 and 3 focus on the appropriate measures of minimization in decision processes.

8. Utility Evaluation: u(·,·,·,·) on E × X × A × Θ. The quantification of a decision maker’s preferences is described by a utility function u(e,x,a,θ), which is assigned to conducting a particular experiment e, observing a resulting outcome x, and choosing a particular action a under a corresponding state θ. The evaluation of the utility function u takes into account the cost of an experiment as well as the consequences of the specific action, which may be monetary and/or of other forms.

Section 1.3: Probability Measures

Statistical decision theory is based on probability theory and utility theory. Focusing on the former, this sub-section presents the elementary probability theory used in decision processes. The probability distribution of a random variable such as X, which depends on θ as stated above, is denoted by Pθ(E) or Pθ(X ∈ E), where E is an event.
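To make the notation above concrete, here is a minimal sketch, hypothetical and not taken from the report, of how elements 1–7 fit together: a two-point state space Θ, a binomial sampling model playing the role of Pθ for a fixed experiment e, a threshold decision rule δ(x), and the resulting expected loss of that rule under each state. The loss values, the cutoff and the sample size are illustrative assumptions only.

```python
# Minimal sketch of a toy statistical decision problem (hypothetical values).
import math

states = [0.3, 0.7]              # Theta: candidate values of theta (a coin's bias)
actions = ["accept", "reject"]   # A: the available actions

def loss(action, theta):
    """l(a, theta): illustrative losses for each action-state combination."""
    if action == "accept":
        return 0.0 if theta == 0.3 else 2.0
    return 1.0 if theta == 0.3 else 0.0

def p_x_given_theta(x, theta, n=10):
    """P_theta(X = x): binomial sampling model for the experiment e (n tosses)."""
    return math.comb(n, x) * theta ** x * (1 - theta) ** (n - x)

def decision_rule(x, cutoff=5):
    """delta(x): map the observed outcome x to an action in A."""
    return actions[0] if x < cutoff else actions[1]

def expected_loss(theta, n=10):
    """Average loss of delta under P_theta, i.e. E_theta[ l(delta(X), theta) ]."""
    return sum(loss(decision_rule(x), theta) * p_x_given_theta(x, theta, n)
               for x in range(n + 1))

for theta in states:
    print(f"theta = {theta}: expected loss of the rule = {expected_loss(theta):.3f}")
```

Comparing such expected losses across candidate rules δ is precisely the kind of minimization taken up in Sections 2 and 3.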