1999 Annual Report
Total Pages: 16
File Type: PDF, Size: 1020 KB
Recommended publications
-
Computational Learning Theory: New Models and Algorithms
Computational Learning Theory: New Models and Algorithms, by Robert Hal Sloan. S.M. EECS, Massachusetts Institute of Technology (1986); B.S. Mathematics, Yale University (1983). Submitted to the Department of Electrical Engineering and Computer Science in partial fulfillment of the requirements for the degree of Doctor of Philosophy at the Massachusetts Institute of Technology, June 1989. © Robert Hal Sloan, 1989. All rights reserved. The author hereby grants to MIT permission to reproduce and to distribute copies of this thesis document in whole or in part. Certified by Ronald L. Rivest, Professor of Computer Science, Thesis Supervisor. Accepted by Arthur C. Smith, Chairman, Departmental Committee on Graduate Students.

Abstract. In the past several years, there has been a surge of interest in computational learning theory: the formal (as opposed to empirical) study of learning algorithms. One major cause of this interest was the model of probably approximately correct learning, or PAC learning, introduced by Valiant in 1984. This thesis begins by presenting a new learning algorithm for a particular problem within that model: learning submodules of the free Z-module Z^k. We prove that this algorithm achieves probable approximate correctness, and indeed that it is within a log log factor of optimal in a related but more stringent model of learning, on-line mistake-bounded learning. We then examine the influence of noisy data on PAC learning algorithms in general. Previous work has shown that it is possible to tolerate large amounts of random classification noise, but only a very small amount of a very malicious sort of noise.
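The on-line mistake-bounded model mentioned in the abstract is easy to make concrete. The sketch below implements the classic elimination algorithm for monotone conjunctions, a standard textbook example of a mistake-bounded learner; it is not the thesis's submodule algorithm, and the example stream is invented.

    # Mistake-bounded learning via the elimination algorithm for monotone
    # conjunctions over n Boolean variables. The learner starts from the most
    # specific hypothesis (all variables) and only ever errs on positive
    # examples, so it makes at most n mistakes in total.

    def learn_conjunction(stream, n):
        hyp = set(range(n))            # indices still in the conjunction
        mistakes = 0
        for x, label in stream:        # x is a tuple of 0/1 values
            pred = all(x[i] for i in hyp)
            if pred != label:
                mistakes += 1
                # The mistake must be a false negative: drop every variable
                # that is 0 in the positive counterexample.
                hyp = {i for i in hyp if x[i]}
        return hyp, mistakes

    # Toy target: x0 AND x2. Each mistake strictly shrinks the hypothesis.
    stream = [((1, 0, 1), True), ((1, 1, 0), False), ((0, 1, 1), False)]
    print(learn_conjunction(stream, 3))   # ({0, 2}, 1)

-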
Freakonomics: A Rogue Economist Explores the Hidden Side of Everything
Freakonomics: A Rogue Economist Explores the Hidden Side of Everything, by Steven D. Levitt and Stephen J. Dubner. New York City: William Morrow, 2005. 242pp, $51.95, ISBN 0 06 073132X.

The University of Chicago has a tradition of producing economists who push the boundaries of the discipline. This is the school that nurtured innovative and iconoclastic thinkers such as Milton Friedman (on macroeconomic stabilisation policy), Robert Lucas (on the use of 'rational' rather than Keynesian 'adaptive' expectations) and Gary Becker (on the economics of the family). Steven Levitt and his journalist co-author, Stephen Dubner, appear to be following this tradition by publishing a book that deals with a range of topics that do not fall within the realm of mainstream economics, hence the book's title. Freakonomics can best be described as a sort of Bill Bryson guide to applied social science. It considers a number of disparate behavioural phenomena and seeks to make sense of them, sometimes drawing on economic hypotheses but always relying heavily on data. It is the emphasis on data that is the book's strongest feature. Levitt and Dubner discuss innovative techniques for identifying match-rigging by sumo wrestlers, cheating by Chicago school teachers, the impact of parents on children's school test scores and other diverse topics. The approach used to test sumo match-rigging is especially fascinating and is compulsory reading for anyone planning to sit a graduate job interview for McKinsey. Levitt and Dubner explain that the top wrestlers compete in tournaments six times a year. Each tournament comprises fifteen matches.
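The bubble-match idea behind that test can be rendered as a toy calculation, a sketch only: a 7-7 wrestler entering the final match needs the win, while an 8-6 opponent is already safe, so rigging shows up as 7-7 wrestlers beating 8-6 opponents far more often than the two records would predict. The bout data below are invented.

    # Toy bubble-match test: final-day bouts recorded as
    # (wrestler_record, opponent_record, wrestler_won).
    bouts = [
        ("7-7", "8-6", True), ("7-7", "8-6", True), ("7-7", "8-6", False),
        ("7-7", "8-6", True), ("7-7", "8-6", True), ("7-7", "8-6", True),
    ]

    bubble_wins = [won for rec, opp, won in bouts
                   if rec == "7-7" and opp == "8-6"]
    win_rate = sum(bubble_wins) / len(bubble_wins)
    print(f"7-7 vs 8-6 final-day win rate: {win_rate:.0%}")
    # A rate far above the roughly even split implied by the two records
    # is the statistical fingerprint of match-rigging.

-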
Security Analysis of Cryptographically Controlled Access to XML Documents
Security Analysis of Cryptographically Controlled Access to XML Documents. Martín Abadi (Computer Science Department, University of California at Santa Cruz, [email protected]) and Bogdan Warinschi (Computer Science Department, Stanford University, [email protected]).

ABSTRACT. Some promising recent schemes for XML access control employ encryption for implementing security policies on published data, avoiding data duplication. In this paper we study one such scheme, due to Miklau and Suciu. That scheme was introduced with some intuitive explanations and goals, but without precise definitions and guarantees for the use of cryptography (specifically, symmetric encryption and secret sharing). We bridge this gap in the present work. We analyze the scheme in the context of the rigorous models of modern cryptography. We obtain formal results in simple, symbolic terms close to the vocabulary of Miklau and Suciu. We also obtain more detailed computational results that establish security against probabilistic polynomial-time attacks.

...ments [4, 5, 7, 8, 14, 19, 23]. This line of research has led to efficient and elegant publication techniques that avoid data duplication by relying on cryptography. For instance, using those techniques, medical records may be published as XML documents, with parts encrypted in such a way that only the appropriate users (physicians, nurses, researchers, administrators, and patients) can see their contents. The work of Miklau and Suciu [19] is a crisp, compelling example of this line of research. They develop a policy query language for specifying fine-grained access policies on XML documents and a logical model based on the concept of "protection". They also show how to translate consistent policies into protections, and how to implement protections by XML encryption [10].
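A minimal sketch of the underlying mechanism, not Miklau and Suciu's construction: replace a sensitive XML subtree with its encryption under a symmetric key, and hand the key only to authorized roles. The sketch uses Python's xml.etree with the third-party cryptography package; the element names are invented.

    from xml.etree import ElementTree as ET
    from cryptography.fernet import Fernet

    record = ET.fromstring(
        "<record><name>Alice</name><diagnosis>flu</diagnosis></record>"
    )

    key = Fernet.generate_key()   # distributed only to authorized roles
    f = Fernet(key)

    # Replace the sensitive subtree with its ciphertext.
    diagnosis = record.find("diagnosis")
    ciphertext = f.encrypt(ET.tostring(diagnosis))
    record.remove(diagnosis)
    enc = ET.SubElement(record, "EncryptedData")
    enc.text = ciphertext.decode()

    # A key holder recovers the subtree; everyone else sees only ciphertext.
    restored = ET.fromstring(f.decrypt(enc.text.encode()))
    assert restored.text == "flu"

-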
The Death of the Firm
Article: The Death of the Firm. June Carbone† and Nancy Levit††.

INTRODUCTION. A corporation is simply a form of organization used by human beings to achieve desired ends. An established body of law specifies the rights and obligations of the people (including shareholders, officers, and employees) who are associated with a corporation in one way or another. When rights, whether constitutional or statutory, are extended to corporations, the purpose is to protect the rights of these people.1 In the Supreme Court's decision in Burwell v. Hobby Lobby, and more generally in corporate and employment law, the firm as entity is disappearing as a unit of legal analysis. We use the term "firm" in this Article in the sense that Ronald Coase did to describe a form of business organization that orders the production of goods and services through use of a system internal to the enterprise rather than through the use of independent contractors.2 The idea of an "entity" in this sense...

† Robina Chair in Law, Science and Technology, University of Minnesota Law School. †† Curators' and Edward D. Ellison Professor of Law, University of Missouri – Kansas City School of Law. We thank William K. Black, Margaret F. Brinig, Naomi Cahn, Paul Callister, Mary Ann Case, Lynne Dallas, Robert Downs, Max Eichner, Martha Fineman, Barb Glesner Fines, Claire Hill, Brett McDonnell, Amy Monahan, Charles O'Kelley, Hari Osofsky, Irma Russell, Dan Schwarcz, Lynn Stout, and Erik P.M. Vermeulen for their helpful comments on drafts of this Article and Tracy Shoberg and Shiveta Vaid for their research support.
-
Measuring the Impact of Crack Cocaine
NBER Working Paper Series. Measuring the Impact of Crack Cocaine. Roland G. Fryer, Jr., Paul S. Heaton, Steven D. Levitt, and Kevin M. Murphy. Working Paper 11318, http://www.nber.org/papers/w11318. National Bureau of Economic Research, 1050 Massachusetts Avenue, Cambridge, MA 02138. May 2005. JEL No. J00.

We would like to thank Jonathan Caulkins, John Donohue, Lawrence Katz, Glenn Loury, Derek Neal, Bruce Sacerdote, Sudhir Venkatesh, and Ebonya Washington for helpful discussions on this topic. Elizabeth Coston and Rachel Tay provided exceptional research assistance. We gratefully acknowledge the financial support of Sherman Shapiro, the American Bar Foundation, and the National Science Foundation. The views expressed herein are those of the authors and do not necessarily reflect the views of the National Bureau of Economic Research. ©2005 by Roland G. Fryer, Paul S. Heaton, Steven D. Levitt, and Kevin M. Murphy. All rights reserved. Short sections of text, not to exceed two paragraphs, may be quoted without explicit permission provided that full credit, including © notice, is given to the source.

ABSTRACT. A wide range of social indicators turned sharply negative for Blacks in the late 1980s and began to rebound roughly a decade later. We explore whether the rise and fall of crack cocaine can explain these patterns. Absent a direct measure of crack cocaine's prevalence, we construct an index based on a range of indirect proxies (cocaine arrests, cocaine-related emergency room visits, cocaine-induced drug deaths, crack mentions in newspapers, and DEA drug busts).
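The proxy-index construction can be sketched in a few lines. Equal-weight averaging of z-scored series, shown below, is an illustrative assumption, not necessarily the weighting the paper uses, and the city-level values are invented.

    # Toy composite index: z-score each proxy series across years, then
    # average the z-scores into a single measure of crack intensity.
    import numpy as np

    years = [1980, 1985, 1990, 1995]
    proxies = {                          # hypothetical city-level series
        "cocaine_arrests":    np.array([12.0, 30.0, 55.0, 41.0]),
        "er_visits":          np.array([ 8.0, 22.0, 60.0, 45.0]),
        "newspaper_mentions": np.array([ 1.0, 15.0, 40.0, 22.0]),
    }

    def zscore(x):
        return (x - x.mean()) / x.std()

    index = np.mean([zscore(v) for v in proxies.values()], axis=0)
    for y, v in zip(years, index):
        print(f"{y}: crack index = {v:+.2f}")

-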
Iterative Algorithms for Global Flow Analysis
Iterative Algorithms for Global Flow Analysis, by Robert Endre Tarjan. STAN-CS-76-547, March 1976. Computer Science Department, School of Humanities and Sciences, Stanford University, Stanford, California 94305. February 1976. Research partially supported by National Science Foundation grant MM 75-22870.

Abstract. This paper studies iterative methods for the global flow analysis of computer programs. We define a hierarchy of global flow problem classes, each solvable by an appropriate generalization of the "node listing" method of Kennedy. We show that each of these generalized methods is optimum, among all iterative algorithms, for solving problems within its class. We give lower bounds on the time required by iterative algorithms for each of the problem classes.

Keywords: computational complexity, flow graph reducibility, global flow analysis, graph theory, iterative algorithm, lower time bound, node listing.

1. Introduction. A problem extensively studied in recent years [2, 3, 5, 7, 8, 9, 12, 13, 14, 15, 27, 28, 29, 30] is that of globally analyzing computer programs; that is, collecting information which is distributed throughout a computer program, generally for the purpose of optimizing the program. Roughly speaking, global flow analysis requires the determination, for each program block, of a property known to hold on entry to the block, independent of the path taken to reach the block. A widely used approach to global flow analysis is to model the set of possible properties by a semi-lattice (we desire the "maximum" property for each block), to model the control structure of the program by a directed graph with one vertex for each program block, and to specify, for each branch from block to block, the function by which that branch transforms the set of properties.
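The setup described here, a semilattice of properties, a directed graph of blocks, and a transfer function per branch, is the classic iterative dataflow framework. Below is a minimal round-robin fixpoint iteration in Python under assumed conventions (a successor-map CFG and gen/kill transfer functions, with set intersection as the meet); it illustrates the generic iterative method, not Kennedy's node-listing technique that the paper analyzes.

    # Round-robin iterative global flow analysis: propagate lattice values
    # along CFG edges until nothing changes. The lattice here is sets of
    # "available" facts with intersection as meet (a forward must-problem).

    def analyze(cfg, entry, all_facts, gen, kill):
        preds = {n: [] for n in cfg}
        for n, succs in cfg.items():
            for s in succs:
                preds[s].append(n)

        def transfer(n, inset):
            return (inset - kill[n]) | gen[n]

        out = {n: all_facts for n in cfg}   # optimistic start at top
        out[entry] = transfer(entry, frozenset())
        changed = True
        while changed:                      # one full pass per iteration
            changed = False
            for n in cfg:
                if n == entry:
                    continue
                inset = frozenset(all_facts)
                for p in preds[n]:
                    inset &= out[p]         # meet over predecessors
                new = transfer(n, inset)
                if new != out[n]:
                    out[n], changed = new, True
        return out

    # Toy CFG: the expression "a+b" becomes available in entry and is
    # killed in b1 (an assignment to one of its operands).
    cfg = {"entry": ["b1"], "b1": ["b2"], "b2": []}
    gen = {"entry": frozenset({"a+b"}), "b1": frozenset(), "b2": frozenset()}
    kill = {"entry": frozenset(), "b1": frozenset({"a+b"}), "b2": frozenset()}
    print(analyze(cfg, "entry", frozenset({"a+b"}), gen, kill))

-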
Chicago Workshop on Black–White Inequality: A Summary by Derek A. Neal
ESSAYS ON ISSUES. The Federal Reserve Bank of Chicago, April 2007, Number 237a. Chicago Fed Letter.

Chicago Workshop on Black–White Inequality: a summary by Derek A. Neal, professor of economics, University of Chicago, and consultant, Federal Reserve Bank of Chicago.

The Chicago Workshop on Black–White Inequality, funded by the Searle Freedom Trust, meets on a semiannual basis to explore the causes and consequences of economic inequality between blacks and whites in the U.S. On December 15, 2006, the second meeting of the workshop was hosted by the Federal Reserve Bank of Chicago.

During most of the twentieth century, each successive generation of black Americans came closer than their predecessors to matching the educational achievement and economic success of their white peers. However, the convergence in skills among children and in labor market success among adults stalled around 1990. The Chicago Workshop on Black–White Inequality is an effort to explore the reasons for the recent lack of progress for blacks relative to whites. Workshop meetings focus particular attention on the black–white gaps in reading, math, and other basic skills that appear to play such a large... ...looked at the effects of several government policies on skill development in children, as well as the measurement challenges faced by social scientists who seek to quantify changes in the black–white skill gap.

Education policy. Education policy is at the forefront of many discussions of inequality in the United States. The ongoing debate surrounding the No Child Left Behind Act of 2001 (NCLB) highlights the public perception that disadvantaged communities, especially disadvantaged minority communities, receive poor service from...

Materials presented at the December 15, 2006, meeting of the workshop are available at http://economics.uchicago.edu/Inequality_Workshop/Schedule_dec15.shtml.
-
The White/Black Educational Gap, Stalled Progress, and the Long-Term Consequences of the Emergence of Crack Cocaine Markets
THE WHITE/BLACK EDUCATIONAL GAP, STALLED PROGRESS, AND THE LONG-TERM CONSEQUENCES OF THE EMERGENCE OF CRACK COCAINE MARKETS William N. Evans, Craig Garthwaite, and Timothy J. Moore* Abstract—We propose the rise of crack cocaine markets as a key explana- Figure 1A contains the race-specific high school completion tion for the end to the convergence in black-white educational outcomes in the United States that began in the mid-1980s. After constructing a rates for cohorts turning 18 between 1965 and 1997, while measure to date the arrival of crack markets in cities and states, we show figure 1B shows the white-black difference in these rates. that the decline in educational outcomes for black males begins with the Several features of the data are remarkable. First, the gap in start of the crack epidemic. We also show that there are higher murder and incarceration rates after the arrival of crack cocaine and that these are completion rates fell by 37% between 1965 and 1986, predictive of lower black high school completion rates, a result consistent decreasing from 15.3 to 9.6 percentage points. Second, with human capital theory. We estimate that effects related to crack mar- almost all of the convergence is due to rising black achieve- kets can account for approximately 40% to 70% of the fall in black male high school completion rates. ment—the completion rates of whites changed little over this period. Third, the convergence ends around 1986, and white- I. Introduction black completion rates diverge until 1997, when the gap was 14.4 percentage points. -
Guns and Violence: The Enduring Impact of Crack Cocaine Markets on Young Black Males*
Guns and Violence: The Enduring Impact of Crack Cocaine Markets on Young Black Males*. William N. Evans (University of Notre Dame, NBER, and J-PAL), Craig Garthwaite (Northwestern University and NBER), and Timothy J. Moore (Purdue University and NBER). December 2018.

Abstract. The emergence of crack cocaine markets in the 1980s brought a wave of violence to US cities that has repercussions today. Using cross-city variation in crack's arrival and the experience of older cohorts, we estimate that the murder rate of young black males doubled soon after these markets were established, and that their rate was still 70 percent higher 17 years later. Using the fraction of gun-related suicides as a proxy for gun availability, we find that access to guns explains both the rise in young black males' murder rates after crack's arrival and their elevated murder rates today.

* The authors wish to thank Michael Smith and Timothy Seida for excellent research assistance, and seminar participants at Auburn University, Harvard University, Northwestern University, University of Georgia, University of Nebraska, Vanderbilt University, and University of Notre Dame for a number of helpful suggestions. Moore acknowledges financial support from an Australian Research Council fellowship (Discovery Early Career Research Award DE170100608).

1. Introduction. Compared to other developed countries, the United States stands alone in its shockingly high murder rate. In 2015, the U.S. murder rate was 5.5 murders per 100,000 people, which is more than three times the rate of France, Canada, and the United Kingdom, and more than five times the rate of Italy, Germany and Spain.1 Despite being high relative to other developed countries, the U.S. ...
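The gun-availability proxy named in the abstract is simple to compute per city and year; the counts below are invented for illustration.

    # Gun-availability proxy: share of suicides committed with a firearm.
    suicides = {                       # hypothetical (city, year) counts
        ("CityA", 1985): {"firearm": 310, "total": 520},
        ("CityB", 1985): {"firearm": 140, "total": 460},
    }

    for (city, year), c in suicides.items():
        share = c["firearm"] / c["total"]
        print(f"{city} {year}: gun-suicide share = {share:.2f}")

-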
Lionel Robbins on Economic Generalizations and Reality in the Light of Modern Econometrics
Economica (2009) 76, 873–890. doi:10.1111/j.1468-0335.2009.00805.x. Robbins on Economic Generalizations and Reality in the Light of Modern Econometrics. By Roger E. Backhouse (University of Birmingham and Erasmus University Rotterdam) and Steven N. Durlauf (University of Wisconsin–Madison). Final version received 27 February 2009.

This paper examines Lionel Robbins' critical attitude towards formal empirical work from the standpoint of modern econometrics. It argues that his attitude towards empirical work rested on indefensible assumptions and that he failed to realise that the role he saw for empirical work undermined his belief in the primacy of economic theory. This matters because Robbins' attitudes are echoed in modern economics, best exemplified by the calibration methodology of Kydland and Prescott, which is vulnerable to similar criticisms.

INTRODUCTION. Economists take widely different positions on the role that empirical evidence should play in the development of substantive propositions. Contemporary macroeconomics is split between those who give primacy to theory, whether they employ calibration methods to link theory to data (Edward Prescott) or regard theory and econometrics as complementary (Lars Hansen and Thomas Sargent), and those who regard atheoretical analyses as fundamental (Christopher Sims).1 Similar disagreements exist within microeconomics. On the microeconometrics side, one finds competing paradigms between James Heckman, whose work focuses on the use of economic assumptions to account for self-selection and heterogeneity, those who adopt the essentially statistical assumptions employed in Donald Rubin's causal model, and Charles Manski's emphasis on asking what may be learned under minimal assumptions of any type. Other microeconomic research, for example Steven Levitt's work on crime, focuses on the identification of natural experiments, so that econometric analysis is reduced to a trivial role.
-
Second International Computer Programming Education Conference
ICPEC 2021, May 27–28, 2021, University of Minho, Braga, Portugal. Edited by Pedro Rangel Henriques, Filipe Portela, Ricardo Queirós, and Alberto Simões. OASIcs – Vol. 91 – ICPEC 2021. www.dagstuhl.de/oasics

Editors: Pedro Rangel Henriques, Universidade do Minho, Portugal, [email protected]; Filipe Portela, Universidade do Minho, Portugal, [email protected]; Ricardo Queirós, Politécnico do Porto, Portugal, [email protected]; Alberto Simões, Politécnico do Cávado e Ave, Portugal, [email protected].

ACM Classification 2012: Applied computing → Education. ISBN 978-3-95977-194-8.

Published online and open access by Schloss Dagstuhl – Leibniz-Zentrum für Informatik GmbH, Dagstuhl Publishing, Saarbrücken/Wadern, Germany. Online available at https://www.dagstuhl.de/dagpub/978-3-95977-194-8. Publication date: July 2021.

Bibliographic information published by the Deutsche Nationalbibliothek. The Deutsche Nationalbibliothek lists this publication in the Deutsche Nationalbibliografie; detailed bibliographic data are available on the Internet at https://portal.dnb.de.

License: This work is licensed under a Creative Commons Attribution 4.0 International license (CC-BY 4.0): https://creativecommons.org/licenses/by/4.0/legalcode. In brief, this license authorizes each and everybody to share (to copy, distribute and transmit) the work under the following conditions, without impairing or restricting the authors' moral rights: Attribution: The work must be attributed to its authors. The copyright is retained by the corresponding authors.

Digital Object Identifier: 10.4230/OASIcs.ICPEC.2021.0. ISSN 1868-8969. https://www.dagstuhl.de/oasics

OASIcs – OpenAccess Series in Informatics. OASIcs is a series of high-quality conference proceedings across all fields in informatics.
-
A Auxiliary Definitions
This appendix contains auxiliary definitions omitted from the main text.

Variables

fun lvars :: com ⇒ vname set where
  lvars SKIP = {}
  lvars (x ::= e) = {x}
  lvars (c₁;; c₂) = lvars c₁ ∪ lvars c₂
  lvars (IF b THEN c₁ ELSE c₂) = lvars c₁ ∪ lvars c₂
  lvars (WHILE b DO c) = lvars c

fun rvars :: com ⇒ vname set where
  rvars SKIP = {}
  rvars (x ::= e) = vars e
  rvars (c₁;; c₂) = rvars c₁ ∪ rvars c₂
  rvars (IF b THEN c₁ ELSE c₂) = vars b ∪ rvars c₁ ∪ rvars c₂
  rvars (WHILE b DO c) = vars b ∪ rvars c

definition vars :: com ⇒ vname set where
  vars c = lvars c ∪ rvars c

Abstract Interpretation

fun strip :: 'a acom ⇒ com where
  strip (SKIP {P}) = SKIP
  strip (x ::= e {P}) = x ::= e
  strip (C₁;; C₂) = strip C₁;; strip C₂
  strip (IF b THEN {P₁} C₁ ELSE {P₂} C₂ {P}) = IF b THEN strip C₁ ELSE strip C₂
  strip ({I} WHILE b DO {P} C {Q}) = WHILE b DO strip C

fun annos :: 'a acom ⇒ 'a list where
  annos (SKIP {P}) = [P]
  annos (x ::= e {P}) = [P]
  annos (C₁;; C₂) = annos C₁ @ annos C₂
  annos (IF b THEN {P₁} C₁ ELSE {P₂} C₂ {Q}) = P₁ # annos C₁ @ P₂ # annos C₂ @ [Q]
  annos ({I} WHILE b DO {P} C {Q}) = I # P # annos C @ [Q]

fun asize :: com ⇒ nat where
  asize SKIP = 1
  asize (x ::= e) = 1
  asize (C₁;; C₂) = asize C₁ + asize C₂
  asize (IF b THEN C₁ ELSE C₂) = asize C₁ + asize C₂ + 3
  asize (WHILE b DO C) = asize C + 3

definition shift :: (nat ⇒ 'a) ⇒ nat ⇒ nat ⇒ 'a where
  shift f n = (λp. f (p + n))