Patterns, Predictions, and Actions
Recommended publications
-
The Smoothed Possibility of Social Choice
Lirong Xia, RPI, [email protected]

Abstract: We develop a framework that leverages the smoothed complexity analysis by Spielman and Teng [60] to circumvent paradoxes and impossibility theorems in social choice, motivated by modern applications of social choice powered by AI and ML. For Condorcet's paradox, we prove that the smoothed likelihood of the paradox either vanishes at an exponential rate as the number of agents increases, or does not vanish at all. For the ANR impossibility on the non-existence of voting rules that simultaneously satisfy anonymity, neutrality, and resolvability, we characterize the rate at which the impossibility vanishes, which is either polynomially fast or exponentially fast. We also propose a novel, easy-to-compute tie-breaking mechanism that optimally preserves anonymity and neutrality for an even number of alternatives in natural settings. Our results illustrate the smoothed possibility of social choice: even though the paradox and the impossibility theorem hold in the worst case, they may not be a big concern in practice.

1 Introduction: Dealing with paradoxes and impossibility theorems is a major challenge in social choice theory, because "the force and widespread presence of impossibility results generated a consolidated sense of pessimism, and this became a dominant theme in welfare economics and social choice theory in general", as the eminent economist Amartya Sen commented in his Nobel prize lecture [57]. Many paradoxes and impossibility theorems in social choice are based on worst-case analysis. Take perhaps the earliest one, Condorcet's (voting) paradox [15], as an example: when there are at least three alternatives, pairwise majority aggregation can fail to be transitive.

-
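The paradox the abstract refers to is easy to exhibit concretely. The following sketch (illustrative only, not code from the paper) builds the classic three-voter profile whose pairwise majority relation is cyclic:

```python
# Three voters with cyclic preferences over alternatives a, b, c.
profile = [
    ("a", "b", "c"),  # voter 1: a > b > c
    ("b", "c", "a"),  # voter 2: b > c > a
    ("c", "a", "b"),  # voter 3: c > a > b
]

def majority_prefers(x, y, profile):
    """True if a strict majority of voters rank x above y."""
    wins = sum(1 for ranking in profile if ranking.index(x) < ranking.index(y))
    return wins > len(profile) / 2

# Pairwise majority is cyclic: a beats b, b beats c, and c beats a,
# so no transitive collective ranking exists for this profile.
for x, y in [("a", "b"), ("b", "c"), ("c", "a")]:
    print(f"{x} beats {y}: {majority_prefers(x, y, profile)}")
```

Each pairwise contest is won 2-1, so the majority relation forms a cycle even though every individual voter is perfectly rational.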
Paradise Closed: Energy, inspiration and making art in Rome in the works of Harriet Hosmer, William Wetmore Story, Elizabeth Barrett Browning, Nathaniel Hawthorne, Sophia Peabody Hawthorne, Elizabeth Gaskell and Henry James, 1847-1903

Helen Christine Stevens. Thesis submitted for the degree of PhD, King's College London, September 2016. Awarding institution: King's College London. This electronic thesis has been downloaded from the King's Research Portal at https://kclpure.kcl.ac.uk/portal/. The copyright of this thesis rests with the author, and no quotation from it or information derived from it may be published without proper acknowledgement. Unless another licence is stated on the immediately following page, this work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International licence (https://creativecommons.org/licenses/by-nc-nd/4.0/): you are free to copy, distribute and transmit the work, provided that you attribute it in the manner specified by the author, do not use it for commercial purposes, and do not alter, transform, or build upon it. Any of these conditions can be waived with the author's permission; fair dealings and other rights are in no way affected. Take-down policy: if you believe this document breaches copyright, contact [email protected] with details, and access to the work will be removed immediately while the claim is investigated. Download date: 02 Oct 2021.

-
Smoothed Complexity Theory
Markus Bläser (Saarland University, Department of Computer Science, [email protected]), Bodo Manthey (University of Twente, Department of Applied Mathematics, [email protected])

Abstract: Smoothed analysis is a new way of analyzing algorithms introduced by Spielman and Teng (J. ACM, 2004). Classical methods like worst-case or average-case analysis have accompanying complexity classes, like P and Avg-P, respectively. While worst-case or average-case analysis gives us a means to talk about the running time of a particular algorithm, complexity classes allow us to talk about the inherent difficulty of problems. Smoothed analysis is a hybrid of worst-case and average-case analysis and compensates for some of their drawbacks. Despite its success for the analysis of single algorithms and problems, there is no embedding of smoothed analysis into computational complexity theory, which is necessary to classify problems according to their intrinsic difficulty. We propose a framework for smoothed complexity theory, define the relevant classes, and prove some first hardness results (of bounded halting and tiling) and tractability results (binary optimization problems, graph coloring, satisfiability). Furthermore, we discuss extensions and shortcomings of our model and relate it to semi-random models.

1 Introduction: The goal of computational complexity theory is to classify computational problems according to their intrinsic difficulty. While the analysis of algorithms is concerned with analyzing, say, the running time of a particular algorithm, complexity theory rather analyses the amount of resources that all algorithms need at least to solve a given problem. Classical complexity classes, like P, reflect worst-case analysis of algorithms.

-
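The hybrid of worst-case and average-case analysis described above is usually formalized as follows (a standard Spielman-Teng-style definition, stated here for orientation rather than quoted from the paper): an adversary chooses a worst-case instance, which is then randomly perturbed before the algorithm runs.

```latex
% Smoothed running time of algorithm A at input size n and
% perturbation magnitude \sigma: worst case over instances x,
% expectation over the random perturbation g.
T_{\mathrm{smooth}}^{A}(n,\sigma)
  \;=\; \max_{x \,:\, |x| = n} \; \mathbb{E}_{g}\!\bigl[\, T_{A}(x + \sigma g) \,\bigr]
```

For large σ this approaches average-case analysis (the perturbation swamps the adversary's choice); for σ → 0 it recovers worst-case analysis.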
Smoothed Analysis of the 2-Opt Heuristic for the TSP: Polynomial Bounds for Gaussian Noise∗
Bodo Manthey, Rianne Veenstra. University of Twente, Enschede, Netherlands, Department of Applied Mathematics. [email protected], [email protected]

Abstract: The 2-opt heuristic is a very simple local search heuristic for the traveling salesman problem. While it usually converges quickly in practice, its running time can be exponential in the worst case. Englert, Röglin, and Vöcking (Algorithmica, 2014) provided a smoothed analysis in the so-called one-step model in order to explain the performance of 2-opt on d-dimensional Euclidean instances. However, translating their results to the classical model of smoothed analysis, where points are perturbed by Gaussian distributions with standard deviation σ, yields only bounds that are polynomial in n and 1/σ^d. We prove bounds that are polynomial in n and 1/σ for the smoothed running time with Gaussian perturbations. In addition, our analysis for Euclidean distances is much simpler than the existing smoothed analysis.

1 2-Opt and Smoothed Analysis: The traveling salesman problem (TSP) is one of the classical combinatorial optimization problems. Euclidean TSP is the following variant: given points X ⊆ [0,1]^d, find the shortest Hamiltonian cycle that visits all points in X (also called a tour). Even this restricted variant is NP-hard for d ≥ 2 [21]. We consider Euclidean TSP with Manhattan and Euclidean distances as well as squared Euclidean distances to measure the distances between points. For the former two, there exist polynomial-time approximation schemes (PTAS) [2, 20].

-
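For readers unfamiliar with the heuristic being analyzed, here is a minimal generic sketch of 2-opt (illustrative, not the paper's implementation): each improving step removes two tour edges and reconnects the tour by reversing the segment between them, repeating until no 2-change improves the tour.

```python
import math

def tour_length(tour, points):
    """Total Euclidean length of the closed tour."""
    return sum(math.dist(points[tour[i]], points[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def two_opt(points):
    """Apply improving 2-changes until a local optimum is reached."""
    n = len(points)
    tour = list(range(n))
    improved = True
    while improved:
        improved = False
        for i in range(n - 1):
            for j in range(i + 2, n):
                if i == 0 and j == n - 1:
                    continue  # these two edges are adjacent in the cycle
                # Replace edges (i, i+1) and (j, j+1) by (i, j) and (i+1, j+1).
                a, b = points[tour[i]], points[tour[i + 1]]
                c, d = points[tour[j]], points[tour[(j + 1) % n]]
                delta = (math.dist(a, c) + math.dist(b, d)
                         - math.dist(a, b) - math.dist(c, d))
                if delta < -1e-12:
                    tour[i + 1:j + 1] = reversed(tour[i + 1:j + 1])
                    improved = True
    return tour

points = [(0, 0), (0, 1), (1, 0), (1, 1)]  # corners of the unit square
tour = two_opt(points)
```

On this tiny instance the heuristic untangles the crossing in the initial tour and reaches the square's perimeter (length 4). The smoothed-analysis question is how many such improving steps are needed in expectation when the points are Gaussian perturbations of adversarial positions.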
The Next Digital Decade Essays on the Future of the Internet
THE NEXT DIGITAL DECADE: ESSAYS ON THE FUTURE OF THE INTERNET
Edited by Berin Szoka & Adam Marcus. NextDigitalDecade.com. TechFreedom, techfreedom.org, Washington, D.C.

This work was published by TechFreedom (TechFreedom.org), a non-profit public policy think tank based in Washington, D.C. TechFreedom's mission is to unleash the progress of technology that improves the human condition and expands individual capacity to choose. We gratefully acknowledge the generous and unconditional support for this project provided by VeriSign, Inc. More information about this book is available at NextDigitalDecade.com. ISBN 978-1-4357-6786-7. © 2010 by TechFreedom, Washington, D.C. This work is licensed under the Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License; to view a copy of this license, visit http://creativecommons.org/licenses/by-nc-sa/3.0/ or send a letter to Creative Commons, 171 Second Street, Suite 300, San Francisco, California, 94105, USA. Cover designed by Jeff Fielding.

Table of Contents: Foreword (Berin Szoka), 7; 25 Years After .COM: Ten Questions (Berin Szoka), 9; Contributors, 29; Part I: The Big Picture & New Frameworks; Chapter 1: The Internet's Impact on Culture & Society: Good or Bad?, 49; Why We Must Resist the Temptation of Web 2.0 (Andrew Keen), 51; The Case for Internet Optimism, Part 1: Saving the Net from Its Detractors (Adam Thierer), 57; Chapter 2: Is the Generative -
The Flajolet-Martin Sketch Itself Preserves Differential Privacy: Private Counting with Minimal Space
Adam Smith (Boston University, [email protected]), Shuang Song (Google Research, Brain Team, [email protected]), Abhradeep Thakurta (Google Research, Brain Team, [email protected])

Abstract: We revisit the problem of counting the number of distinct elements F0(D) in a data stream D, over a domain [u]. We propose an (ε, δ)-differentially private algorithm that approximates F0(D) within a factor of (1 ± γ) and with additive error of O(√(ln(1/δ))/ε), using space O(ln(ln(u)/γ)/γ²). We improve on the prior work at least quadratically and up to exponentially, in terms of both space and additive error. Our additive error guarantee is optimal up to a factor of O(√(ln(1/δ))), and the space bound is optimal up to a factor of O(min{ln(ln(u)/γ), 1/γ²}). We assume the existence of an ideal uniform random hash function, and ignore the space required to store it. We later relax this requirement by assuming pseudo-random functions and appealing to a computational variant of differential privacy, SIM-CDP. Our algorithm is built on top of the celebrated Flajolet-Martin (FM) sketch. We show that the FM-sketch is differentially private as is, as long as there are ≈ √(ln(1/δ))/(εγ) distinct elements in the data set. Along the way, we prove a structural result showing that the maximum of k i.i.d. random variables is statistically close (in the sense of ε-differential privacy) to the maximum of (k + 1) i.i.d. -
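The non-private Flajolet-Martin idea underlying the paper can be sketched as follows (a simplified single-hash illustration with assumed constants; the paper's contribution, the privacy analysis, is not reproduced here): hash each element and track the maximum number of trailing zero bits, which grows roughly like log₂ of the number of distinct elements.

```python
import hashlib

def trailing_zeros(x: int) -> int:
    """Number of trailing zero bits of x (treat 0 as all-zero: 32)."""
    if x == 0:
        return 32
    return (x & -x).bit_length() - 1

def fm_estimate(stream) -> float:
    """Single-hash Flajolet-Martin estimate of the number of distinct
    elements: track the largest trailing-zero count over hashed items
    and return 2**max, corrected by the classical constant 0.77351."""
    max_tz = 0
    for item in stream:
        digest = hashlib.sha256(str(item).encode()).digest()
        h = int.from_bytes(digest[:4], "big")  # 32-bit hash value
        max_tz = max(max_tz, trailing_zeros(h))
    return (2 ** max_tz) / 0.77351

# The estimate depends only on the set of distinct elements:
# feeding each element 100 times changes nothing.
print(fm_estimate(i % 100 for i in range(10_000)))
```

A single hash gives a very noisy estimate; practical deployments average many independent copies. The key structural point exploited by the paper is that the sketch's state is a maximum of i.i.d.-like quantities, which is what makes the privacy argument possible.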
The Public Cult of Natural History and Chemistry in the Age of Romanticism
Ursula Klein, MPIWG Berlin, [email protected]

The Public Cult of Natural History and Chemistry in the Age of Romanticism

"In an age in which everything useful is embraced with so much enthusiasm…" (I. v. Born and F. W. H. v. Trebra). "The chemical nature of the novel, criticism, wit, sociability, the newest rhetoric and history thus far is self-evident" (F. Schlegel). "Chemistry is of the most widespread application and of the most boundless influence on life" (Goethe).

In May 1800 the philosopher and theologian Friedrich Daniel Ernst Schleiermacher wrote to his sister Charlotte that he was at the time attending in Berlin "lectures about all kinds of sciences, … among others about chemistry." The "chemistry" to which he was referring were the regular public lectures given by the Berlin chemist and apothecary Martin Heinrich Klaproth starting in 1783. The theologian-philosopher's interest in chemistry was by no means superficial, as his lecture notes document. What was it about the chemistry of that time that fascinated Schleiermacher? Goethe's Elective Affinities ("Wahlverwandtschaft") immediately comes to mind, in which a tragic conflict between marital bonds and love is portrayed in the light of chemical affinities or "elective affinities," as do Johann Wilhelm Ritter's spectacular electrochemical experiments and speculations on natural philosophy. Yet a look at Schleiermacher's notes on Klaproth's lectures from the year 1800 immediately tells us that this would be mistaken. Very little mention of chemical theories can be found in Klaproth's lectures, let alone of speculative natural philosophy. In his lectures Klaproth turned instead to the multi-faceted world of material substances.

-
Smoothed Analysis of the TSP Algorithms
Jens Ziegler, born 11 February 1989 in Siegburg. September 22, 2012. Bachelor's thesis in Mathematics, advisor: Prof. Dr. Nitin Saxena. Mathematisches Institut, Mathematisch-Naturwissenschaftliche Fakultät der Rheinischen Friedrich-Wilhelms-Universität Bonn.

Contents:
1 Introduction, 2
2 Preliminaries, 6
3 2-Opt, 8
3.1 Expected Number of 2-Changes on the L1 Metric, 8
3.1.1 Pairs of Linked 2-Changes, 10
3.1.2 Analysing the Pairs of Linked 2-Changes, 12
3.1.3 Proofs of Theorem 1 a) and Theorem 2 a), 16
3.2 Expected Number of 2-Changes on the L2 Metric, 17
3.2.1 Analysis of a Single 2-Change, 18
3.2.2 Simplified Random Experiments, 29
3.2.3 Analysis of Pairs of Linked 2-Changes, 31
3.2.4 Proofs of Theorem 1 b) and 2 b), 34
3.3 Expected Number of 2-Changes on General Graphs, 36
3.3.1 Definition of Witness Sequences, 36
3.3.2 Improvement to the Tour Made by Witness Sequences, 37
3.3.3 Identifying Witness Sequences, 39
3.3.4 Proof of Theorem 1 c) and 2 c), 43
3.4 Proof of Theorem 3, 44
3.5 Smoothed Analysis of 2-Opt, 46
4 Karp's Partitioning Scheme, 48
4.1 Preliminaries, 48
4.2 Necessary Lemmas and Theorems, 49
4.3 Smoothed Analysis of Karp's Partitioning Scheme, 52
5 Conclusions and Open Questions, 54

1 Introduction: Smoothed analysis was first introduced by Daniel A. Spielman and Shang-Hua Teng in 2001. Up until that time, there were only two standard ways of measuring an algorithm's complexity: worst-case and average-case analysis.

-
Hardness of Non-Interactive Differential Privacy from One-Way Functions
Lucas Kowalczyk*, Tal Malkin†, Jonathan Ullman‡, Daniel Wichs§. May 30, 2018.

Abstract: A central challenge in differential privacy is to design computationally efficient non-interactive algorithms that can answer large numbers of statistical queries on a sensitive dataset. That is, we would like to design a differentially private algorithm that takes a dataset D ∈ X^n, consisting of some small number of elements n from some large data universe X, and efficiently outputs a summary that allows a user to efficiently obtain an answer to any query in some large family Q. Ignoring computational constraints, this problem can be solved even when X and Q are exponentially large and n is just a small polynomial; however, all algorithms with remotely similar guarantees run in exponential time. There have been several results showing that, under the strong assumption of indistinguishability obfuscation (iO), no efficient differentially private algorithm exists when X and Q can be exponentially large. However, there are no strong separations between information-theoretic and computationally efficient differentially private algorithms under any standard complexity assumption. In this work we show that, if one-way functions exist, there is no general-purpose differentially private algorithm that works when X and Q are exponentially large and n is an arbitrary polynomial. In fact, we show that this result holds even if X is just subexponentially large (assuming only polynomially-hard one-way functions). This result solves an open problem posed by Vadhan in his recent survey [Vad16].

*Columbia University Department of Computer Science.

-
Communication Complexity (For Algorithm Designers)
Full text available at: http://dx.doi.org/10.1561/0400000076

Communication Complexity (for Algorithm Designers). Tim Roughgarden, Stanford University, USA, [email protected]. Boston — Delft.

Foundations and Trends® in Theoretical Computer Science. Published, sold and distributed by: now Publishers Inc., PO Box 1024, Hanover, MA 02339, United States. Tel. +1-781-985-4510, www.nowpublishers.com, [email protected]. Outside North America: now Publishers Inc., PO Box 179, 2600 AD Delft, The Netherlands. Tel. +31-6-51115274. The preferred citation for this publication is: T. Roughgarden. Communication Complexity (for Algorithm Designers). Foundations and Trends® in Theoretical Computer Science, vol. 11, nos. 3-4, pp. 217-404, 2015. This Foundations and Trends® issue was typeset in LaTeX using a class file designed by Neal Parikh. Printed on acid-free paper. ISBN: 978-1-68083-115-3. © 2016 T. Roughgarden. All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, mechanical, photocopying, recording or otherwise, without prior written permission of the publishers. Photocopying. In the USA: This journal is registered at the Copyright Clearance Center, Inc., 222 Rosewood Drive, Danvers, MA 01923. Authorization to photocopy items for internal or personal use, or the internal or personal use of specific clients, is granted by now Publishers Inc for users registered with the Copyright Clearance Center (CCC). The 'services' for users can be found on the internet at: www.copyright.com. For those organizations that have been granted a photocopy license, a separate system of payment has been arranged.

-
Smoothed Analysis of Local Search Algorithms
Bodo Manthey, University of Twente, Department of Applied Mathematics, Enschede, The Netherlands, [email protected]

Abstract: Smoothed analysis is a method for analyzing the performance of algorithms for which classical worst-case analysis fails to explain the performance observed in practice. Smoothed analysis has been applied to explain the performance of a variety of algorithms in recent years. One particular class of algorithms where smoothed analysis has been used successfully is local search algorithms. We give a survey of smoothed analysis, in particular applied to local search algorithms.

1 Smoothed Analysis. 1.1 Motivation: The goal of the analysis of algorithms is to provide measures for the performance of algorithms. In this way, it helps to compare algorithms and to understand their behavior. The most commonly used method for measuring the performance of algorithms is worst-case analysis. If an algorithm has a good worst-case performance, then this is a very strong statement and, up to constants and lower-order terms, the algorithm should also perform well in practice. However, there are many algorithms that work surprisingly well in practice although they have a very poor worst-case performance. The reason for this is that the worst-case performance can be dominated by a few pathological instances that hardly or never occur in practice. A frequently used alternative to worst-case analysis is average-case analysis. In average-case analysis, the expected performance is measured with respect to some fixed probability distribution. Many algorithms with poor worst-case but good practical performance show a good average-case performance.

-
An Axiomatic Approach to Block Rewards
Xi Chen ([email protected]), Christos Papadimitriou ([email protected]), Tim Roughgarden ([email protected]). Columbia University, New York, NY 10027.

Abstract: Proof-of-work blockchains reward each miner for one completed block by an amount that is, in expectation, proportional to the number of hashes the miner contributed to the mining of the block. Is this proportional allocation rule optimal? And in what sense? And what other rules are possible? In particular, what are the desirable properties that any "good" allocation rule should satisfy? To answer these questions, we embark on an axiomatic theory of incentives in proof-of-work blockchains at the time scale of a single block. We consider desirable properties of allocation rules including: symmetry; budget balance (weak or strong); sybil-proofness; and various grades of collusion-proofness. We show that Bitcoin's proportional allocation rule is the unique allocation rule satisfying a certain

1 Introduction (excerpt): …point of view and the methodology of Economic Theory, the science of incentives. Flaws in the incentives of a blockchain protocol can manifest themselves at multiple timescales. For longest-chain proof-of-work blockchains like Bitcoin, the most well-studied incentive-based attacks, such as selfish mining [4, 5, 10] and transaction sniping [3], concern miners reasoning strategically over multiple block creation epochs. For example, in selfish mining, a miner relinquishes revenue in the short term to achieve greater revenue (in expectation) in the long run via a type of forking attack. This paper studies incentive issues and potential deviations from intended miner behavior at the most basic time scale, that of a single block creation epoch.
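The proportional rule discussed in the abstract is easy to state in code. The sketch below (illustrative, with made-up miner names, not code from the paper) splits a block reward in proportion to hash counts and checks two of the properties the abstract names: strong budget balance and robustness to a sybil split.

```python
def proportional_allocation(hashes, block_reward=1.0):
    """Split the block reward among miners in proportion to the number
    of hashes each contributed (Bitcoin's proportional rule)."""
    total = sum(hashes.values())
    return {miner: block_reward * h / total for miner, h in hashes.items()}

rewards = proportional_allocation({"alice": 60, "bob": 30, "carol": 10})

# Strong budget balance: the shares always sum to the full block reward.
assert abs(sum(rewards.values()) - 1.0) < 1e-12

# Sybil-proofness: splitting bob's hash power across two identities
# leaves his combined reward unchanged.
split = proportional_allocation({"alice": 60, "bob1": 15, "bob2": 15, "carol": 10})
assert abs(split["bob1"] + split["bob2"] - rewards["bob"]) < 1e-12
```

The paper's axiomatic question is whether properties like these, taken together, single out the proportional rule among all possible allocation rules.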