Calibrating Noise to Variance in Adaptive Data Analysis
Total pages: 16
File type: PDF; size: 1020 KB
Recommended publications
Reproducibility and Pseudo-Determinism in Log-Space
Reproducibility and Pseudo-determinism in Log-Space, by Ofer Grossman, S.B., Massachusetts Institute of Technology (2017). Submitted to the Department of Electrical Engineering and Computer Science in partial fulfillment of the requirements for the degree of Master of Science in Electrical Engineering and Computer Science at the Massachusetts Institute of Technology, May 2020. © Massachusetts Institute of Technology 2020. All rights reserved. Author: Department of Electrical Engineering and Computer Science, May 15, 2020. Certified by Shafi Goldwasser, RSA Professor of Electrical Engineering and Computer Science, Thesis Supervisor. Accepted by Leslie A. Kolodziejski, Professor of Electrical Engineering and Computer Science, Chair, Department Committee on Graduate Students. Abstract: A curious property of randomized log-space search algorithms is that their outputs are often longer than their workspace. This leads to the question: how can we reproduce the results of a randomized log-space computation without storing the output or randomness verbatim? Running the algorithm again with new …
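One way to make the reproducibility goal above concrete, as a toy illustration only and not the thesis's log-space construction, is to derive all of an algorithm's randomness from a short seed, so that re-running it reproduces the same output without storing the output or the random coins verbatim. A minimal Python sketch, with a made-up randomized search routine and seed:

```python
import hashlib
import random

def randomized_search(candidates, rng):
    """Toy randomized 'search' step: return one valid candidate,
    chosen using the supplied source of randomness."""
    valid = [c for c in candidates if c % 3 == 0]   # stand-in validity test
    return rng.choice(valid)

def reproducible_search(candidates, seed_key: bytes):
    """Derive the randomness from a short seed instead of fresh coins.
    Re-running with the same seed reproduces the same output, so the
    (possibly long) output never has to be stored verbatim."""
    digest = hashlib.sha256(seed_key).digest()
    rng = random.Random(digest)          # seeded PRG stands in for stored coins
    return randomized_search(candidates, rng)

if __name__ == "__main__":
    cands = list(range(1, 100))
    first = reproducible_search(cands, b"short-seed")
    again = reproducible_search(cands, b"short-seed")
    assert first == again                # same seed -> same reproducible answer
    print(first, again)
```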
Investigating Statistical Privacy Frameworks from the Perspective of Hypothesis Testing
Investigating Statistical Privacy Frameworks from the Perspective of Hypothesis Testing (p. 234): …able quantity. Only a limited number of previous works [29, 34, 35] have investigated the question of how to select a proper value of ε, but these approaches either require complicated economic models or lack the analysis of adversaries with arbitrary auxiliary information (see Section 2.3 for more details). Our work is inspired by the interpretation of differential privacy via hypothesis testing, initially introduced by Wasserman and Zhou [27, 30, 60]. However, this interpretation has not been systematically investigated before in the context of our research objective, i.e., reasoning about the choice of the privacy parameter ε (see Section 2.4 for more details). We consider hypothesis testing [2, 45, 61] as the tool used by the adversary to infer sensitive information of an individual record (e.g., the presence or absence of a record in the database for unbounded DP) from the outputs of privacy mechanisms. … which can provide a natural and interpretable guideline for selecting proper privacy parameters by system designers and researchers. Furthermore, we extend our analysis on unbounded DP to bounded DP and the approximate (ε, δ)-DP. Impact of Auxiliary Information. The conjecture that auxiliary information can influence the design of DP mechanisms has been made in prior work [8, 28, 32, 33, 39, 65]. We therefore investigate the adversary's capability based on hypothesis testing under three types of auxiliary information: the prior distribution of the input record, the correlation across records, and the correlation across time. Our analysis demonstrates that the auxiliary information indeed influences the appropriate selection of ε. The results suggest that, when possible and available, the practitioners of DP should explicitly incorporate adversary's auxiliary information into the …
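The hypothesis-testing view referenced in this excerpt can be made concrete with the standard trade-off bound for (ε, δ)-DP that follows from the Wasserman–Zhou interpretation and later refinements: any test for "record present vs. absent" with false-positive rate α has false-negative rate β ≥ max(1 − δ − e^ε·α, e^(−ε)(1 − δ − α)). A small Python sketch of that bound, illustrative only and not code from the paper, with made-up parameters:

```python
import math

def min_type2_error(alpha: float, eps: float, delta: float = 0.0) -> float:
    """Lower bound on the adversary's false-negative rate (type II error)
    for distinguishing neighboring databases under (eps, delta)-DP,
    given a false-positive rate alpha (hypothesis-testing view of DP)."""
    b1 = 1.0 - delta - math.exp(eps) * alpha
    b2 = math.exp(-eps) * (1.0 - delta - alpha)
    return max(0.0, b1, b2)

# Example: how much power the adversary can possibly gain as eps grows.
for eps in (0.1, 0.5, 1.0, 2.0):
    beta = min_type2_error(alpha=0.05, eps=eps, delta=1e-6)
    print(f"eps={eps:>4}: type II error >= {beta:.3f}")
```

Smaller ε pins β near 1 − α, so the adversary's test is barely better than guessing; reading this curve is one concrete way to interpret a proposed value of ε.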
2008 Annual Report
2008 Annual Report, National Academy of Engineering: Engineering the Future. Contents: Letter from the President; In Service to the Nation; Mission Statement; Program Reports; Engineering Education; Center for the Advancement of Scholarship on Engineering Education; Technological Literacy; Public Understanding of Engineering; Developing Effective Messages; Media Relations; Public Relations; Grand Challenges for Engineering; Center for Engineering, Ethics, and Society; Diversity in the Engineering Workforce; Engineer Girl! Website; Engineer Your Life Project; Engineering Equity Extension Service; Frontiers of Engineering; Armstrong Endowment for Young Engineers-Gilbreth Lectures; Engineering and Health Care; Technology and Peace Building; Technology for a Quieter America; America's Energy Future; Terrorism and the Electric Power-Delivery System; U.S.-China Cooperation on Electricity from Renewables; U.S.-China Symposium on Science and Technology Strategic Policy; Offshoring of Engineering; Gathering Storm Still Frames the Policy Debate; 2008 NAE Awards Recipients; 2008 New Members and Foreign Associates; 2008 NAE Anniversary Members; 2008 Private Contributions; Einstein Society; Heritage Society; Golden Bridge Society; Catalyst Society; Rosette Society; Challenge Society; Charter Society; Other Individual Donors; The Presidents' Circle; Corporations, Foundations, and Other Organizations; National Academy of Engineering Fund Financial Report; Report of Independent Certified Public Accountants; Notes to Financial Statements; Officers; Councillors; Staff; NAE Publications. Letter from the President: Engineering is critical to meeting the fundamental challenges facing the U.S. economy in the 21st century.
Digital Communication Systems 2.2 Optimal Source Coding
Digital Communication Systems, EES 452. Asst. Prof. Dr. Prapun Suksompong, [email protected]. 2. Source Coding; 2.2 Optimal Source Coding: Huffman Coding: Origin, Recipe, MATLAB Implementation. Examples of Prefix Codes: Nonsingular Fixed-Length Code, Shannon–Fano Code, Huffman Code. Prof. Robert Fano (1917-2016), Shannon Award (1976). Shannon–Fano Code: proposed in Shannon's "A Mathematical Theory of Communication" in 1948; the method was attributed to Fano, who later published it as a technical report: Fano, R.M. (1949). "The transmission of information". Technical Report No. 65. Cambridge (Mass.), USA: Research Laboratory of Electronics at MIT. Should not be confused with Shannon coding, the coding method used to prove Shannon's noiseless coding theorem, or with Shannon–Fano–Elias coding (also known as Elias coding), the precursor to arithmetic coding. Claude E. Shannon Award: Claude E. Shannon (1972), Elwyn R. Berlekamp (1993), Sergio Verdú (2007), David S. Slepian (1974), Aaron D. Wyner (1994), Robert M. Gray (2008), Robert M. Fano (1976), G. David Forney, Jr. (1995), Jorma Rissanen (2009), Peter Elias (1977), Imre Csiszár (1996), Te Sun Han (2010), Mark S. Pinsker (1978), Jacob Ziv (1997), Shlomo Shamai (Shitz) (2011), Jacob Wolfowitz (1979), Neil J. A. Sloane (1998), Abbas El Gamal (2012), W. Wesley Peterson (1981), Tadao Kasami (1999), Katalin Marton (2013), Irving S. Reed (1982), Thomas Kailath (2000), János Körner (2014), Robert G. Gallager (1983), Jack Keil Wolf (2001), Arthur Robert Calderbank (2015), Solomon W. Golomb (1985), Toby Berger (2002), Alexander S. Holevo (2016), William L. Root (1986), Lloyd R. Welch (2003), David Tse (2017), James L. …
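Since these slides cover the Huffman recipe (they mention a MATLAB implementation that is not reproduced here), a compact Python sketch of the textbook algorithm may help; the symbol probabilities below are made up for illustration:

```python
import heapq
from typing import Dict

def huffman_code(probs: Dict[str, float]) -> Dict[str, str]:
    """Build a binary Huffman code for a probability mass function.
    Repeatedly merge the two least probable subtrees, prepending
    '0'/'1' to the codewords inside each merged subtree."""
    # Heap entries: (probability, tie-breaker, {symbol: partial codeword})
    heap = [(p, i, {sym: ""}) for i, (sym, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        p0, _, code0 = heapq.heappop(heap)   # least probable subtree
        p1, _, code1 = heapq.heappop(heap)   # second least probable subtree
        merged = {s: "0" + c for s, c in code0.items()}
        merged.update({s: "1" + c for s, c in code1.items()})
        heapq.heappush(heap, (p0 + p1, counter, merged))
        counter += 1
    return heap[0][2]

if __name__ == "__main__":
    pmf = {"a": 0.45, "b": 0.25, "c": 0.15, "d": 0.10, "e": 0.05}
    code = huffman_code(pmf)
    avg_len = sum(pmf[s] * len(code[s]) for s in pmf)
    print(code, f"average length = {avg_len:.2f} bits/symbol")
```

The resulting code is prefix-free, and Huffman codes are optimal among symbol codes, with average length within one bit of the source entropy.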
FOCS 2005 Program SUNDAY October 23, 2005
FOCS 2005 Program. The 46th Annual IEEE Symposium on Foundations of Computer Science, October 22-25, 2005, Omni William Penn Hotel, Pittsburgh, PA. Sponsored by the IEEE Computer Society Technical Committee on Mathematical Foundations of Computing, in cooperation with ACM SIGACT. FOCS '05 gratefully acknowledges financial support from Microsoft Research, Yahoo! Research, and the CMU Aladdin center. SATURDAY October 22, 2005: Tutorials held at CMU University Center; reception at Omni William Penn Hotel, Monongahela Room, 17th floor. Tutorial 1: 1:30pm – 3:30pm (McConomy Auditorium), Chair: Irit Dinur; Subhash Khot, On the Unique Games Conjecture. Break 3:30pm – 4:00pm. SUNDAY October 23, 2005: Talks in Grand Ballroom, 17th floor. Session 1: 8:50am – 10:10am, Chair: Éva Tardos. 8:50 Agnostically Learning Halfspaces, Adam Kalai, Adam Klivans, Yishay Mansour and Rocco Servedio. 9:10 Noise stability of functions with low influences: invariance and optimality, Elchanan Mossel, Ryan O'Donnell and Krzysztof Oleszkiewicz. 9:30 Every decision tree has an influential variable, Ryan O'Donnell, Michael Saks, Oded Schramm and Rocco Servedio. 9:50 Lower Bounds for the Noisy Broadcast Problem, Navin Goyal, Guy Kindler and Michael Saks. Break 10:10am – 10:30am. Session 2: 10:30am – 12:10pm, Chair: Satish Rao. 10:30 The Unique Games Conjecture, Integrality Gap for Cut Problems and Embeddability of Negative Type Metrics into ℓ1 [Best paper award], Subhash Khot and Nisheeth Vishnoi. 10:50 The Closest Substring problem with small distances, Daniel Marx. 11:10 Fitting tree metrics: Hierarchical clustering and Phylogeny, Nir Ailon and Moses Charikar. 11:30 Metric Embeddings with Relaxed Guarantees, Ittai Abraham, Yair Bartal, T-H. …
An Economic Analysis of Privacy Protection and Statistical Accuracy As Social Choices
An Economic Analysis of Privacy Protection and Statistical Accuracy as Social Choices. John M. Abowd and Ian M. Schmutte. August 15, 2018. Forthcoming in American Economic Review. Abowd: U.S. Census Bureau HQ 8H120, 4600 Silver Hill Rd., Washington, DC 20233, and Cornell University (email: [email protected]); Schmutte: Department of Economics, University of Georgia, B408 Amos Hall, Athens, GA 30602 (email: [email protected]). Abowd and Schmutte acknowledge the support of Alfred P. Sloan Foundation Grant G-2015-13903 and NSF Grant SES-1131848. Abowd acknowledges direct support from NSF Grants BCS-0941226, TC-1012593. Any opinions and conclusions are those of the authors and do not represent the views of the Census Bureau, NSF, or the Sloan Foundation. We thank the Center for Labor Economics at UC–Berkeley and Isaac Newton Institute for Mathematical Sciences, Cambridge (EPSRC grant no. EP/K032208/1) for support and hospitality. We are extremely grateful for very valuable comments and guidance from the editor, Pinelopi Goldberg, and six anonymous referees. We acknowledge helpful comments from Robin Bachman, Nick Bloom, Larry Blume, David Card, Michael Castro, Jennifer Childs, Melissa Creech, Cynthia Dwork, Casey Eggleston, John Eltinge, Stephen Fienberg, Mark Kutzbach, Ron Jarmin, Christa Jones, Dan Kifer, Ashwin Machanavajjhala, Frank McSherry, Gerome Miklau, Kobbi Nissim, Paul Oyer, Mallesh Pai, Jerry Reiter, Eric Slud, Adam Smith, Bruce Spencer, Sara Sullivan, Salil Vadhan, Lars Vilhuber, Glen Weyl, and Nellie Zhao along with seminar and conference participants at the U.S. Census Bureau, Cornell, CREST, George Mason, Georgetown, Microsoft Research–NYC, University of Washington Evans School, and SOLE.
Calibrating Noise to Sensitivity in Private Data Analysis
Calibrating Noise to Sensitivity in Private Data Analysis. Cynthia Dwork, Frank McSherry (Microsoft Research, Silicon Valley; {dwork,mcsherry}@microsoft.com), Kobbi Nissim (Ben-Gurion University; [email protected]), and Adam Smith (Weizmann Institute of Science; [email protected]). Abstract. We continue a line of research initiated in [10, 11] on privacy-preserving statistical databases. Consider a trusted server that holds a database of sensitive information. Given a query function f mapping databases to reals, the so-called true answer is the result of applying f to the database. To protect privacy, the true answer is perturbed by the addition of random noise generated according to a carefully chosen distribution, and this response, the true answer plus noise, is returned to the user. Previous work focused on the case of noisy sums, in which f = Σ_i g(x_i), where x_i denotes the i-th row of the database and g maps database rows to [0, 1]. We extend the study to general functions f, proving that privacy can be preserved by calibrating the standard deviation of the noise according to the sensitivity of the function f. Roughly speaking, this is the amount that any single argument to f can change its output. The new analysis shows that for several particular applications substantially less noise is needed than was previously understood to be the case. The first step is a very clean characterization of privacy in terms of indistinguishability of transcripts. Additionally, we obtain separation results showing the increased value of interactive sanitization mechanisms over non-interactive.
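The abstract's prescription, calibrating the noise scale to the sensitivity of f and the privacy parameter ε, is the Laplace mechanism; a minimal sketch follows, where the counting query, database, and parameters are made up for illustration:

```python
import numpy as np

def laplace_mechanism(true_answer: float, sensitivity: float, eps: float,
                      rng: np.random.Generator) -> float:
    """Return the true answer plus Laplace noise of scale sensitivity/eps.
    For a query f with global sensitivity `sensitivity`, this release
    satisfies eps-differential privacy."""
    scale = sensitivity / eps
    return true_answer + rng.laplace(loc=0.0, scale=scale)

rng = np.random.default_rng(0)
database = np.array([0, 1, 1, 0, 1, 1, 1, 0, 1, 0])   # toy 0/1 rows
count = float(database.sum())          # counting query: sensitivity 1
noisy = laplace_mechanism(count, sensitivity=1.0, eps=0.5, rng=rng)
print(f"true count = {count}, noisy release = {noisy:.2f}")
```

For a noisy sum f = Σ_i g(x_i) with g mapping rows to [0, 1], the sensitivity is 1, which recovers the noisy-sums setting the abstract describes as prior work.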
Magic Adversaries Versus Individual Reduction: Science Wins Either Way?
Magic Adversaries Versus Individual Reduction: Science Wins Either Way? Yi Deng (SKLOIS, Institute of Information Engineering, CAS, Beijing, P.R. China; State Key Laboratory of Cryptology, P.O. Box 5159, Beijing, 100878, China; [email protected]). Abstract. We prove that, assuming there exists an injective one-way function f, at least one of the following statements is true: – (Infinitely-often) Non-uniform public-key encryption and key agreement exist; – The Feige-Shamir protocol instantiated with f is distributional concurrent zero knowledge for a large class of distributions over any OR NP-relations with small distinguishability gap. The questions of whether we can achieve these goals are known to be subject to black-box limitations. Our win-win result also establishes an unexpected connection between the complexity of public-key encryption and the round-complexity of concurrent zero knowledge. As the main technical contribution, we introduce a dissection procedure for concurrent adversaries, which enables us to transform a magic concurrent adversary that breaks the distributional concurrent zero knowledge of the Feige-Shamir protocol into non-black-box constructions of (infinitely-often) public-key encryption and key agreement. This dissection of complex algorithms gives insight into the fundamental gap between the known universal security reductions/simulations, in which a single reduction algorithm or simulator works for all adversaries, and the natural security definitions (that are sufficient for almost all cryptographic primitives/protocols), which switch the order of qualifiers and only require that for every adversary there exists an individual reduction or simulator. 1 Introduction. The seminal work of Impagliazzo and Rudich [IR89] provides a methodology for studying the limitations of black-box reductions.
Secure Multi-Party Computation in Practice
SECURE MULTI-PARTY COMPUTATION IN PRACTICE Marcella Christine Hastings A DISSERTATION in Computer and Information Science Presented to the Faculties of the University of Pennsylvania in Partial Fulfillment of the Requirements for the Degree of Doctor of Philosophy 2021 Supervisor of Dissertation Nadia Heninger Adjunct Associate Professor, University of Pennsylvania Associate Professor, University of California, San Diego Graduate Group Chairperson Mayur Naik Professor and Computer and Information Science Graduate Group Chair Dissertation Committee Brett Hemenway Falk, Research Assistant Professor Stephan A. Zdancewic, Professor Sebastian Angel, Raj and Neera Singh Term Assistant Professor abhi shelat, Associate Professor at Khoury College of Computer Sciences, Northeastern University ACKNOWLEDGMENT This dissertation would have been much less pleasant to produce without the presence of many people in my life. I would like to thank my advisor and my dissertation committee for their helpful advice, direction, and long-distance phone calls over the past six years. I would like to thank my fellow PhD students at the University of Pennsylvania, especially the ever-changing but consistently lovely office mates in the Distributed Systems Laboratory and my cohort. Our shared tea-time, cookies, disappointments, and achievements provided an essential community that brought me great joy during my time at Penn. I would like to thank the mentors and colleagues who hosted me at the Security and Privacy Lab at the University of Washington in 2018, the Software & Application Innovation Lab at Boston University in 2019, and the Cryptography and Privacy Research group at Microsoft Research in 2020. My career and happiness greatly benefited from spending these summers exploring fresh research directions and rediscovering the world outside my own work. -
A Decade of Lattice Cryptography
Full text available at: http://dx.doi.org/10.1561/0400000074. A Decade of Lattice Cryptography. Chris Peikert, Computer Science and Engineering, University of Michigan, United States. Boston – Delft. Foundations and Trends® in Theoretical Computer Science. Published, sold and distributed by: now Publishers Inc., PO Box 1024, Hanover, MA 02339, United States. Tel. +1-781-985-4510. www.nowpublishers.com. [email protected]. Outside North America: now Publishers Inc., PO Box 179, 2600 AD Delft, The Netherlands. Tel. +31-6-51115274. The preferred citation for this publication is C. Peikert. A Decade of Lattice Cryptography. Foundations and Trends® in Theoretical Computer Science, vol. 10, no. 4, pp. 283–424, 2014. This Foundations and Trends® issue was typeset in LaTeX using a class file designed by Neal Parikh. Printed on acid-free paper. ISBN: 978-1-68083-113-9. © 2016 C. Peikert. All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, mechanical, photocopying, recording or otherwise, without prior written permission of the publishers. Photocopying. In the USA: This journal is registered at the Copyright Clearance Center, Inc., 222 Rosewood Drive, Danvers, MA 01923. Authorization to photocopy items for internal or personal use, or the internal or personal use of specific clients, is granted by now Publishers Inc for users registered with the Copyright Clearance Center (CCC). The ‘services’ for users can be found on the internet at: www.copyright.com. For those organizations that have been granted a photocopy license, a separate system of payment has been arranged.
Hard Communication Channels for Steganography
Hard Communication Channels for Steganography. Sebastian Berndt (University of Lübeck, Lübeck, Germany; [email protected]) and Maciej Liśkiewicz (University of Lübeck, Lübeck, Germany; [email protected]). Abstract. This paper considers steganography – the concept of hiding the presence of secret messages in legal communications – in the computational setting and its relation to cryptography. Very recently the first (non-polynomial time) steganographic protocol has been shown which, for any communication channel, is provably secure, reliable, and has nearly optimal bandwidth. The security is unconditional, i.e. it does not rely on any unproven complexity-theoretic assumption. This disproves the claim that the existence of one-way functions and access to a communication channel oracle are both necessary and sufficient conditions for the existence of secure steganography in the sense that secure and reliable steganography exists independently of the existence of one-way functions. In this paper, we prove that this equivalence also does not hold in the more realistic setting, where the stegosystem is polynomial time bounded. We prove this by constructing (a) a channel for which secure steganography exists if and only if one-way functions exist and (b) another channel such that secure steganography implies that no one-way functions exist. We therefore show that security-preserving reductions between cryptography and steganography need to be treated very carefully. 1998 ACM Subject Classification E.3 Data Encryption. Keywords and phrases: provable secure steganography, cryptographic assumptions, pseudorandom functions, one-way functions, signature schemes. Digital Object Identifier 10.4230/LIPIcs.ISAAC.2016.16. 1 Introduction. Digital steganography has recently received substantial interest in modern computer science since it allows secret communication without revealing its presence.
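A standard way to see how a channel oracle and a pseudorandom function interact in stegosystems of this kind is the classical rejection-sampling construction; this is a generic textbook sketch, not necessarily the construction analyzed in the paper, and the cover messages and keyed-hash stand-in for the PRF are made up:

```python
import hashlib
import random
from typing import Callable, List

def prf_bit(key: bytes, document: str) -> int:
    """Stand-in for a pseudorandom function mapping documents to one bit."""
    return hashlib.sha256(key + document.encode()).digest()[0] & 1

def embed_bit(bit: int, key: bytes, channel: Callable[[], str],
              max_tries: int = 64) -> str:
    """Rejection sampling: draw documents from the channel oracle until
    one hashes to the desired bit (or give up after max_tries)."""
    doc = channel()
    for _ in range(max_tries):
        if prf_bit(key, doc) == bit:
            break
        doc = channel()
    return doc

def extract_bit(doc: str, key: bytes) -> int:
    return prf_bit(key, doc)

# Toy channel oracle: innocuous-looking cover messages drawn at random.
covers = ["see you at noon", "lunch tomorrow?", "nice weather today",
          "meeting moved to 3pm", "don't forget the report",
          "running five minutes late", "coffee after the talk?"]
channel = lambda: random.choice(covers)

key = b"shared-secret-key"
message_bits: List[int] = [1, 0, 1, 1]
stegotext = [embed_bit(b, key, channel) for b in message_bits]
recovered = [extract_bit(d, key) for d in stegotext]
print(stegotext, recovered)   # matches message_bits with high probability
```

In the classical analysis, security rests on the pseudorandomness of the PRF and the min-entropy of the channel; the paper's point is that the relationship between such reductions and one-way functions depends on the channel itself.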
Typical Stability
Typical Stability. Raef Bassily and Yoav Freund. Abstract. In this paper, we introduce a notion of algorithmic stability called typical stability. When our goal is to release real-valued queries (statistics) computed over a dataset, this notion does not require the queries to be of bounded sensitivity – a condition that is generally assumed under differential privacy [DMNS06, Dwo06] when used as a notion of algorithmic stability [DFH+15b, DFH+15c, BNS+16] – nor does it require the samples in the dataset to be independent – a condition that is usually assumed when generalization-error guarantees are sought. Instead, typical stability requires the output of the query, when computed on a dataset drawn from the underlying distribution, to be concentrated around its expected value with respect to that distribution. Typical stability can also be motivated as an alternative definition for database privacy. Like differential privacy, this notion enjoys several important properties including robustness to post-processing and adaptive composition. However, privacy is guaranteed only for a given family of distributions over the dataset. We also discuss the implications of typical stability on the generalization error (i.e., the difference between the value of the query computed on the dataset and the expected value of the query with respect to the true data distribution). We show that typical stability can control generalization error in adaptive data analysis even when the samples in the dataset are not necessarily independent and when queries to be computed are not necessarily of bounded sensitivity as long as the results of the queries over the dataset (i.e., the computed statistics) follow a distribution with a "light" tail.
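The concentration requirement in this abstract can be pictured with a small Monte Carlo check: estimate how often a query computed on a freshly drawn dataset lands far from its expectation under the data distribution. This is a toy illustration of the property, not the paper's formal definition; the data distribution, query, and thresholds below are made up, and the example deliberately uses correlated, unbounded data:

```python
import numpy as np

def typicality_check(query, sample_dataset, n_trials: int = 20000,
                     tolerance: float = 0.05) -> float:
    """Estimate Pr[|q(S) - E[q(S)]| > tolerance] over datasets S drawn
    from the underlying distribution; 'typical' behaviour means this
    tail probability is small."""
    values = np.array([query(sample_dataset()) for _ in range(n_trials)])
    expectation = values.mean()          # Monte Carlo stand-in for E[q(S)]
    return float(np.mean(np.abs(values - expectation) > tolerance))

rng = np.random.default_rng(1)
n = 500

def sample_dataset():
    # Correlated (non-i.i.d.) samples: a shared shift plus per-row noise.
    shift = rng.normal(0.0, 0.02)
    return rng.normal(0.3 + shift, 0.1, size=n)

mean_query = lambda data: float(np.mean(data))   # mean of unbounded Gaussian data: not a bounded-sensitivity query
tail = typicality_check(mean_query, sample_dataset)
print(f"estimated tail probability: {tail:.4f}")
```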