How Data Privacy Engineering Will Prevent Future Data Oil Spills
Recommended publications
Data Privacy: De-Identification Techniques
DEVELOPING AND CONNECTING ISSA CYBERSECURITY LEADERS GLOBALLY. Data Privacy: De-Identification Techniques, by Ulf Mattsson – ISSA member, New York Chapter. This article discusses emerging data privacy techniques, standards, and examples of applications implementing different use cases of de-identification techniques. We will discuss different attack scenarios and practical balances between privacy requirements and operational requirements. Abstract: The data privacy landscape is changing. There is a need for privacy models in the current landscape of increasing numbers of privacy regulations and privacy breaches. Privacy methods use models, and it is important to have a common language when defining privacy rules. This article will discuss practical recommendations to find the right balance between compliance, security, privacy, and operational requirements for each type of data and business use case. [Figure 1 – Forrester’s global map of privacy rights and regulations [4]] Sensitive data can be exposed to internal users, partners, and attackers. Different data protection techniques can provide a balance between protection and transparency to business processes. The requirements are different for systems that are operational, analytical, or test/development, as illustrated by some practical examples in this article. We will discuss different aspects of various data privacy techniques, including data truthfulness and applicability to different… The California Consumer Privacy Act (CCPA) is a wake-up call, addressing identification of individuals via data inference through a broader range of PII attributes. CCPA defines personal information as information that identifies, relates to, describes, is reasonably capable of being associated with, or could reasonably be linked, directly or indirectly, with a particular consumer or household, such as a real name, alias, postal address, and unique personal identifier [1].
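To make the balance between protection and transparency concrete, here is a minimal illustrative sketch of common de-identification operations (suppression, masking, and generalization) applied to a single record. It is not taken from Mattsson's article; the field names and rules are hypothetical assumptions.

```python
# Minimal de-identification sketch: suppression, masking, and generalization.
# Field names and rules are illustrative assumptions, not taken from the article.

def deidentify(record: dict) -> dict:
    """Return a de-identified copy of a single customer record."""
    out = dict(record)
    # Suppression: drop direct identifiers entirely.
    for field in ("name", "alias", "email"):
        out.pop(field, None)
    # Masking: keep only the last four digits of the account number.
    if "account_number" in out:
        out["account_number"] = "*" * 12 + out["account_number"][-4:]
    # Generalization: truncate the postal code to a coarser region.
    if "zip_code" in out:
        out["zip_code"] = out["zip_code"][:3] + "XX"
    # Generalization: replace exact age with a ten-year band.
    if "age" in out:
        band = (out["age"] // 10) * 10
        out["age"] = f"{band}-{band + 9}"
    return out

if __name__ == "__main__":
    raw = {"name": "Jane Doe", "account_number": "4111222233334444",
           "zip_code": "10027", "age": 34, "purchase_total": 152.40}
    print(deidentify(raw))
    # Direct identifiers are gone; quasi-identifiers are coarsened; analytics fields remain.
```

Which fields to suppress versus generalize depends on the use case (operational, analytical, or test/development), which is exactly the trade-off the article describes.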
Leveraging GDPR to Become a Trusted Data Steward
The Boston Consulting Group (BCG) is a global management consulting firm and the world’s leading advisor on business strategy. We partner with clients from the private, public, and not-for-profit sectors in all regions to identify their highest-value opportunities, address their most critical challenges, and transform their enterprises. Our customized approach combines deep insight into the dynamics of companies and markets with close collaboration at all levels of the client organization. This ensures that our clients achieve sustainable competitive advantage, build more capable organizations, and secure lasting results. Founded in 1963, BCG is a private company with offices in more than 90 cities in 50 countries. For more information, please visit bcg.com. DLA Piper is a global law firm with lawyers located in more than 40 countries throughout the Americas, Europe, the Middle East, Africa and Asia Pacific, positioning us to help clients with their legal needs around the world. We strive to be the leading global business law firm by delivering quality and value to our clients. We achieve this through practical and innovative legal solutions that help our clients succeed. We deliver consistent services across our platform of practices and sectors in all matters we undertake. Our clients range from multinational, Global 1000, and Fortune 500 enterprises to emerging companies developing industry-leading technologies. They include more than half of the Fortune 250 and nearly half of the FTSE 350 or their subsidiaries. We also advise governments and public sector bodies. Leveraging GDPR to Become a Trusted Data Steward. Patrick Van Eecke, Ross McKean, Denise Lebeau-Marianna, Jeanne Dauzier: DLA Piper. Elias Baltassis, John Rose, Antoine Gourevitch, Alexander Lawrence: BCG. March 2018. AT A GLANCE: The European Union’s new General Data Protection Regulation, which aims to strengthen protections for consumers’ data privacy, creates an opportunity for companies to establish themselves as trusted stewards of consumer data.
National Privacy Research Strategy
NATIONAL PRIVACY RESEARCH STRATEGY. National Science and Technology Council, Networking and Information Technology Research and Development Program, June 2016. About the National Science and Technology Council: The National Science and Technology Council (NSTC) is the principal means by which the Executive Branch coordinates science and technology policy across the diverse entities that make up the Federal research and development (R&D) enterprise. One of the NSTC’s primary objectives is establishing clear national goals for Federal science and technology investments. The NSTC prepares R&D packages aimed at accomplishing multiple national goals. The NSTC’s work is organized under five committees: Environment, Natural Resources, and Sustainability; Homeland and National Security; Science, Technology, Engineering, and Mathematics (STEM) Education; Science; and Technology. Each of these committees oversees subcommittees and working groups that are focused on different aspects of science and technology. More information is available at www.whitehouse.gov/ostp/nstc. About the Office of Science and Technology Policy: The Office of Science and Technology Policy (OSTP) was established by the National Science and Technology Policy, Organization, and Priorities Act of 1976. OSTP’s responsibilities include advising the President in policy formulation and budget development on questions in which science and technology are important elements; articulating the President’s science and technology policy and programs; and fostering strong partnerships among Federal, state, and local governments, and the scientific communities in industry and academia. The Director of OSTP also serves as Assistant to the President for Science and Technology and manages the NSTC. More information is available at www.whitehouse.gov/ostp.
Why Pseudonyms Don't Anonymize
Why Pseudonyms Don’t Anonymize: A Computational Re-identification Analysis of Genomic Data Privacy Protection Systems. Bradley Malin, Data Privacy Laboratory, School of Computer Science, Carnegie Mellon University, Pittsburgh, Pennsylvania 15213-3890 USA. Abstract. Objectives: In order to supply patient-derived genomic data for medical research purposes, care must be taken to protect the identities of the patients from unwanted intrusions. Recently, several protection techniques that employ the use of trusted third parties (TTPs), and other identity-protecting schemas, have been proposed and deployed. The goal of this paper is to analyze the susceptibility of genomic data privacy protection methods based on such schemas to known re-identification attacks. Methods: Though advocates of data de-identification and pseudonymization have the best of intentions, oftentimes these methods fail to preserve anonymity. Methods that rely solely on TTPs do not guard against various re-identification methods, which can link apparently anonymous, or pseudonymous, genomic data to named individuals. The re-identification attacks we analyze exploit various types of information leaked by genomic data that can be used to infer identifying features. These attacks include genotype-phenotype inference attacks, trail attacks, which exploit location visit patterns of patients in a distributed environment, and family-structure-based attacks. Results: While susceptibility varies, we find that each of the protection methods studied is deficient in its protection against re-identification. In certain instances, the protection schema itself, such as singly-encrypted pseudonymization, can be leveraged to compromise privacy even further than simple de-identification permits. In order to facilitate the future development of privacy protection methods, we provide a susceptibility comparison of the methods.
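As an illustration of the trail attacks mentioned in the abstract, the following sketch shows how a pseudonymous record can be linked back to a named individual when both leave a matching "trail" of institutions visited. The data, names, and matching rule are fabricated for illustration and are not drawn from Malin's analysis.

```python
# Illustrative sketch of a "trail" re-identification attack: pseudonymous DNA
# records and named hospital records each leave a trail of institutions visited;
# when a pseudonym's trail is uniquely consistent with one name's trail, the two
# can be linked. Data and names are fabricated for illustration only.

named_trails = {            # name -> hospitals where the person was seen by name
    "alice": {"H1", "H2", "H3"},
    "bob":   {"H1", "H3"},
    "carol": {"H2", "H4"},
}
pseudonym_trails = {        # pseudonym -> hospitals that released the DNA record
    "p-17": {"H1", "H2", "H3"},
    "p-42": {"H2", "H4"},
}

def reidentify(pseudonym_trails, named_trails):
    links = {}
    for pseudo, ptrail in pseudonym_trails.items():
        # A named person is a candidate if their trail covers the pseudonym's trail.
        candidates = [name for name, ntrail in named_trails.items()
                      if ptrail <= ntrail]
        if len(candidates) == 1:          # a unique match re-identifies the pseudonym
            links[pseudo] = candidates[0]
    return links

print(reidentify(pseudonym_trails, named_trails))   # {'p-17': 'alice', 'p-42': 'carol'}
```

The point mirrors the paper's argument: removing or replacing names does nothing about the auxiliary information (here, visit locations) that the released data still leaks.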
Contribution to Study Period on Privacy Engineering Framework
PReparing Industry to Privacy-by-design by supporting its Application in REsearch (PRIPARE). Contribution to Study Period on Privacy Engineering Framework. Project: PRIPARE; Project Number: ICT-6106; Version: v1.0; Date: 06/08/2015; Confidentiality: Public; Part of the Seventh Framework Programme, funded by the EC - DG CNECT. Authors: Antonio Kung, Christophe Jouvray (Trialog); Nicolas Notario, Alberto Crespo (Atos); Samuel Martin, José del Álamo (UPM); Carmela Troncoso (Gradiant). Table of contents: Summary; List of Figures; Abbreviations and Definitions; 1 Introduction; 2 Privacy Framework versus Privacy Engineering Framework; 2.1 About Frameworks; 2.2 Positioning Privacy Engineering in Organisations; 2.3 Why a Privacy Engineering Framework?; 2.3.1 Need for Convergence of Terms.
Opening & Welcoming Remarks
Tuesday, April 10 – Wednesday, April 11, 2018, Washington, DC. Opening & Welcoming Remarks. Speaker 1: Joe Bhatia, President and CEO, ANSI. Welcoming Remarks from ANSI. Joe Bhatia has been president and CEO of the American National Standards Institute (ANSI) since January 2006. He previously served as executive vice president and COO of the international group at Underwriters Laboratories (UL). Mr. Bhatia serves as vice chairman of the Industry Trade Advisory Committee on Standards and Technical Trade Barriers (ITAC 16), a joint program of the U.S. Department of Commerce and the U.S. Trade Representative. He is a member of the International Organization for Standardization (ISO) Council and its Council Standing Committee on Finance, and holds a seat on the Oakton Community College Education Foundation Board. In 2017 he concluded his term as president of the Pan American Standards Commission (COPANT), where he also served as vice president for four years. Speaker 2: Christoph Winterhalter, Chairman of the Executive Board of DIN. Welcoming Remarks from DIN. After studying computer science at the University of Karlsruhe, Winterhalter started his career at ABB. After assignments in Norway, the USA, and Germany, he took over the business units for robot automation and robotics products. In 2010 he became director of the German Research Center of ABB, until he was promoted to global Product Group manager heading ABB’s global Machinery Controls and Automation business, and later Hub Business Manager Control Technologies. Since July 2016 he has been Chairman of the Executive Board of DIN. Speaker 3: Thomas Sentko, Standards Manager, International, DKE. Welcoming remarks from DKE. Thomas studied electrical engineering/telecommunications at the University of Applied Sciences Darmstadt and graduated with the degree Dipl.-Ing.
Privacy Engineering for Social Networks
UCAM-CL-TR-825, Technical Report, ISSN 1476-2986, Number 825. Computer Laboratory. Privacy engineering for social networks. Jonathan Anderson. December 2012. 15 JJ Thomson Avenue, Cambridge CB3 0FD, United Kingdom, phone +44 1223 763500, http://www.cl.cam.ac.uk/. © 2012 Jonathan Anderson. This technical report is based on a dissertation submitted July 2012 by the author for the degree of Doctor of Philosophy to the University of Cambridge, Trinity College. Technical reports published by the University of Cambridge Computer Laboratory are freely available via the Internet: http://www.cl.cam.ac.uk/techreports/. In this dissertation, I enumerate several privacy problems in online social networks (OSNs) and describe a system called Footlights that addresses them. Footlights is a platform for distributed social applications that allows users to control the sharing of private information. It is designed to compete with the performance of today’s centralised OSNs, but it does not trust centralised infrastructure to enforce security properties. Based on several socio-technical scenarios, I extract concrete technical problems to be solved and show how the existing research literature does not solve them. Addressing these problems fully would fundamentally change users’ interactions with OSNs, providing real control over online sharing. I also demonstrate that today’s OSNs do not provide this control: both user data and the social graph are vulnerable to practical privacy attacks. Footlights’ storage substrate provides private, scalable, sharable storage using untrusted servers. Under realistic assumptions, the direct cost of operating this storage system is less than one US dollar per user-year.
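The following sketch illustrates the general pattern behind storing private data on untrusted servers: encrypt on the client, address blocks by the hash of their ciphertext, and share by handing over the block identifier and key. This is an assumption-laden toy, not the Footlights implementation; it relies on the third-party Python cryptography package and an in-memory dictionary standing in for the remote server.

```python
# Minimal sketch (not the Footlights implementation) of client-side encryption
# over untrusted storage: the server only ever sees ciphertext named by its hash.
import hashlib
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

untrusted_server = {}   # stands in for remote storage: block_id -> ciphertext

def put_block(plaintext: bytes) -> tuple[str, bytes, bytes]:
    """Encrypt a block client-side and upload it; return (block_id, key, nonce)."""
    key = AESGCM.generate_key(bit_length=256)
    nonce = os.urandom(12)
    ciphertext = AESGCM(key).encrypt(nonce, plaintext, None)
    block_id = hashlib.sha256(ciphertext).hexdigest()   # content-addressed name
    untrusted_server[block_id] = ciphertext             # server stores ciphertext only
    return block_id, key, nonce

def get_block(block_id: str, key: bytes, nonce: bytes) -> bytes:
    """Fetch a block and decrypt it locally; sharing means sharing (id, key, nonce)."""
    return AESGCM(key).decrypt(nonce, untrusted_server[block_id], None)

bid, k, n = put_block(b"private photo bytes")
assert get_block(bid, k, n) == b"private photo bytes"
```

Because confidentiality and integrity come from client-held keys rather than server policy, the storage provider does not need to be trusted, which is the property the dissertation's cost argument depends on.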
Predictability for Privacy in Data Driven Government
Minnesota Journal of Law, Science & Technology, Volume 20, Issue 1, Article 3, 1-6-2019. Predictability for Privacy in Data Driven Government. Jordan M. Blanke, Mercer University; Janine S. Hiller, Virginia Tech. Recommended Citation: Jordan Blanke & Janine Hiller, Predictability for Privacy in Data Driven Government, 20 MINN. J.L. SCI. & TECH. 32 (2018). Available at: https://scholarship.law.umn.edu/mjlst/vol20/iss1/3. The Minnesota Journal of Law, Science & Technology is published by the University of Minnesota Libraries Publishing. Abstract: The Deferred Action for Childhood Arrivals program (DACA) required individuals to provide a great deal of personal information in order to participate and remain in the United States legally; could information in the same system now be used for deportations? More broadly, how should systems of data that are created legitimately by United States agencies and compiled for one reason be used for other reasons? The increasing emphasis on “smart cities” that use data to efficiently provide and plan for service delivery will require the integration of data from multiple government and non-government sources, in ways that citizens may not expect. There are increasing calls for the federal government to open up and share the data collected for one reason for use in additional, unrelated ways, and to combine that data with data collected by commercial, private entities.
Curriculum Vitae
Daniel Smullen Curriculum Vitae “Don’t have good ideas if you aren’t willing to be responsible for them.” —Alan Perlis About Me I solve socio-technical problems using interdisciplinary research methods. I want to help the world to develop more usable, secure, privacy-preserving, trustworthy software. Education 2021 Doctor of Philosophy (Software Engineering), Carnegie Mellon University School of Computer Science, Pittsburgh. Institute for Software Research, Committee: Norman Sadeh (Chair), Lorrie Faith Cranor, Alessandro Acquisti, Rebecca Weiss (External, Mozilla), Yaxing Yao (External, UMBC) + My research is focused on Usable Privacy and Security, incorporating qualitative and quantitative (mixed-methods) methodologies seen in behavioral economics, user-centered design, requirements engineering, machine learning, and empirical software engineering. + My thesis investigates a broad cross section of privacy and security decisions in browsers and mobile apps; systematically assessing their effectiveness and manageability, exploring standardization, discussing public policy issues, and generalizability to other domains (e.g., Internet of Things). + My work demonstrates that when the settings are well-aligned with people’s mental models, machine learning can leverage the predictive power in models of more complex settings to help people manage their preferences more easily – this can effectively mitigate trade-offs between accuracy and increased user burden as settings proliferate. 2018 Master of Science (Software Engineering), Carnegie Mellon University, Pittsburgh, Institute for Software Research. 2014 Bachelor of Engineering (Honours, Software Engineering), Ontario Tech, Formerly: University of Ontario Institute of Technology, Oshawa, With Distinction. Academic Work Experience 2014 – Present PhD Candidate, Carnegie Mellon University, Pittsburgh, Institute for Software Research. 2017 – 2021 Research Advisor, Carnegie Mellon University, Pittsburgh, Institute for Software Research.
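A toy sketch of the idea in the thesis summary above: a classifier trained on a user's answered privacy decisions predicts their likely choices for unanswered settings, reducing user burden. The encoding, features, and model here are hypothetical assumptions, not the thesis's actual models or data; scikit-learn is assumed to be available.

```python
# Toy sketch (not the thesis's models or data): predict a user's remaining
# privacy decisions from the few they have already answered.
# Requires scikit-learn; the integer encodings below are fabricated.
from sklearn.tree import DecisionTreeClassifier

# Each row: [app_category, permission_type] as hypothetical integer codes.
answered_settings = [[0, 0], [0, 1], [1, 0], [1, 1], [2, 0]]
answered_decisions = [1, 0, 1, 0, 1]          # 1 = allow, 0 = deny

model = DecisionTreeClassifier(max_depth=2).fit(answered_settings, answered_decisions)

# Recommend settings the user has not been asked about yet.
unanswered = [[2, 1], [3, 0]]
print(model.predict(unanswered))              # predicted allow/deny recommendations
```

The accuracy-versus-burden trade-off the CV mentions shows up here directly: the more answered settings the model sees, the better its recommendations, but the more prompts the user has had to endure.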
What to Know About the General Data Protection Regulation (GDPR)? FAQ
What to know about the General Data Protection Regulation (GDPR)? FAQ. www.bitkom.org. Imprint. Publisher: Bitkom e. V., Federal Association for Information Technology, Telecommunications and New Media, Albrechtstraße 10 | 10117 Berlin. Contact: Susanne Dehmel | Member of the Executive Board responsible for security and trust, T +49 30 27576-223. Graphics & Layout: Sabrina Flemming | Bitkom. Cover image: © artjazz – fotolia.com. Copyright Bitkom 2016. This publication constitutes general, non-binding information. The contents represent the view of Bitkom at the time of publication. While great care is taken in preparing this information, no guarantee can be provided as to its accuracy, completeness, and/or currency. In particular, this publication does not take into consideration the specific circumstances of individual cases. The reader is therefore personally responsible for its use. Any liability is excluded. Table of Contents: Introduction; When does the GDPR enter into force?; How does the GDPR apply in Germany?; Which rules remain largely unchanged?; Which are the new elements and most important changes?; Which processes and documents do I have to review?; What are the costs and effort I should expect?; Which company departments should be informed about the changes?; Who provides guidance on interpretation?. FAQ Introduction: The extensive provisions of the General Data Protection Regulation (GDPR) present small and medium-sized enterprises (SMEs) with initial difficulties.
Schrems II Compliant Supplementary Measures
Schrems II Compliant Supplementary Measures. 11 November 2020. © 2020 Anonos, www.anonos.com. Summary: The attached documents serve as evidence of the feasibility and practicability of the proposed measures enumerated in the EDPB’s Recommendations 01/2020 on Measures That Supplement Transfer Tools to Ensure Compliance With the EU Level of Protection of Personal Data; the first document, prepared by Anonos, describes several use cases, and the others include independent audits, reviews, and certifications of Anonos state-of-the-art technology that leverages European Union Agency for Cybersecurity (ENISA) recommendations for GDPR-compliant Pseudonymisation to enable EDPB-proposed measures. Table of Contents: 1. Maximizing Data Liquidity - Reconciling Data Utility & Protection; 2. IDC Report - Embedding Privacy and Trust Into Data Analytics Through Pseudonymisation; 3. Data Scientist Expert Opinion on Variant Twins and Machine Learning; 4. Data Scientist Expert Opinion on BigPrivacy. Anonos dynamic de-identification, pseudonymization, and anonymization systems, methods, and devices are protected by an intellectual property portfolio that includes, but is not limited to: Patent Numbers: CA 2,975,441 (2020); EU 3,063,691 (2020); US 10,572,684 (2020); CA 2,929,269 (2019); US 10,043,035 (2018); US 9,619,669 (2017); US 9,361,481 (2016); US 9,129,133 (2015); US 9,087,216 (2015); and US 9,087,215 (2015); including 70 domestic and international patents filed. Anonos, BigPrivacy, Dynamic De-Identifier, and Variant Twin are trademarks of Anonos Inc. protected by federal and international statutes and treaties. © 2020 Anonos Inc. All Rights Reserved. MAXIMIZING DATA LIQUIDITY - RECONCILING DATA UTILITY & PROTECTION, August 2020, www.anonos.com. Executive Summary: Eight years of research and development have created a solution that optimizes both data protection and data use to maximize data liquidity lawfully & ethically.
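As a hedged illustration of the kind of GDPR-oriented pseudonymisation the document refers to (keyed pseudonyms with the secret held separately from the transferred data), here is a minimal sketch. It is not Anonos's patented technology; the record fields, key handling, and pseudonym length are assumptions for demonstration only.

```python
# Minimal sketch of keyed pseudonymisation in the spirit of ENISA's guidance:
# replace direct identifiers with HMAC-based pseudonyms whose secret is held
# separately from the transferred data. Not Anonos's technology; illustrative only.
import hashlib
import hmac
import secrets

# In practice the pseudonymisation secret would be kept by the EU data exporter,
# apart from the transferred data, so the importer cannot reverse the mapping.
PSEUDONYMISATION_SECRET = secrets.token_bytes(32)

def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier with a deterministic keyed pseudonym."""
    digest = hmac.new(PSEUDONYMISATION_SECRET, identifier.encode(), hashlib.sha256)
    return digest.hexdigest()[:16]

record = {"email": "jane.doe@example.eu", "diagnosis": "J45"}   # hypothetical record
protected = {"subject_id": pseudonymise(record["email"]), "diagnosis": record["diagnosis"]}
print(protected)   # the transferred record carries only the pseudonym, not the email
```

Because the same identifier always maps to the same pseudonym, analyses can still join records across datasets, while re-identification requires access to the separately held secret.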
Systems Security Engineering Cyber Resiliency Considerations for the Engineering of Trustworthy Secure Systems
Draft NIST Special Publication 800-160, Volume 2: Systems Security Engineering, Cyber Resiliency Considerations for the Engineering of Trustworthy Secure Systems. Ron Ross (Computer Security Division, National Institute of Standards and Technology); Richard Graubart, Deborah Bodeau, Rosalie McQuaid (Cyber Resiliency and Innovative Mission Engineering Department, The MITRE Corporation). March 2018. U.S. Department of Commerce, Wilbur L. Ross, Jr., Secretary; National Institute of Standards and Technology, Walter Copan, NIST Director and Under Secretary of Commerce for Standards and Technology. This document is a supporting publication to the NIST systems security engineering guidance provided in Special Publication 800-160, Volume 1. The content was specifically designed to be used with and to complement the flagship systems security engineering publication to support organizations that require cyber resiliency as a property or characteristic of their systems. The goals, objectives, techniques, implementation approaches, and design principles that are described in this publication are an integral part of a cyber resiliency engineering framework and are applied in a life cycle-based systems engineering process. Authority: This publication has been developed by the National Institute of Standards and Technology to further its statutory responsibilities under the Federal Information Security Modernization Act (FISMA) of 2014, 44 U.S.C. § 3551 et seq., Public Law (P.L.) 113-283.