
Differential Privacy: What, Why and When
A Tutorial
Moni Naor, Weizmann Institute of Science
Slides credit: Guy Rothblum, Kobbi Nissim, Cynthia Dwork…
Crypto Innovation School (CIS 2018), Shenzhen, Nov 29th 2018

What is Differential Privacy?
• Differential privacy is a concept:
  – Motivation
  – Rigorous mathematical definition
  – Properties
  – A measurable quantity
• A set of algorithmic techniques for achieving it
• First defined in:
  – Dwork, McSherry, Nissim and Smith, Calibrating Noise to Sensitivity in Private Data Analysis, Third Theory of Cryptography Conference, TCC 2006.
  – Earlier roots: Warner, Randomized Response, 1965

Why Differential Privacy?
• DP is a strong, quantifiable, composable mathematical privacy guarantee
• Provably resilient to known and unknown attack modes!
• In theory, DP enables many computations with personal data while preserving personal privacy
  – Practicality is in the first stages of validation
  – Not a panacea

Good References
• Cynthia Dwork and Aaron Roth, The Algorithmic Foundations of Differential Privacy. http://www.cis.upenn.edu/~aaroth/privacybook.html
• Salil Vadhan, The Complexity of Differential Privacy.
• Differential Privacy: A Primer for a Non-technical Audience. https://privacytools.seas.harvard.edu/files/privacytools/files/pedagogical-document-dp_new.pdf

Privacy-Preserving Analysis: The Problem
• Given a dataset with sensitive personal information (health, social networks, location, communication); the data may be distributed or encrypted
• How to compute and release functions of the dataset (academic research, informed policy, national security)
• While protecting individual privacy

Glorious Failures of Traditional Approaches to Data Privacy
• Re-identification [Sweeney '00, …]
• Auditors [Kenthapadi, Mishra, Nissim '05]
• Genome-wide association studies (GWAS) [Homer et al. '08]
• Netflix Prize [Narayanan, Shmatikov '08]
• Social networks [Backstrom, Dwork, Kleinberg '11]
• Attacks on statistical aggregates [Dwork, Smith, Steinke, Ullman, Vadhan '15]

The Netflix Prize
• Netflix recommends movies to its subscribers
  – Sought an improved recommendation system
  – Offered $1,000,000 for a "10% improvement"
  – Published training data
• Prize won in September 2009 by the "BellKor's Pragmatic Chaos" team
• A very influential competition in machine learning

From the Netflix Prize Rules Page…
• "The training data set consists of more than 100 million ratings from over 480 thousand randomly-chosen, anonymous customers on nearly 18 thousand movie titles."
• "The ratings are on a scale from 1 to 5 (integral) stars. To protect customer privacy, all personal information identifying individual customers has been removed and all customer ids have been replaced by randomly-assigned ids. The date of each rating and the title and year of release for each movie are provided."

Netflix Data Release [Narayanan-Shmatikov 2008]
• Ratings for a subset of movies and users
• Usernames replaced with random IDs
• Some additional perturbation
Credit: Arvind Narayanan via Adam Smith

A Source of Auxiliary Information
• Internet Movie Database (IMDb)
  – Individuals may register for an account and rate movies
  – Need not be anonymous; probably want to create some web presence
  – Visible material includes ratings, dates, comments

Use Public Reviews from IMDb.com
[Diagram: public, incomplete IMDb profiles (Alice, Bob, Charlie, Danielle, Erica, Frank) are matched against the anonymized Netflix data, turning it into identified Netflix data.]
Credit: Arvind Narayanan via Adam Smith

De-anonymizing the Netflix Dataset
Results:
• "With 8 movie ratings (of which 2 may be completely wrong) and dates that may have a 3-day error, 96% of Netflix subscribers whose records have been released can be uniquely identified in the dataset."
• "For 89%, 2 ratings and dates are enough to reduce the set of plausible records to 8 out of almost 500,000, which can then be inspected by a human for further deanonymization."

Consequences?
• Learn about movies that IMDb users didn't want to tell the world about… (sexual orientation, religious beliefs)
• Subject of lawsuits under the US Video Privacy Protection Act of 1988; settled March 2010
Credit: Arvind Narayanan via Adam Smith

Perfect Privacy?
• Why not "semantic security" [à la Goldwasser-Micali]? Anything that can be learned about a participant from the sanitized data can be learned without it [Dalenius '77]
• Unachievable: auxiliary information is the problem [Dwork, Naor]
  – A common theme in privacy horror stories

A "New" Approach to Privacy: Differential Privacy [DMNS06]
• Any outcome is equally likely when I'm in the database or out of the database
• The risk incurred by participation is low

Learning Can Hurt / Teachings vs. Participation
[Diagram: a data analyst adaptively issues queries q1, q2, … to the data and receives answers a1, a2, …; the harm from what is learned about the population (its "teachings") is distinguished from the harm caused by one's own participation.]

Differential Privacy [Dwork, McSherry, Nissim and Smith 2006]
Any outcome is equally likely when I'm in the database or out of the database.
Algorithm A guarantees ε-differential privacy if for all databases D and all events S:
  Pr_A[A(D + me) ∈ S] ≤ e^ε · Pr_A[A(D − me) ∈ S]
where the probability is over the randomness introduced by A (think of e^ε as roughly 1 + ε for small ε).

Differential Privacy
[Diagram: neighboring databases b = (b1, …, bn) and b′, differing in one entry, are both run through mechanism M; the output distributions M(b) and M(b′) are at "distance" ≤ ε.]
Slide credit: Kobbi Nissim

(ε, δ)-Differential Privacy [Dwork, Kenthapadi, McSherry, Mironov and Naor 2006]
Algorithm A guarantees (ε, δ)-differential privacy if for all databases D and all events S:
  Pr_A[A(D + me) ∈ S] ≤ e^ε · Pr_A[A(D − me) ∈ S] + δ
where the probability is over the randomness introduced by A.

Local Model
[Diagram: each user i holds their own bit bi and sends only a locally randomized report ai to the aggregator; no trusted curator ever sees the raw data.] A randomized-response sketch follows below.
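
The local model is exactly the setting of Warner's randomized response, cited earlier as a root of DP. Here is a minimal sketch (the function names and the debiasing estimator are my own illustration, not from the slides): each user flips their bit with a probability calibrated to ε before reporting, and the aggregator inverts the bias.

```python
import math
import random

def randomized_response(bit: int, epsilon: float) -> int:
    """Report the true bit with probability e^eps / (1 + e^eps), else flip it.
    The ratio of report probabilities under bit = 0 vs bit = 1 is e^eps,
    so each individual report is eps-differentially private on its own."""
    p_truth = math.exp(epsilon) / (1 + math.exp(epsilon))
    return bit if random.random() < p_truth else 1 - bit

def estimate_ones(reports: list, epsilon: float) -> float:
    """Debias the sum of reports: E[sum] = (1-p)*n + (2p-1)*true_count."""
    p = math.exp(epsilon) / (1 + math.exp(epsilon))
    n = len(reports)
    return (sum(reports) - (1 - p) * n) / (2 * p - 1)

bits = [random.randint(0, 1) for _ in range(100_000)]
reports = [randomized_response(b, epsilon=1.0) for b in bits]
print(sum(bits), round(estimate_ones(reports, epsilon=1.0)))
```

No curator ever sees a raw bit; the price is counting error on the order of √n/ε, rather than the 1/ε achievable with a trusted curator.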

Differential Privacy is a Success
• Algorithms in many settings and for many tasks
Important properties:
• Programmable!
• Group privacy: kε-privacy for a group of size k
• Composability: applying the sanitization several times degrades privacy gracefully
  – proportionally to the number of applications
  – and even proportionally to the square root of the number of applications
• Robustness to side information: no need to specify exactly what the adversary knows (which is hard to quantify)
• Closure under post-processing

Differential Privacy: A Tutorial
• Basic composition: answering small numbers of queries
• Advanced composition: answering moderate numbers of queries
• Coordinated mechanisms: answering huge numbers of queries
• An example of mixing MPC and DP, for passwords

Composition
Privacy is maintained even under multiple analyses. This is the core issue, and the key to differential privacy's success!
• Unavoidable: in reality, there are multiple analyses
• Makes DP "programmable": private subroutines make for private algorithms

Composition: How Do We Define It? [Dwork, Rothblum, Vadhan 2010]
Adaptive, adversarial databases and algorithms: a hidden bit b ∈ {0,1} is fixed; in round i = 1, …, k the adversary adaptively chooses a pair of adjacent databases (x_i^0, x_i^1) and a mechanism M_i, and receives M_i(x_i^b).
• b = 0: the real world; b = 1: my data is replaced with junk
• Requirement: the adversary's views under b = 0 and b = 1 are DP-close

Basic Composition
• k (adaptively chosen) algorithms, each ε0-DP: taken together, still k·ε0-DP
• Application: answering multiple queries

Basic Composition: Proof
Define M_{1,2}(x) = (M_1(x), M_2(x)). Then for adjacent x, y:
  Pr[M_{1,2}(x) = (z1, z2)] / Pr[M_{1,2}(y) = (z1, z2)]
    = (Pr[M_1(x) = z1] · Pr[M_2(x) = z2]) / (Pr[M_1(y) = z1] · Pr[M_2(y) = z2])
    ≤ e^{ε1} · e^{ε2}
This is a property of the definition, independent of the implementation. What about the adaptive case?

Statistical Queries
q(D) = "how many in D satisfy predicate P?", where P is a Boolean predicate on universe U.
Statistical queries allow powerful data analyses:
• Perceptron, ID3 decision trees, PCA/SVM, k-means [Blum, Dwork, McSherry, Nissim '05]
• Any SQ-learning algorithm [Kearns '98], which includes "most" known PAC-learning algorithms

Data Analysis Model
A trusted curator holds the database D, a multi-set over universe U, and produces a privacy-preserving synopsis S for an untrusted analyst; S should be accurate on the query set Q.
• Offline: non-interactive (the synopsis is published once)
• Online: interactive (queries q1, q2, … are answered with a1, a2, … as they arrive)

Answering a Single Counting Query
U is a set of tuples (name, tag ∈ {0,1}); the counting query asks for the number of participants with tag = 1.
A: output the number of 1's, plus noise. This is differentially private, for properly chosen noise: choose the noise from the Laplace distribution.
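
A minimal sketch of this mechanism (illustrative code, not from the slides; it uses numpy's Laplace sampler): since a counting query has sensitivity 1, adding Lap(1/ε) noise to the true count suffices.

```python
import numpy as np

def private_count(tags: list, epsilon: float) -> float:
    """Answer 'how many participants have tag = 1?' with eps-DP.
    A counting query has sensitivity 1, so Lap(1/eps) noise suffices."""
    true_count = sum(tags)
    return true_count + np.random.laplace(loc=0.0, scale=1.0 / epsilon)

tags = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]
print(private_count(tags, epsilon=0.5))  # true count 6, plus Lap(2) noise
```

Replacing the scale 1/ε with GS_q/ε generalizes this to any query of global sensitivity GS_q, as the next slides show.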

Laplacian Noise
The Laplace distribution Y = Lap(b) has density Pr[Y = y] = (1/(2b)) · e^{−|y|/b} and standard deviation O(b).
Set b = 1/ε, and get Pr[Y = y] ∝ e^{−ε·|y|}.

Laplacian Noise: ε-Privacy
Take b = 1/ε, so Pr[Y = y] ∝ e^{−ε·|y|}, and release q(D) + Lap(1/ε).
For adjacent D, D′: |q(D) − q(D′)| ≤ 1, so for any output z:
  e^{−ε} ≤ Pr_{by D}[z] / Pr_{by D′}[z] ≤ e^{ε}

Laplacian Noise: Õ(1/ε) Error
Take b = 1/ε, so Pr[Y = y] ∝ e^{−ε·|y|}; then Pr_{y∼Y}[|y| > k · 1/ε] = O(e^{−k}).
The expected error is 1/ε, and w.h.p. the error is Õ(1/ε).

Scaling Noise to Sensitivity [DMNS06]
Global sensitivity of a query q: U^n → [0, n]:
  GS_q = max_{adjacent D, D′} |q(D) − q(D′)|
For a counting query q: GS_q = 1. The previous argument generalizes: for query q, release q(D) + Lap(GS_q/ε). This is:
• ε-private
• of error Õ(GS_q/ε)

Answering k Queries: Basic Composition
Answer k queries, each with sensitivity 1:
• Use Laplace noise with ε0 = ε/k privacy per query: better privacy, but more noise per query (∼ Lap(k/ε))
• Composition: ε-privacy for all k answers
The error is (roughly) linear in the number of queries; e.g., one can answer n queries with Õ(n) error.

Differential Privacy: A Tutorial
• Basic composition: answering small numbers of queries
• Advanced composition: answering moderate numbers of queries
• Coordinated mechanisms: answering huge numbers of queries
• An example of mixing MPC and DP, for passwords

Advanced Composition [DRV10]
Composing k algorithms, each ε0-DP, yields, with all but δ_g probability (simultaneously):
  ε_g = O( √(k · ln(1/δ_g)) · ε0 + k · ε0² )
Compare with ε_g = k · ε0 from basic composition (think of k < 1/ε0²). A numeric comparison appears in the second sketch below.

Privacy Loss
Fix adjacent D, D′ and draw y ← M(D):
  PrivacyLoss(y) = ln( Pr[M(D) = y] / Pr[M(D′) = y] )
• Can be positive, negative, or infinite
• A random variable, with a mean
• (ε, 0)-DP: w.p. 1, |PrivacyLoss(y)| ≤ ε
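
To make the privacy-loss random variable concrete, here is a small sketch (my own worked example, not from the slides) evaluating it for the Laplace mechanism on adjacent counts 100 and 101; the closed-form densities confirm the (ε, 0)-DP bound |PrivacyLoss| ≤ ε.

```python
import math

def laplace_density(y: float, mu: float, b: float) -> float:
    """Density of Lap(b) centered at mu."""
    return math.exp(-abs(y - mu) / b) / (2 * b)

def privacy_loss(y: float, q_d: float, q_d2: float, epsilon: float) -> float:
    """ln( Pr[M(D) = y] / Pr[M(D') = y] ) for the Laplace mechanism, b = 1/eps."""
    b = 1.0 / epsilon
    return math.log(laplace_density(y, q_d, b) / laplace_density(y, q_d2, b))

# Adjacent databases with q(D) = 100, q(D') = 101, and eps = 0.5.
for y in [95.0, 100.0, 100.5, 101.0, 110.0]:
    loss = privacy_loss(y, 100.0, 101.0, epsilon=0.5)
    assert abs(loss) <= 0.5 + 1e-9  # |PrivacyLoss| <= eps, with probability 1
    print(f"y = {y}: loss = {loss:+.3f}")
```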
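
Finally, a small calculator contrasting basic and advanced composition. The slide states the advanced bound with an O(·); the sketch below uses the explicit form √(2k ln(1/δ_g)) · ε0 + k · ε0 · (e^{ε0} − 1), one standard instantiation of [DRV10], so treat the constants as an assumption.

```python
import math

def basic_composition(k: int, eps0: float) -> float:
    """k-fold composition of eps0-DP mechanisms: k * eps0, with delta = 0."""
    return k * eps0

def advanced_composition(k: int, eps0: float, delta_g: float) -> float:
    """Explicit DRV10-style bound (constants as in the common statement)."""
    return math.sqrt(2 * k * math.log(1 / delta_g)) * eps0 + k * eps0 * math.expm1(eps0)

k, eps0, delta_g = 100, 0.01, 1e-6
print(basic_composition(k, eps0))              # 1.0
print(advanced_composition(k, eps0, delta_g))  # ~0.54, at the cost of delta_g
```

For k = 100 mechanisms at ε0 = 0.01, the advanced bound is already about half the basic one, and the gap widens as k grows, since it scales with √k rather than k.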
