INF529: Security and Privacy in Informatics
The Future of Privacy: Near and Far

Prof. Clifford Neuman

Lecture 15, 1 May 2020, Online via Webex

Course Outline

• What data is out there and how is it used
• Technical means of protection
• Identification, Authentication, Audit
• The right of or expectation of privacy
• Social Networks and the social contract – February 21st
• Criminal law, National Security, and Privacy – March 6th
• Big data – Privacy Considerations – March 13th
• International law, Jurisdiction, Privacy Regulations
• Privacy Regulation (civil) and also Healthcare – April 3rd
• The Internet of Things – April 10th
• Technology – April 17th
• Elections, Politics, Other Topics – April 24th
• The future – Near and Far – May 1st

Today's Presentations

Biometrics and related technologies
• Vaidhyanathan S - Privacy Concerns for
• Yi-Ting Lin - Privacy of Facial Recognition
• Haotian Mai - Access and use of DNA databases

BY VAIDHYANATHAN SWAMINATHAN
▪ What constitutes biometric data?
▪ How is biometric data collected and stored?
▪ Biometric data collection and storage by the U.S. Government
▪ Biometric data collection and storage by private entities
▪ Privacy concerns
▪ Biometric Information Privacy Act
▪ Recommendations to improve privacy and security

"Biometrics" is the primary term for body measurements and calculations involving those measurements. They are metrics related to human characteristics, typically identified and utilized on an individual basis.

Biometric data points can include things like:

▪ Fingerprints
▪ Body odor
▪ DNA
▪ Palm veins
▪ Face recognition
▪ Ear form
▪ Palm prints
▪ Keyboard strokes
▪ Iris recognition
▪ Gait analysis
▪ Hand geometry
▪ Voice
▪ Retina
▪ Body geometry

▪ Biometric data is primarily used for identification, but also for authentication.
▪ Hardware-based recognition system - data is stored on a specific piece of hardware and works with the device to recognize the data, without storing the data on the device itself.
▪ Portable token system - a fob or smart card stores the biometric data. With this method, the user presents their card or fob and then their biometric data as a two-step authentication process.
▪ Biometric server - data is held on an external server, where it is more susceptible to cyber attacks. To reduce the risk of a breach, data must be encrypted when transferred over the network; the issue with this is deciding where the encryption keys will be stored and who will be trusted with access.
▪ The use of biometric data isn't new: police have been fingerprinting people for over a century and have had biometric databases since the '80s.
▪ The U.S. Department of Homeland Security (DHS) takes approximately 300,000 fingerprints per day from non-U.S. citizens crossing the border into the United States, and it collects biometrics from noncitizens applying for immigration benefits and from immigrants who have been detained.
▪ Any individual who applies for employment with the federal government, or for a sensitive position requiring a background check, will be asked to supply a fingerprint.
▪ State and local law enforcement officers regularly collect fingerprints and DNA, as well as face prints and even iris scans.
▪ As of January 2020, it is legal in 46 states for software to identify an individual using images taken without consent while they are in public.
▪ Any individual who applies for a driver's license will provide a face-recognition-ready photograph.
▪ Recent advances in camera and surveillance technology have improved the accuracy of biometric capture and identification at a distance, making unobtrusive collection easier.
▪ Private and public security cameras used by police are increasingly capable of capturing facial features well enough to support facial-recognition-based searches.
▪ Two of the world's largest biometric databases are the FBI's Integrated Automated Fingerprint Identification System (IAFIS) and DHS's Automated Biometric Identification System (IDENT).
▪ IAFIS includes over 71 million subjects in the criminal master file and more than 33 million civil fingerprints.
▪ IDENT stores biometric and biographical data for individuals who interact with the various agencies under the DHS umbrella, and contains over 130 million fingerprint records.
▪ In addition to the federal databases, each state has its own biometric databases – generally a fingerprint database and a DNA database.
▪ Facebook has one of the best-known private biometric databases.
▪ Facebook's face recognition service allows users to find and tag their friends, and has seen dramatic increases in accuracy due to the volume of photos uploaded and tagged on Facebook.
▪ Facebook currently has over 845 million monthly active users, requires each of those users to sign up under their real name, and makes users' names and primary photos public by default.
▪ Google Photos, which houses over a million pictures, uses similar technology that extracts and analyzes data from the points and contours of faces appearing in photos taken on Google Android devices.
▪ The template that Google extracts is unique to an individual, in the same way that a fingerprint or voiceprint uniquely identifies one and only one person, and is used to organize and group photos based on the individuals appearing in them.
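Face-grouping services like these typically reduce each face to a fixed-length numeric template and compare templates by a similarity score. Below is a minimal sketch of that matching step; the vectors, the tiny 4-dimensional templates, and the 0.8 threshold are illustrative assumptions, not any vendor's actual parameters (real systems use learned embeddings of roughly 128-512 dimensions).

```python
import math

def cosine_similarity(a, b):
    # Compare two face templates (fixed-length feature vectors).
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def same_person(template_a, template_b, threshold=0.8):
    # Declare a match when similarity exceeds a tuned threshold.
    return cosine_similarity(template_a, template_b) >= threshold

# Toy templates: two photos of the same (hypothetical) person, one of another.
alice_photo1 = [0.90, 0.10, 0.30, 0.70]
alice_photo2 = [0.88, 0.12, 0.28, 0.72]
bob_photo    = [0.10, 0.90, 0.70, 0.20]

print(same_person(alice_photo1, alice_photo2))  # True  (similar vectors)
print(same_person(alice_photo1, bob_photo))     # False (dissimilar vectors)
```

This is also why templates act as linking identifiers: the same comparison works across any two databases that store compatible templates.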

▪ What does the Government say? Biometric databases can be used effectively for border security, to verify employment, to identify criminals, and to combat terrorism.
▪ What do private organizations say? Biometrics can enhance our lives by helping us identify our friends more easily and by allowing us access to places, products, and services more quickly and accurately.
▪ Biometrics' biggest risk to privacy comes from the government's ability to misuse it for surveillance.
▪ The problems multiply when biometric databases are "multimodal," allowing the collection and storage of several different biometrics in one database and combining them with traditional data points like name, address, social security number, gender, race, and date of birth.
▪ Geolocation tracking technologies built on top of large biometric collections could enable constant surveillance. And if the government gets its way, all of this data could be obtained without a warrant and without notice or warning.
▪ Standardization of biometric databases causes additional problems: once standardized, the data becomes much easier to use as linking identifiers, not just in interactions with the government but also across disparate databases and throughout society.
▪ Large standardized collections of biometrics could lead to many vulnerable copies of that linked data, which could wind up in the hands of identity thieves.
▪ A biometric data compromise would be catastrophic: unlike a credit card number, your biometric data can't be revoked or re-issued.
▪ Extensive data retention times can lead to additional problems. Biometric records stored in IDENT are retained for 75 years or until the statute of limitations for all criminal violations has expired.
▪ Civil fingerprints stored in IAFIS are not destroyed until the individual reaches 75 years of age, and criminal fingerprints are not destroyed until the individual reaches 99 years of age.
▪ Due to the use of advanced facial recognition technologies in crowd and security cameras, anyone could end up in a database, even if they aren't involved in a crime:
▪ By happening to be in the wrong place at the wrong time.
▪ By fitting a stereotype that some in society have decided is a threat.

▪ Data sharing can also mean that data collected for non-criminal purposes, such as immigration-related records, is combined with and used for criminal or national-security purposes with little to no standards, oversight, or transparency.
▪ If any data in the system is inaccurate and is propagated throughout several other systems, it can be extremely difficult to correct.
▪ A private entity in possession of biometric identifiers or biometric information must develop a written policy, made available to the public, establishing a retention schedule and guidelines for permanently destroying biometric identifiers and biometric information when the initial purpose for collecting or obtaining them has been satisfied, or within 3 years of the individual's last interaction with the private entity, whichever occurs first.
▪ Absent a valid warrant or subpoena issued by a court of competent jurisdiction, a private entity in possession of biometric identifiers or biometric information must comply with its established retention schedule and destruction guidelines.
▪ No private entity may collect, capture, purchase, receive through trade, or otherwise obtain a person's or a customer's biometric identifier or biometric information unless it first informs the subject or the subject's legally authorized representative in writing:
▪ that a biometric identifier or biometric information is being collected or stored;
▪ of the specific purpose and length of term for which the biometric identifier or biometric information is being collected, stored, and used;
▪ and receives a written release executed by the subject of the biometric identifier or biometric information, or the subject's legally authorized representative.

▪ No private entity in possession of a biometric identifier or biometric information may sell, lease, trade, or otherwise profit from a person's or a customer's biometric identifier or biometric information.
▪ No private entity in possession of a biometric identifier or biometric information may disclose, redisclose, or otherwise disseminate a person's or a customer's biometric identifier or biometric information unless:
▪ the subject of the biometric identifier consents to the disclosure or redisclosure;
▪ the disclosure or redisclosure completes a financial transaction requested or authorized by the subject;
▪ the disclosure or redisclosure is required by State or federal law or municipal ordinance; or
▪ the disclosure is required pursuant to a valid warrant or subpoena issued by a court of competent jurisdiction.
▪ A private entity in possession of a biometric identifier or biometric information shall:
▪ store, transmit, and protect from disclosure all biometric identifiers and biometric information using the reasonable standard of care within the private entity's industry; and
▪ store, transmit, and protect from disclosure all biometric identifiers and biometric information in a manner that is the same as or more protective than the manner in which the private entity stores, transmits, and protects other confidential and sensitive information.
▪ The EU data privacy law (GDPR) defines biometric data as a "special category of personal data" and prohibits its "processing."
▪ More precisely, biometric data are "personal data resulting from specific technical processing relating to the physical, physiological, or behavioral characteristics of a natural person, which allows or confirms the unique identification of that natural person, such as facial images or fingerprint data."
▪ The Regulation protects EU citizens and long-term residents from having their information shared with third parties without their consent.
However, it does contain some exceptions:
▪ If consent has been given explicitly
▪ If biometric information is necessary for carrying out the obligations of the controller or the data subject in the field of employment, social security, and social protection law
▪ If it's essential to protect the vital interests of the individual and he/she is incapable of giving consent
▪ If it's critical for any legal claims
▪ If it's necessary for reasons of public interest in the area of public health

▪ The CCPA definition of biometric data is a bit broader than GDPR's: "an individual's physiological, biological or behavioral characteristics, including an individual's DNA, that can be used, singly or in combination with each other or with other identifying data, to establish individual identity."
▪ The CCPA defines biometric information as one of the categories of personal information protected by the law. The rights provided to California consumers to protect their personal information and biometric data include:
▪ Accessing the data (right of disclosure or access)
▪ Deleting it (right to be forgotten)
▪ Taking it (data portability – the data must be received in a commonly used and readable format)
▪ Requesting that businesses not sell their personal information
▪ Opting out (opt-in is the primary consent standard mandated by the European GDPR)
▪ Right of action (penalties)
▪ If an organization does not implement a reasonable security program to protect that data and suffers a breach, it can be subject to a class action under the private right of action, with statutory damages of between $100 and $750 per consumer per incident.
▪ A biometric template can be transformed using a transformation function, so that only the transformed template is stored in the database.
▪ Further protection can be gained by adding an initialization vector or salt to the transformation function.
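The salted, one-way transformation idea can be sketched as follows. This is only a minimal illustration using SHA-256: real biometric readings vary between captures, so production schemes use error-tolerant constructions (fuzzy extractors, cancelable transforms) rather than an exact hash, and the function and field names here are invented for illustration.

```python
import hashlib
import secrets

def protect_template(template_bytes, salt=None):
    # Derive a one-way, salted transform of a (quantized) biometric template.
    # Only (salt, digest) is stored; the raw template never reaches the database.
    if salt is None:
        salt = secrets.token_bytes(16)  # per-user salt blocks cross-database matching
    digest = hashlib.sha256(salt + template_bytes).hexdigest()
    return salt, digest

def verify(candidate_bytes, salt, stored_digest):
    # Re-apply the same transform to a fresh capture and compare the results.
    return hashlib.sha256(salt + candidate_bytes).hexdigest() == stored_digest

enrolled = b"quantized-minutiae-template"
salt, stored = protect_template(enrolled)
print(verify(enrolled, salt, stored))             # True
print(verify(b"different-finger", salt, stored))  # False
```

Because each enrollment uses its own random salt, the same template stored in two databases yields unrelated digests, which is exactly the cross-matching protection the next recommendation asks for.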
▪ Make the transformation function a one-way function, so that it is computationally difficult to invert a transformed template back to the original even if the transformation is known.
▪ Biometric templates should be transformed in a way that does not allow cross-matching across databases.
▪ Limit the collection of biometrics.
▪ Define clear rules on the legal process required for collection.
▪ Limit the combination of more than one biometric in a single database.
▪ Limit retention.
▪ Define clear rules for use and sharing.
▪ Enact robust security procedures to avoid data compromise.
▪ Define and standardize audit trails.
▪ Ensure independent oversight.

References:
▪ https://us.norton.com/internetsecurity-iot-biometrics-how-do-they-work-are-they-safe.html
▪ https://odinlaw.com/what-is-biometric-data-and-how-is-it-legally-protected/
▪ https://www.bayometric.com/biometric-template-security/
▪ https://almas-industries.com/blog/how-is-biometric-data-stored/
▪ https://www.eff.org/press/releases/fingerprints-dna-biometric-data-collection-us-immigrant-communities-and-beyond
▪ https://www.eff.org/issues/biometrics
▪ https://www.biometricupdate.com/202002/google-hit-with-new-biometric-data-privacy-class-action-under-bipa
▪ https://www.biometricupdate.com/201610/facebook-claims-it-can-collect-user-biometric-data-without-consent
▪ http://www.ilga.gov/legislation/ilcs/ilcs3.asp?ActID=3004&ChapterID=57
▪ https://www.clarip.com/data-privacy/ccpa-biometric-information/
▪ https://www.thalesgroup.com/en/markets/digital-identity-and-security/government/biometrics/biometric-data

THANK YOU

Biometrics: Concerns and Use Cases
Patcharapon Nuimark | INF 529 | May 1st, 2020

What is biometrics?
• A highly pertinent means of identification and authentication

• Authentication based on recognizable data that is unique to the individual

• Identification based on biometric data such as a face photo, voice, or fingerprint

Two categories of biometrics
• Physiological measurements

• Fingerprints, iris or retina, facial shape, and DNA

• Behavioral measurements

• Voice recognition, signature, keystroke dynamics

Benefits of biometrics

• Compared to other kinds of identification

• Biometrics provide users with the most convenient way to identify themselves

• It is considered a very accurate and hard-to-forge method

• Biometric data is less likely to be forgotten, exchanged, stolen, or forged

• The probability of finding two people with similar biometrics is considered very low

Use cases of biometrics
• In the old days

• Predominantly used by authorities for military purposes, access control, and criminal identification, within a tightly regulated framework

• Nowadays

• Many banking, retail, and mobile-commerce services are now equipped with biometrics

• All modern smartphones also do so

Law enforcement and public security
• Refers to application systems that support law enforcement

• Automated Fingerprint and Biometric Identification System

• Create and store biometric info for forensic analysts

• Live Face recognition

• Its use in major or crowded areas is gaining interest for public security

Military

• The United States military has been collecting biometric data since 2009

• 7.4 million identities in the database came from military operations in Iraq and Afghanistan

• From 2008 to 2017, the DoD arrested or eliminated more than 1,700 individuals based on biometric matches

• As of 2019, biometric identification has been widely used to identify non-U.S. citizens on the battlefield.

Border control, travel, and migration
• e-Passports contain an electronic chip which holds the same information that is printed on the passport's data page, including two fingerprints and a face photo

• Over 1.2 billion e-passports in circulation provide a windfall for automatic border control systems

• The photo speeds up the process by enabling comparison against the embedded face image or fingerprints.

Voter registration

• One person, One vote principle

• Compensates for the lack of a mechanism to authenticate voters

• Using only photos requires more resources

• Guarantees single enrollment on voter lists

• Can be easily done with an Automated Fingerprint Identification System

Physical and logical access control
• In many IT firms, physical and logical access control can be implemented using biometric methods as part of Identity and Access Management (IAM) policies.

• In the mobile world, many Apple phones are now equipped with at least one biometric means of user authentication (fingerprint or face recognition).

• Some Android phones likewise incorporate iris (eye) scanning

Multimodal Biometrics

• Combines several biometric sources to foster security and accuracy

• Usually requires two biometric credentials for identification

• They can overcome common limitations of unimodal systems

• Helps reduce error rates considerably

Advantages of biometrics
• Universal

• Unique

• Permanent

• Recordable

• Measurable

• Forgery-proof

How accurate is biometrics?
• Fingerprints
  • About 30 minutiae (specific points) are captured by today's standard fingerprint readers
  • FBI evidence shows that no two individuals can have more than 8 minutiae in common
• Facial recognition
  • In a NIST test on a sample of 26.6 million faces, the expected failure rate was only 4%
  • Accuracy is considered to have improved 20-fold over four years

Biometrics reliability

• Biometric authentication relies on statistical algorithms

• It cannot be 100% reliable when used alone

• False rejections and false acceptances

• These occur when the biometric system fails to identify as intended
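The trade-off between the two error types can be illustrated by sweeping a decision threshold over match scores. The scores below are invented illustration data, not output from any real matcher; raising the threshold trades false acceptances for false rejections.

```python
def error_rates(genuine_scores, impostor_scores, threshold):
    # False rejection rate: fraction of genuine users scoring below threshold.
    frr = sum(s < threshold for s in genuine_scores) / len(genuine_scores)
    # False acceptance rate: fraction of impostors scoring at/above threshold.
    far = sum(s >= threshold for s in impostor_scores) / len(impostor_scores)
    return frr, far

# Toy match scores from a hypothetical matcher (0 = no match, 1 = perfect).
genuine  = [0.91, 0.85, 0.78, 0.95, 0.60]
impostor = [0.20, 0.35, 0.10, 0.55, 0.42]

for t in (0.3, 0.5, 0.7):
    frr, far = error_rates(genuine, impostor, t)
    print(f"threshold={t}: FRR={frr:.2f}, FAR={far:.2f}")
```

Because both rates can never be zero simultaneously on overlapping score distributions, biometric authentication alone cannot be 100% reliable.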

• Challenges in converting analog measurements to digital form for comparison

Challenges

• Cost of Hardware

• Hardware error, reliability

• Privacy of storage

• Integration into security systems

THANK YOU

DNA Data and Criminal Investigation
Haotian Mai | Spring 2020 | INF 529 - Privacy and Security in Informatics | Final Project

Crime-Scene DNA Evidence

• Since the advent of DNA testing in 1985, biological material (skin, hair, blood, and other bodily fluids) has emerged as the most reliable physical evidence at a crime scene.
• In 1987, serial rapist Tommy Lee Andrews in Florida became the first American ever convicted in a case involving DNA evidence.
• DNA technology has been used to solve criminal cases for only 33 years, but it has completely revolutionized modern crime investigations.

It wasn't that long ago…

• The Steven Avery cases (Making a Murderer – Netflix)
• 1985 case: he was wrongfully convicted of sexual assault and attempted murder, imprisoned for 18 years, then proven innocent by DNA evidence in 2003.
• 2005 case: while he was in the middle of a $36 million lawsuit against Manitowoc County, the former sheriff, and the DA for wrongful conviction and imprisonment, he was arrested again for the murder of Teresa Halbach and sentenced to life imprisonment.

• The evidence? Halbach's vehicle was found partially concealed in the Avery family's salvage yard, and bloodstains inside matched Avery's DNA. But the blood might have been planted: it was discovered that Avery's blood sample from the 1985 case, stored in the police lab, had been opened and drawn from with a needle without explanation. (So many twists; he is still appealing for a retrial.)
• Crime-scene DNA evidence has its limits.

Introduction: We Will Find You

Golden State Killer (last month, he offered to plead guilty)

• East Area Rapist, Original Night Stalker, East Bay Rapist…
• A Navy veteran and former police officer
• A serial killer, serial rapist, and burglar who committed at least 13 murders, more than 50 rapes, and over 100 burglaries in California from 1974 to 1986
• The killer couldn't be found for years…
• In 2018, Joseph James DeAngelo was identified and charged
• How? One of his very distant relatives took a commercial DNA test…

Development of Technologies

• Genetic genealogy.

— The use of DNA testing in combination with traditional genealogical and historical records to infer the relationship between individuals.
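Genetic-genealogy tools work roughly by totaling the DNA two profiles share (measured in centimorgans, cM) and mapping that total to a likely relationship. A toy sketch of that mapping follows; the cM cutoffs are rough approximations of published averages, the function name is invented for illustration, and real tools report probability ranges rather than single answers.

```python
def likely_relationship(shared_cm):
    # Map total shared DNA (in centimorgans) to a rough relationship guess.
    # Cutoffs are approximate averages; real matches overlap between categories.
    if shared_cm > 2300:
        return "parent/child or sibling"
    if shared_cm > 1300:
        return "grandparent, aunt/uncle, or half-sibling"
    if shared_cm > 400:
        return "first cousin"
    if shared_cm > 90:
        return "second cousin"
    if shared_cm > 20:
        return "third to fourth cousin"
    return "distant or unrelated"

print(likely_relationship(3500))
print(likely_relationship(75))  # the kind of distant match used in cold cases
```

Even a distant-cousin match narrows a suspect search to one extended family tree, which investigators then prune using traditional genealogical and historical records.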

• In 2019, American law enforcement agencies identified over 70 suspects (many in cold cases) using this new technique.

Growing Trend: Commercial DNA Testing

• Estimates put together by the MIT Technology Review claim that about 26 million testing kits had been bought as of 2019, with some projecting that within two years that number could skyrocket to 100 million.

Growing Trend: Law Enforcement

• More than 1.2 million people have added the results of their DNA tests to GEDmatch.
• Recall that in the Golden State Killer case, the police had to check a box claiming the uploaded DNA sample came from their own commercial testing results and that the search would only be used to find relatives.
• But then, in 2019, GEDmatch was bought by Verogen, a company that aids law-enforcement agencies with forensic DNA work.
• GEDmatch, with DNA data from 1.3 million individuals, likely encompasses only about 0.5% of the U.S. adult population.

• But as the Golden State Killer case showed, this is potentially enough to identify about 60% of Americans of European descent, even though they have never provided their own DNA to an ancestry database. – Yaniv Erlich, a computational geneticist at Columbia University

Discussion: Support

1. Decades-old cold cases got solved
• Since the Golden State Killer case, police nationwide have made dozens of arrests using the same techniques, many in cases that had been considered cold.

2. May discourage people from committing crimes
• Overall crime rates may fall as more and more people learn that someone else's DNA test can out a criminal offender in the family.

Discussion: Concerns

1. Privacy and ethical issues
• Since the 2018 Golden State Killer case, the F.B.I. and state law enforcement agencies have been cultivating growing databases of DNA, not just from convicted criminals but in some cases from people merely accused of crimes.

2. Breaking the convention
• Traditionally, in the realm of scientific and medical research, people who provide samples have clear expectations of "informed consent." Now someone using at-home DNA testing to find relatives can also out a family member for criminal activity without either party's consent.

3. Potential false positives
• What if the commercial DNA testing sample is contaminated?
• What if human mistakes cause modeling errors in building these genetic connections?

Current policy and regulation

• A state representative in Utah introduced a bill that would ban genetic genealogy searches by police.
• A Maryland lawmaker introduced a bill to regulate searches, after a proposal last year to ban them failed.
• In New York, a state senator has proposed a policy to allow the searches.
• A Washington state proposal would allow only searches requested through a valid legal process.
• A state senator in California is introducing legislation designed to provide more oversight of direct-to-consumer genetic testing companies.

• But still, a wild west.

References

1. We will find you: DNA search used to nab Golden State Killer can home in on about 60% of white Americans, Science Magazine

2. Consumer Genetic Testing Is Gaining Momentum, statista.com

3. About half of Americans are OK with DNA testing companies sharing user data with law enforcement, Pew Research Center

4. California senator proposes tighter regulations on direct-to-consumer genetics testing companies, TechCrunch

5. Experts outline ethics issues with use of genealogy DNA to solve crimes, Reuters

6. DNA Databases Are Boon to Police But Menace to Privacy, Critics Say, Pew

7. 23andMe CCPA Policy

8. NYPD Overhauls Rules for DNA Evidence in Criminal Cases, Wall Street Journal

9. The Messy Consequences of the Golden State Killer Case, The Atlantic

Privacy of Facial Recognition

Yi-Ting Lin | INF 529 | Spring 2020 | May 1st

Outline

Privacy policy:
• Law enforcement: California decision on facial recognition – San Francisco, Oakland, New Hampshire, and Oregon
• China social credit system (includes AI and big-data analysis)

Privacy technology:
• UK street incident
• Police body cameras
• Surveillance cameras
• Company employee system (Intel)

Facial recognition collides with privacy

• Facial recognition cameras have been installed on streets in the US and Britain.
• This makes for a less free society.
• Britain has the 2nd most surveillance cameras in the world: over 6 million (for comparison, the UK population is about 66 million).
• The UK has the highest proportion of cameras per person.
• Other countries: China has the most surveillance cameras in the world.
• People feel spied on by even simple surveillance cameras; surveillance cameras with facial recognition technology leave people with zero privacy.

UK citizen stopped while walking down the street

• A citizen was stopped for disorderly conduct.
• He had pulled his jumper up over his chin.
• The man was surrounded and grabbed by the police, who demanded to know why he was covering his face and asked for his ID.
• The police then took a photo of him on their mobile device for "facial recognition" purposes and issued him a £90 fine for disorderly behavior.
• Although there were no fixed cameras with facial recognition at the time, handheld mobile devices with facial recognition technology were in use.

China mobile phone restrictions

• China has made facial recognition mandatory for mobile phone users.
• From Dec 2nd, 2019, everyone buying a new SIM card must submit facial recognition scans.
• In September, telecom companies were required to deploy "AI and other technologies" to verify the identities of users.
• Before Dec 1st, registration required only a passport and identity card.
• Many have described this as a "dystopian surveillance state."

China Social Credit Score

• The social credit system assigns a numerical score to each citizen as a reward/punishment mechanism.
• Punished: playing loud music or eating on rapid transit, jaywalking.
• Rewarded: donating blood, donating to charity, volunteering.
• People have begun performing behaviors, or staging incidents, to achieve a higher score.
• The system incorporates facial recognition, AI, and big-data analysis.

China Social Credit Score

• The China Social Credit Score is modeled on the Sesame Credit score, run by Jack Ma's Alibaba.
• Sesame Credit has roughly 500 million users.
• Credit scores range from 350 to 950.
• Effect: broadly raising fears of an "Orwellian Big Brother" regime.
• The Late Show with Stephen Colbert covered China assigning its citizens credit scores, and Colbert got one himself.

California police surveillance

• On Oct 10th, 2019, the California government announced a ban on biometric surveillance technology in police body cameras.
• Footage recorded by body cameras could otherwise later be run through facial recognition software.
• Implementing facial recognition technology on other cameras is still allowed.
• The ban starts January 1st, 2020, and lasts 3 years (2020-2022).

San Francisco police surveillance

• Other states also prohibit facial recognition use in police body cameras: Oregon and New Hampshire; cities: San Francisco and Oakland.
• In May 2019, San Francisco banned facial recognition.
• Civil liberty groups fear potential abuse by the police and other agencies.
• Meanwhile, at CES 2019 (the global stage for innovation) in Las Vegas, attendees interacted with facial recognition technology.

Intel's employee system

• Intel is the world's largest semiconductor company and the first company to introduce x86-based processors.
• It is headquartered in Santa Clara, California.
• Intel uses facial recognition technology at its Oregon campuses to identify high-risk individuals who might pose a threat to the company.
• The company claims that the "security and safety" of its workers is its first priority.
• Some claim the practice makes it difficult for workers to opt out of facial scans.
• License plate readers were installed at the same time, in February 2020.

NEC facial recognition at the 2020 Olympics

• NEC (Nippon Electric Company) planned to deploy its NeoFace facial recognition at the 2020 Olympics.
• In June 2017, Intel signed on as a top-tier partner of the IOC (International Olympic Committee).
• Intel FPGAs (field-programmable gate arrays) support NEC's face recognition technology.
• Facial recognition will be used to identify people in the stadium, including athletes, volunteers, media, and staff.

Information usage of facial recognition

• US and UK:
  • Law enforcement surveillance: surveillance cameras, including body cameras
• Intel:
  • Company employee system
• Japan:
  • 2020 Olympics facial recognition
• China:
  • Social credit system
  • Identity verification when registering SIM cards and mobile systems

Pros and Cons
• Pros:
  • More security through technology mechanisms.
  • Facial recognition surveillance cameras can help prevent terrorist attacks.
  • Example: in front of Cardiff City's stadium, two surveillance vans equipped with facial recognition were patrolling.
• Cons:
  • If not handled with the right policies/mechanisms, society gradually comes under an oppressive atmosphere.
  • What if the government uses facial recognition technology to disclose or manipulate citizens' private information for inappropriate reasons?

Future

Revise current policy
• Current law should be revised.
• The UK incident shows overly oppressive surveillance: aggressively forcing a citizen who felt his privacy was invaded is never a good solution.
• Build trust among the emerging technology, citizens, and law enforcement.
• It is citizens' responsibility to always supervise appropriate technology usage by the government.

Improve facial recognition accuracy
• At this early stage, there are tons of false alerts.
• Accuracy should improve as the database grows and the facial recognition technology is constantly revised under artificial intelligence algorithms.

References

• Intel FPGA Technology Supports NEC in Face Recognition Technology, Intel, June 27, 2017, https://newsroom.intel.com/news/intel-fpga-technology-supports-nec-face-recognition-technology/#gs.4p88rm
• Intel starts using facial recognition technology to ID workers, visitors, Oregon Tech, Mike Rogoway, Mar 11, 2020, https://www.oregonlive.com/silicon-forest/2020/03/intel-starts-using-facial-recognition-technology-to-scan-workers-visitors.html
• California Becomes Third State to Ban Facial Recognition Software in Police Body Cameras, Security Today, Haley Samsel, Oct 10, 2019, https://securitytoday.com/articles/2019/10/10/california-to-become-third-state-to-ban-facial-recognition-software-in-police-body-cameras.aspx
• 3-year ban on police use of facial recognition technology in California to start in the new year, The San Diego Union-Tribune, Katty Stagell, Dec 20, 2019, https://www.sandiegouniontribune.com/news/public-safety/story/2019-12-20/3-year-ban-on-police-use-of-facial-recognition-technology-in-california-to-start-in-the-new-year
• San Francisco Bans Facial Recognition Technology, The New York Times, Kate Conger, Richard Fausset and Serge F. Kovaleski, May 14, 2019, https://www.nytimes.com/2019/05/14/us/facial-recognition-ban-san-francisco.html
• A Big Test of Police Body Cameras Defies Expectations, The New York Times, Amanda Ripley, Oct. 20, 2017, https://www.nytimes.com/2017/10/20/upshot/a-big-test-of-police-body-cameras-defies-expectations.html

More on the Pandemic

Recent discussion regarding contact tracing: https://www.latimes.com/politics/story/2020-04-26/privacy-americans-trade-off-trace-coronavirus-contacts
WASHINGTON — It is a big promise from Silicon Valley to a nation looking for ways to be freed from home confinement: Smartphones could discreetly detect those who may have COVID-19 and nudge them to quarantine, blunting renewed outbreaks as Americans start to once again venture out. But as tech firms lay the foundation for a potentially massive digital contact-tracing infrastructure, Washington is grappling with whether such technology can work without becoming a hulking, invasive surveillance system.

Coronavirus and app downloads: what you need to know about protecting your privacy, The Guardian, 3/31/20
This article cautions readers to pay close attention to permissions they give to apps that they use and download. It also discusses the use of the Australian contact tracing app. The Australian government said that their privacy laws allow jurisdictions to share personal information during this pandemic, but urges the local governments to handle personal information with the utmost caution. -Carlin Cherry
Final Exam

The Final exam for INF529 will be held Friday May 8th (next Friday) from 11AM to 1PM PDT.
– You will receive a dropbox link in advance of the exam.
– At 11AM two files will appear in the dropbox (and copies will also be emailed) containing:
• A Word document with the exam
• A text document with the exam
– You will complete the exam in Word or any other text editor of your choice and email the exam back by 1:15 PM (15 minutes added for logistics, etc.)
– The exam will include a self certification that you neither gave nor received any assistance during the exam.
Mid-term Exam Discussion – Q1

Expectations of Privacy (30 points)
Although intended by US courts to apply to governments' access to business records, the "third party doctrine" is actually a very accurate statement of what happens to our data today when we provide the data to third parties. Although privacy regulations and communicated privacy policies may tell us otherwise, there is always the danger (and expectation) that our data will get out if we provide it to third parties. In this question I am concerned with our actual expectations of privacy, specifically with respect to whom our information may be provided (including for government and commercial purposes) and how it may be used.
a) List some data (or actions that you might take) that leave you with zero expectations of privacy with respect to the data you have provided. For some of the examples of this "non-private" data, can you think of examples of ways that we expect the data should not be used? (10 points)
b) Explain some of the changes to technology that have resulted in the disclosure of data that is 'non-private' (i.e. in plain view) having a significant impact on our privacy when combined with similar data. (10 points)
c) Provide several examples (possibly from current events) where data that has been entrusted to a third party (and which is not in plain view) has been used in unexpected ways, violating the user's privacy expectations. (10 points)
Mid-term Exam Discussion – Q2

The most common manner by which adversaries steal our personal data is through impersonation. When our data is stored on our local device, or on the servers of social media and cloud services, the data is supposed to be accessed only by authorized users. If an adversary can pose as a different user for the purpose of making requests, then they can use the privileges associated with the identity that has been impersonated. (30 points)
a) List some of the ways that an adversary is able to make requests posing as a different user. There are at least two significantly different ways that this can be accomplished. (10 points)
b) What are some of the approaches by which you can mitigate the impact of such impersonation activities? (By mitigate, I specifically do NOT mean prevent the impersonation from occurring; what I mean is that you should take steps to ensure the least resulting impact when impersonation does occur.) (10 points)
c) What are the three main approaches that computer systems can use to confirm identity (authentication)? What are the tradeoffs between the different approaches, and what steps can a system designer or an end user take to improve the effectiveness of the authentication process? (20 points)
Mid-term Exam Discussion – Q3

One definition of privacy discussed in class is that privacy is the right to be let alone. By its nature, social media is intrusive. In this question I want you to discuss some of the ways that social media is problematic for our security and privacy.
a) When we use social media, we voluntarily provide information that reveals our most sensitive characteristics: our likes and dislikes, who our friends are (i.e. our social network itself), our daily schedule, planned travels, the food we eat, etc. Discuss some of the ways that this data is used (i.e. for the purposes that we choose to use these sites) and mis-used (how the sites use this data in ways that are not necessary to the benefit of the end-user, including ways that the user is "monetized"). (10 points)
b) Discuss some ways that social media may be used to control us, i.e. how can it more effectively influence our actions and our speech than other media? Consider how it can also affect the actions and speech of those that are not actively using such social media platforms. Specifically consider the incentives and disincentives provided through the social media platform itself. (10 points)
c) Social media platforms touch many of our devices, and our social media "timelines" are often integrated with our other internet connected activities. How does this integration with our "timeline" affect the privacy of the data associated with our other activities? (10 points)
Other Privacy Preserving Technologies

File Sharing
End to end Encryption
Anonymization
TOR
More on the Dark Web
File Sharing
• Freenet, BitTorrent, and related protocols and applications support the decentralized storage and distribution of files on the internet.
• Originally intended to provide repositories for data that could not be "silenced", the contents of files are spread across many servers, with duplicate pieces. These pieces are reassembled when users request access to the files.
• They are often used to share protected content in violation of copyright.
BitTorrent (figure from Wikipedia)
• Dangers to users of file sharing services:
– Most are configured by default to make your machine a distribution point. Download a file, and others may get that file from you.
– Or worse, files you never requested can be loaded onto your computer and retrieved by others.
• Comparison with TOR
File Encryption
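The piece-based distribution described under File Sharing can be sketched in a few lines: each piece is checked against a known digest before reassembly. This is a simplified illustration, not the real BitTorrent wire protocol; the piece size and function names here are invented.

```python
import hashlib

PIECE_SIZE = 4  # tiny pieces for demonstration; real clients use 256 KiB or more

def split_into_pieces(data: bytes):
    """Split a file into fixed-size pieces and record each piece's SHA-1 digest."""
    pieces = [data[i:i + PIECE_SIZE] for i in range(0, len(data), PIECE_SIZE)]
    digests = [hashlib.sha1(p).hexdigest() for p in pieces]
    return pieces, digests

def reassemble(pieces, digests):
    """Verify every piece (possibly fetched from different peers), then join."""
    for piece, expected in zip(pieces, digests):
        if hashlib.sha1(piece).hexdigest() != expected:
            raise ValueError("corrupt piece - refusing to assemble")
    return b"".join(pieces)

pieces, digests = split_into_pieces(b"hello world!")
assert reassemble(pieces, digests) == b"hello world!"
```

Because every piece carries its own digest, pieces can be downloaded from untrusted peers in any order and still be verified independently.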

• There are many tools and packages available to encrypt individual files or entire drives. Among these are the whole drive encryption discussed in the intro class, but software tools are also available.
• PGP file encrypt – part of the PGP package discussed earlier – allows encryption of files or folders using the public key of an intended recipient (or yourself).
• TrueCrypt was for some time the best option for file encryption, but the last release removed the ability to encrypt files, and was accompanied by statements urging that it not be used. It is widely believed that the previous version is safe.
This Week
The Dark Web
• Readings:
– Time Magazine, The Secret Web: Where Drugs, Porn and Murder Live Online, November 11, 2013.
– It's About To Get Even Easier to Hide on the Dark Web, Wired, 1/28/2017.
– https://www.vice.com/en_us/article/ezv85m/problem-the-government-still-doesnt-understand-the-dark-web
– US government funds controversial Dark Web effort
Anonymization

• For internet communication (email, web traffic) even if contents are protected, traffic analysis is still possible, providing information about what sites one visits, or information to the site about your identity.

• Tools are available that will hide your addresses
– Proxies
– Networks of Proxies
– Onion Routing and TOR
Anonymizer and similar services

• Some are VPN based and hide IP addresses.
• Some are proxy based, where you configure your web browser.
• Need the proxy to hide cookies and header information provided by the browser.
• You trust the provider to hide your details.
• Systems like TOR do better because you don't depend on a single provider.
TOR
• Originally developed by the US Navy to protect Internet communications
• The problem:
– Internet packets have two parts – header and payload
– Even if the payload is encrypted, the header is not
– The header lists originator and destination nodes – all nodes along the way can read this information
• Why might this be a problem:
– Law enforcement may not want it known they are visiting a site
– General privacy protection.
TOR

• Continued development and improvement with US funding (Dept of State) • SAFER project: • Develop improvements or similar technologies that are less vulnerable to persistent attempts to track users, e.g. dissidents, etc. TOR

From Engadget, 7/28/2014
Russia offers a $110,000 bounty if you can crack Tor
Countries that have less-than-stellar records when it comes to dissenting voices must really, really hate Tor. Coincidentally, Russia's Interior Ministry has put out a bounty of around $110,000 to groups who can crack the US Navy-designed privacy network. After the country's vicious crackdown on dissenting voices back in 2012, protestors who hadn't escaped or been jailed began using anonymous internet communication as their first line of defense against the Kremlin. If you're considering taking part in the challenge (and earning yourself a tidy stack of cash to quell your conscience), be warned -- the bounty is only open to organizations that already have security clearance to work for the Russian government.
TOR - Fundamentals

• Origin node accesses the list of TOR nodes and creates the packet:
– Starts by creating a packet consisting of payload and header – the header contains the desired destination node and the final TOR node in the zigzag route
– Now treats the above packet as a payload and creates a header with origin and destination consisting of two TOR nodes
– This is repeated until the final packet contains a header with the original source node and the first TOR node identified
• …Hence the term "Onion Routing"
TOR - Fundamentals
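The layered construction above can be illustrated with a toy sketch. A XOR keystream stands in for real encryption, and the per-hop keys and route are invented; actual Tor negotiates symmetric keys with each relay and carries next-hop headers inside each layer.

```python
import hashlib, itertools

def xor_layer(key: bytes, data: bytes) -> bytes:
    """Toy symmetric cipher: XOR against a SHA-256 counter keystream.
    Applying it twice with the same key recovers the original bytes."""
    stream = (b for c in itertools.count()
              for b in hashlib.sha256(key + c.to_bytes(4, "big")).digest())
    return bytes(x ^ k for x, k in zip(data, stream))

# Hypothetical three-hop zigzag route: one key shared with each TOR node.
route = [b"key-node1", b"key-node2", b"key-node3"]
payload = b"request for destination"

# The origin wraps innermost-first, so node1's layer ends up outermost.
packet = payload
for key in reversed(route):
    packet = xor_layer(key, packet)

# Each node along the route peels exactly one layer and forwards the rest;
# only the last node ever sees the plaintext payload.
for key in route:
    packet = xor_layer(key, packet)
assert packet == payload
```

Peeling the layers in route order mirrors how each relay learns only its immediate neighbors, never the full path.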

Source cybersolutons.ga and yourdictionary.com TOR - Fundamentals

[Figure: a packet travels from the Source Node through several TOR nodes (T) in a zigzag route to the Destination Node]
TOR - Fundamentals

• List of TOR nodes periodically changes • Zigzag route is periodically changed

• Not totally foolproof:
– If a non-TOR browser is opened within the TOR browser, security measures are void – basically going back to "direct routing"
– Someone monitoring the source and destination nodes may note synchronization of packets being sent/received.
– …to avoid: increase TOR traffic
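The second caveat – correlating traffic seen at both ends – can be made concrete with a toy sketch. The timestamps are invented, and real correlation attacks are statistical rather than this simple threshold check.

```python
def correlate(src_times, dst_times, max_delay=0.5):
    """Fraction of source packets followed shortly by a destination packet."""
    matches = sum(
        1 for t in src_times
        if any(0 <= d - t <= max_delay for d in dst_times)
    )
    return matches / len(src_times)

# An observer logs send times at the suspected source and arrival times
# at the destination; packet contents are never read.
alice_sends   = [1.00, 2.10, 3.50, 7.20]
site_receives = [1.30, 2.40, 3.80, 7.45]   # same pattern, small network delay
unrelated     = [0.20, 4.90, 6.10, 9.00]

assert correlate(alice_sends, site_receives) == 1.0   # flows line up
assert correlate(alice_sends, unrelated) == 0.0       # no timing match
```

Padding and additional TOR traffic blur exactly this kind of alignment, which is why the slide suggests increasing TOR traffic.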

Deep Web – TOR (These are old addresses)

• TOR (https://www.torproject.org/about/overview.html.en)

• http://deepweblinks.org/ - Lists sites in deep web

• http://ybp4oezfhk24hxmb.onion/ - lists a hitman website

• http://xfnwyig7olypdq5r.onion/ - lists a USA Passport site

• http://jv7aqstbyhd5hqki.onion/ - a hackers site

• http://2ogmrlfzdthnwkez.onion/ - rent-a-hacker

• http://www.infosniper.net/
• TorSearch - http://kbhpodhnfxl3clb4.onion/
Discussion

• Readings:
– Society deserves privacy, but at what cost.
– Who defines "good use"
– Dark v. Deep Web
– How to control the dark web (technically)
INF529: Security and Privacy In Informatics
A Bit More on IoT

Prof. Clifford Neuman

Lecture 14 24 April 2020 Online via Webex Good Practices / Isolation

• For manipulators • How we connect – Pairing with local controller – Security of Controller then becomes issue • Local Governor – No override to unsafe states • Problems arise from conflict between always on access and need to protect. • Push data from device, rather than pull/poll. – But that creates power/efficiency issues Accessible Telemetry

• GP Devices (smartphones, tablets, laptops)
– More vulnerable to malware and other compromise
– If compromised, can collect even more data than we have configured them to collect.
• Telemetry:
– Audio, Video, Location, Vibration
Major Issues with Many Home IoT Devices

• Many of these devices are general purpose
– The GP interface is hidden, and the user only sees the application running on top of Linux or another platform.
– Many IoT devices are not updated/patched regularly to address new vulnerabilities that are discovered. Or updates occur automatically without permission of the owner.
– Many devices enable inbound access through your Firewall.
– An IoT device is a full-fledged device on your home network, and if compromised from outside, allows an attacker node inside your firewall to attack or observe other activity.
– Many users leave their devices with the default or no access controls.
– Many devices enable "open access" to users within the local network segment. (Open or hacked wifi and other IoT devices can be an issue.)

Mark Ward - BBC News – 25 February 2016 My home is under attack - Right now, skilled adversaries are probing its defences seeking a way in. They are swift, relentless and smart. No weakness will escape their notice. But I am not without defences. I've tried to harden the most vulnerable devices to stop them being compromised and I've set up warning systems that should alert me if the attackers get inside. In the end, all that effort was for nothing because the attackers found so many ways to get at me and my home network. And, they said, even if the technology had defeated them, the weakest link of all - me - would probably have let them in. Swiss cheese - I found out just how severely compromised my home network was in a very creepy fashion. I was on the phone when the web-connected camera sitting on the window sill next to me started moving. The lens crept round until it pointed right at me. I knew that the attackers were on the other end watching what I was doing, and potentially, listening to the conversation. It is a gadget my children and I have used to see if any wildlife passes through our garden and one which many people have for home security or as an alternative baby monitor. I was lucky that I knew my attackers who, at that moment, were sitting in my living room waiting to show me how straightforward it was to subvert these domestic devices. The picture they took of me via the camera was evidence enough. Standards (or lack of any)

• By default, there are no set rules/standards for designing the architecture
• Developments from the past year: https://www.forbes.com/sites/aarontilley/2016/07/27/two-major-internet-of-things-standards-groups-strike-alliance/#1b42c1cd4520

• This year, US Department of Commerce finally took note of the issue that IoT standards cannot be left to market. www.zdnet.com/article/iot-standards-cannot-be-left-to- the-market-us-department-of-commerce/

Slide by Apurv Tiwari

Inferences from Home Sensors

• Your daily Routine
– When you leave, when you get home, what is the best time to burglarize your house.
• What television programs you watch.
– No more "Nielsen families" – your TV or set top box collects this data and sends it to your provider.
• Power consumption can tell a lot about your activities too.
At Work and "On the Road"
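The power-consumption point above can be made concrete with a toy occupancy inference; the meter readings and the baseline value are invented for the example.

```python
# Hourly smart-meter readings (hour -> average watts); invented data.
readings = {0: 150, 6: 160, 8: 90, 10: 85, 12: 90, 17: 480, 20: 620, 23: 200}

BASELINE_WATTS = 120  # fridge and standby devices alone; hypothetical value

def likely_away_hours(readings, baseline=BASELINE_WATTS):
    """Hours where draw falls to baseline - nobody is switching anything on."""
    return sorted(hour for hour, watts in readings.items() if watts < baseline)

# The meter alone suggests the occupants are out during the workday.
assert likely_away_hours(readings) == [8, 10, 12]
```

Even this crude threshold reveals a daily routine; finer-grained readings can reveal individual appliances.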

• We pair with devices all the time
– For printing, beaming data
– NFC for payment
• Attaching to WiFi Hotspots
– We broadcast the SSIDs with which we usually connect.
– Evil twin or rogue free WiFi
• Whenever we attach, it creates a path for malware infection, or for data to be collected by the "peer".
– E.g. the contact list on Bluetooth-connected audio in a rental car.
In Our Vehicles

• Our vehicles are part of the IoT
– OBDII
– Wifi Hotspots
– Entertainment systems
– Bluetooth connectivity to our cellphones • Discussed earlier
– Navigation
• Is your car Spying on You – NBC LA – November 15, 2015
• Consider multi-step attacks
– Cellphone malware – Entertainment – OBDII
• Security and Privacy in Personal Devices – (additional elaboration on this topic)

• Warning Over Chinese Mobile Giant Xiaomi Recording Millions Of People’s ‘Private’ Web And Phone Use Forbes, April 30, 2020.

So, more generally, what are the privacy issues around mobile devices?
– The smart phone is the most privacy invading device we have.
• Phones have access to massive sensitive data (esp. location data).
• Phones have access to our data in the cloud.
• Some of us use simple unlock codes (or none).
– We carelessly download numerous apps to our devices.
• These apps are sandboxed but gain access to some of this sensitive data
– We are careless in terms of the permissions we grant to the apps
– Vetting of apps in app stores is not particularly effective
– Apps can exploit device OS vulnerabilities to gain additional access
– Bluetooth and wired connections (even power) can allow exploits
• Juice Jacking
Review for Final Exam
Material for Review

• Slides from all lectures • Linked documents from web site • Linked documents from emailed assignments • Linked documents from lectures • Current event discussion from class Review from Mid-term

First half of class:
• Attacks
– Malicious Code
– Social Engineering
– Attack Life Cycle
• Overview of security and privacy
– What are they, why we have neither
– Relationship between the two

• Understanding our data in the cloud – and Privacy
– What data exists and who can access it (both officially and unofficially)
– What is the data used for
– What can it be potentially used for
• Expectations of Privacy (new lecture for this year)
• Overview of Technical Security
– Confidentiality, Integrity, Availability
– The role of Policy
– Risk Management from multiple perspectives
• Social Media and Social Networks
• Mechanisms
– Encryption/Key Management, Firewalls, Authentication, Digital Signatures, Authorization, Detection, Trusted hardware
Second Half of Course
What is Big Data
• Processing of large and complex data sets.
– Often with multiple structures.
– Data is mined to find trends, relationships, and correlations.
• Danger
– By combining information from multiple sources, more can be inferred than specifically disclosed.
– Inferences are imprecise.
– The algorithms learn discrimination.
Can algorithms illegally discriminate

CNBC – and White House report
But when it comes to systems that help make such decisions, the methods applied may not always seem fair and just to some, according to a panel of social researchers who study the impact of big data on the public and society.

The panel that included a mix of policy researchers, technologists, and journalists, discussed ways in which big data—while enhancing our ability to make evidence-based decisions—does so by inadvertently setting rules and processes that may be inherently biased and discriminatory.
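One mechanism behind such bias – a "neutral" rule that keys on a proxy variable correlated with a protected attribute – can be sketched with invented data (the zip codes, groups, and historical rates are all hypothetical):

```python
# The protected attribute ("group") is never used directly, but zip code
# correlates with it, so decisions still track the protected group.
applicants = [
    {"zip": "90001", "group": "A", "qualified": True},
    {"zip": "90001", "group": "A", "qualified": True},
    {"zip": "90210", "group": "B", "qualified": True},
    {"zip": "90210", "group": "B", "qualified": True},
]

# Historical approvals were biased by neighborhood, not qualification:
history = {"90001": 0.2, "90210": 0.9}   # past approval rate per zip

def approve(applicant):
    """'Neutral' rule: approve if the zip's historical rate is high."""
    return history[applicant["zip"]] > 0.5

rate = {}
for g in ("A", "B"):
    group = [a for a in applicants if a["group"] == g]
    rate[g] = sum(approve(a) for a in group) / len(group)

# Equally qualified groups end up with very different approval rates.
assert rate == {"A": 0.0, "B": 1.0}
```

Dropping the protected attribute from the inputs does nothing here: the bias rides in on the correlated proxy, which is the pattern the panel is describing.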

The rules, in this case, are algorithms, a set of mathematical procedures coded to achieve a particular goal. Critics argue these algorithms may perpetuate biases and reinforce built-in assumptions. Also http://www.nextgov.com/big-data/2017/02/cfpb-wants-know-how-alternative-data-changes-credit-scores/135695/
Today's Topic

• The legal and ethical battle between the FBI and Apple over retrieval of data on a cell phone.
• Discussion regarding hacking techniques used by intelligence agencies as disclosed by Wikileaks

But before all this: • Some foundations to guide us • Broader societal discussion based on disclosure by Snowden and others in Wikileaks. • And why national security depends on cyber security. Nexus

What is legal depends on the legal system that applies. There must be a nexus, that is a relationship or connection.

Establishing a nexus on the internet is difficult: there will be multiple parties, and multiple places at which activities occur.
Much of our Discussion is US Centric

• While one must consider all affected parties, regardless of where they are, discussion of the relevant laws in this lecture will be centered on the US, unless otherwise stated.

• But our discussion of ethics will focus on broader principles that are less U.S. centric.
Ethical Issues

Apple v FBI:
• Authority to search
– Device owned by SB County
– Court order based on a showing of probable cause.
– Genuine probable cause exists in this case
• Broader separate issue
– Intentional vulnerabilities (back doors) in phones sold to other customers
– Many problems with this
Wikileaks Disclosure:
• Authority to "hack"
• Broader separate issue
Broader Public Policy Issues

Apple v FBI:
• Impact of Required Backdoors
• Requirements to provide access to existing data.
Wikileaks Disclosures:
• Use of existing exploits
• Duty to protect?
Technical Issues

Apple v FBI:
• Data on Phone
• Security of Software Upgrades – must be applied.
Wikileak Disclosures:
• IoT Security
• Sensors Everywhere
Disclosure of Techniques in Legal Proceedings

• In FBI hacks, tech firms get left in the dark as feds resist call to divulge secrets - Los Angeles Times, March 31, 2016.
– In the US, when evidence is presented in court, the defense has the opportunity to refute it, and due process may require disclosure of the methods through which the evidence was collected.
– In many cases, this limits the prosecutor's ability to present certain pieces of evidence.
Tools – A Sampling

• Communication
– Email communication
– Website "secure email"
– PGP / S/MIME
– SSL / TLS
– Virtual Private Networks
• Anonymization
– Proxies
– TOR
• Storage Encryption
– Truecrypt
• Messaging Apps
– Wickr
What is Civil Law

• Civil law is concerned with private relations between parties rather than criminal complaints by a government against an individual. – This is in contrast to criminal law. – Includes contract law. – Includes tort law. • If a tort (wrong) is committed we may be able to settle or litigate over actual, punitive, or stipulated damages, for “specific performance”, or injunctive relief. Civil Law and Privacy

• Contracts and privacy and security – Privacy policy statement • Discovery and Privacy • Laws protecting privacy of consumers – HIPAA – FERPA (Buckley Amendment) – Fair Credit Reporting Act – Others – Regulations by FTC (and at one point FCC) – Data Breach Notification Laws Discovery

When bringing suit (litigating) civil matters, all parties have the right to compel disclosure of facts that may benefit their case. – The process of forcing disclosure of such information is called Discovery. – If you are a party to the suit then you may be required to produce “discoverable” information. • A good reason not to keep some things to begin with. • A good reason to have a data retention/destruction policy – It is illegal to destroy the data after you have reason to believe that it will become subject to discovery. • Third party doctrine applies – Data about you may be obtained from third parties – You may have an opportunity to object to such disclosure, but not always. Health Insurance Portability and Accountability Act • Health Insurance Portability & Accountability Act of 1996 (45 C.F.R. parts 160 & 164).

Provides a framework for: • Nationwide protection of patient confidentiality • Security of electronic systems • Standards for electronic transmission of health information. Protected Health Information

Protected Health Information (PHI) is individually identifiable health information that is:
– Created or received by a health care provider, health plan, employer, or clearinghouse that:
• Relates to the past, present, or future physical or mental health or condition of an individual;
• Relates to the provision of health care to an individual
• Or payment for provision of health care
Includes information in the health record such as:
– Encounter/visit documentation
– Lab results
– Appointment dates/times
– Invoices
– Radiology films and reports
– History and physicals (H&Ps)
– Patient Identifiers
FERPA

• The Family Educational Rights and Privacy Act (FERPA) (20 U.S.C. § 1232g; 34 CFR Part 99) is a Federal law that protects the privacy of student education records. The law applies to all schools that receive funds under an applicable program of the U.S. Department of Education. The law gives parents or "eligible students" (those who are over 18 years old) certain rights with respect to a student's educational records. FERPA

• The Family Educational Rights and Privacy Act (FERPA), a Federal law, requires that schools, with certain exceptions, obtain your written consent prior to the disclosure of personally identifiable information from your education records.
FERPA – Directory Information

• Directory information is information contained in a student's education record that would not generally be considered harmful or an invasion of privacy if disclosed. FERPA requires each institution to define its directory items, and such information may be quite broad. Students can request non-disclosure of such directory information. • USC’s FERPA Notices Fair Credit Reporting Act

• The Fair Credit Reporting Act, 15 U.S.C. § 1681 (“FCRA”) is U.S. Federal Government legislation enacted to promote the accuracy, fairness, and privacy of consumer information contained in the files of consumer reporting agencies. The FCRA regulates the collection, dissemination, and use of consumer information, including consumer credit information. Fair Credit Reporting Act

• Defines purposes for which such reports may be requested. • Imposes requirements on the reporting agencies, users of reports, and furnishers of information. – Including Notice requirements. – Permissible purposes. – Ability to correct information. Federal Trade Commission

• The FTC has been the chief federal agency on privacy policy and enforcement since the 1970s, when it began enforcing one of the first federal privacy laws – the Fair Credit Reporting Act. Since then, rapid changes in technology have raised new privacy challenges, but the FTC's overall approach has been consistent: The agency uses law enforcement, policy initiatives, and consumer and business education to protect consumers' personal information and ensure that they have the confidence to take advantage of the many benefits of the ever-changing marketplace.
– FTC's Privacy Report: Balancing Privacy and Innovation
– The Do Not Track Option: Giving Consumers a Choice
– Making Sure Companies Keep Their Privacy Promises to Consumers
– Protecting Consumers' Financial Privacy
– The Children's Online Privacy Protection Act (COPPA)
Data Breach Notification Laws

• “Forty-seven states, the District of Columbia, Guam, Puerto Rico and the Virgin Islands have enacted legislation requiring private or governmental entities to notify individuals of security breaches of information involving personally identifiable information.” (according to the National Conference of State Legislatures)

• Security breach laws typically apply to particular classes of business and define personally identifiable information such as name combined with SSN, drivers license or state ID, account numbers. They also define what constitutes a breach (e.g., unauthorized acquisition of data); requirements for notice; and exemptions based on whether the information was encrypted, if disclosure would impede law enforcement investigations, etc.
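The statutory pattern just described – a name combined with a sensitive identifier triggers notice, with an exemption for encrypted data – can be sketched as a simple check. The field names are hypothetical, and real statutes vary by state.

```python
# Roughly: many state breach laws define PII as a name in combination with
# SSN, driver's license, or account number, and exempt encrypted data.
SENSITIVE_FIELDS = {"ssn", "drivers_license", "account_number"}

def notification_required(exposed_fields, encrypted=False):
    """Notice is typically triggered by name + a sensitive identifier,
    unless the breached data was encrypted."""
    if encrypted:
        return False
    return "name" in exposed_fields and bool(SENSITIVE_FIELDS & set(exposed_fields))

assert notification_required({"name", "ssn"}) is True
assert notification_required({"name", "email"}) is False
assert notification_required({"name", "ssn"}, encrypted=True) is False
```

A real compliance check would also have to consider the entity class, state-specific definitions, and the law-enforcement-delay exemptions the slide mentions.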

• Federal law is by sector (e.g. FERPA, HIPAA, etc). See comparison of various laws. International Context and Nexus

• Where does an internet activity take place – And by implication, the laws of which jurisdiction apply • If I run a server in California – It can be accessed from anywhere • If my server runs in the cloud – I might not even know where the computation takes place. • If someone in eastern Europe steals funds from your bank account. – Who will catch him and prosecute? Privacy and Free Speech Implications

• Ability of courts to order disclosure of information
– From a company based in the same country but for customers located elsewhere.
– When the data is stored in another country.
– When the company does business in the country requesting the information.
– Laws may conflict – you will end up breaking some of them.
• When some countries try to limit what can be said in forums or publications.
– Can they apply those laws against web servers hosted elsewhere?
Free Speech and Censorship

• In the US, free speech rights dominate. One can speak their opinion and beliefs, and the government cannot prevent one from doing so. There are limits, such as "shouting fire in a crowded theater", or speech inciting violence (e.g. telling one's followers to commit a violent crime).

• Other countries have other kinds of specific prohibited speech.
– This could be repressive regimes that ban or censor speech that is unflattering of the government or government officials.
– It could be prohibitions on "hate" speech, i.e. anything derogatory about particular classes of individuals.
– It could be statements or reports on cases pending in the courts.
– It could be private information about specific individuals.
– Some countries have very specific prohibited topics.
– It could be blasphemous speech.

Question:
– How can this be handled when the speech in question occurs outside one's jurisdiction?
– Are we reduced to only saying things that are legal in all jurisdictions?
Court Ordered Access to Data

Microsoft Wins Appeal on Overseas Data Searches – New York Times, Nick Wingfield and Cecilia Kang, July 14, 2016

For the last few years, American technology giants have been embroiled in a power struggle with the United States government over when authorities get to see and use the digital data that the companies collect.

On Thursday, Microsoft won a surprise victory in one such legal battle against the government over access to data that is stored outside the United States.

In the case, the United States Court of Appeals for the Second Circuit reversed a lower court’s ruling that Microsoft must turn over email communications for a suspect in a narcotics investigation stored in a Microsoft data center in Dublin. The case had attracted widespread attention in the technology industry and among legal experts because of its potential privacy implications for the growing business, with implications for internet email and online storage, among other services.

Had the United States government prevailed, Microsoft and others warned, it would set a dangerous precedent that would make it increasingly difficult to resist orders from foreign courts demanding data, such as email from human rights activists or political dissidents. Corporate and government customers abroad also might be unwilling to use cloud services from Microsoft if they thought their data could be seized by American courts, Microsoft said. Nexus for Cyber Crime

Where does a cyber-crime occur, and under whose laws do we determine what is legal?
– Crimes involving victims
• Financial crimes
• Data theft
• Denial of service
– Facilitation of activities that are illegal in some jurisdictions
• Gambling (free trade issues as well)
• Sale/trafficking of illegal goods
• What if we don't know where the server resides
– What jurisdictions can issue orders for searches, etc.
Final Exam Questions (prev years)

• http://csclass.info/USC/INF529/inf529-s19-final.pdf

• http://csclass.info/USC/INF529/inf529-s18-final.pdf
