Case 3:17-cv-01738-JSC Document 1 Filed 03/29/17 Page 1 of 32


Jack Stone
Diamond Heights 1-101 Saiwai-chou 21-18
Kanagawa-ken, Chigasaki-shi 253-0052, Japan
Phone: (81) 070-6951-2337
Email: email@stackjones.com

UNITED STATES DISTRICT COURT
NORTHERN DISTRICT OF CALIFORNIA

Jack Stone, Plaintiff,

vs.

Facebook Inc. and Subsidiaries (a Consolidated Group), et al., Defendant.

Case No.:

COMPLAINT

1. Parties in This Complaint

a. Plaintiff.
Name: Jack Stone
Address: Diamond Heights 1-101 Saiwai-chou 21-18, Kanagawa-ken, Chigasaki-shi 253-0052, Japan
Telephone: (81) 070-6951-2337

b. Defendants.
Name: Facebook Inc. and Subsidiaries (a Consolidated Group)
Address: 1 Hacker Way, Menlo Park, California 94025
Telephone: (1) 650-543-4800

Name: Mark Zuckerberg
Address: 1456 Edgewood Drive, Palo Alto, California 94301
Telephone: (1) 650-543-4800

Name: Sheryl Sandberg
Address: 1 Hacker Way, Menlo Park, California 94025
Telephone: (1) 650-543-4800

2. Jurisdiction

a. The plaintiff's case belongs in federal court under federal question jurisdiction because the action involves federal laws and federal rights. The federal laws involved in this matter include: (1) Title 18, U.S. Code, Part I, Chapter 33, Section 701; (2) First Amendment rights to free speech and access to a public forum under Rosenberger v. Rector and Visitors of the University of Virginia, 515 U.S. 819 (1995), where the Court held that a public forum does not have to be a physical location; (3) the Fourteenth Amendment's broad interpretation of privacy rights; and (4) tort laws, including fraud.

b. This case also belongs in federal court under diversity jurisdiction because the parties to this matter do not reside in the same state. The plaintiff resides in Chigasaki, Japan. The defendants' principal place of business and primary residences are located in Menlo Park, in the county of San Mateo, California. The amount in controversy is also more than $75,000.

3. Venue

The Northern District of California may hear this cause of action because the plaintiff's harm arose from conduct committed by the defendants within the city of Menlo Park, located in the county of San Mateo. Further, Facebook Inc.'s founder, chairman of the board, and chief executive officer, Mark Zuckerberg, and its chief operating officer, director, and member of the Facebook Inc. equity subcommittee, Sheryl Sandberg, both have their primary residences in Menlo Park, a city located within the county of San Mateo, in the state of California.

4. Intradistrict Assignment

This lawsuit should be assigned to San Francisco/Oakland because the defendants' actionable conduct occurred in Menlo Park, a city located inside the county of San Mateo, which is where Facebook's principal place of business and the primary residences of the defendants named herein are located.

5. Statement of Facts and Claims

A. Coercion of private data, privacy violations, and national security. Facebook tracks users, reads private messages, gleans information from those messages, thereafter provides that information to marketers, and engages in surveillance for government agencies in violation of Fourth Amendment warrant requirements.

5a1. On May 5th, 2015, Facebook Inc. ("Facebook") blocked access to a Facebook page, http://facebook.com/stack.jones, which the plaintiff had operated since January of 2009. Blocking access to the page resulted in irreparable harm to the plaintiff, including the loss of business contacts. This was not the first time Facebook blocked access to the page in question. Each blocking was committed for the sole purpose of coercing the extraction of private information, including names, addresses, and various forms of identification, including plaintiff's photographic images, which the company used for facial recognition technology purposes. Thereafter, Facebook provided that private information to third parties, including governmental agencies, without informing the plaintiff or obtaining informed consent.

5b1. The blocking of the page that took place on May 5th, 2015 occurred mere minutes after the plaintiff had filed a complaint with the company that photographic images of plaintiff's fifteen-month-old infant child, which were placed in a private folder with restricted access and not available to the public, were displayed publicly on Facebook.

5c1. Facebook was reckless in allowing the images of the plaintiff's child to be displayed publicly. Regardless, after obtaining notice to remove said images from public viewing, Facebook did not take any action whatsoever to have the photographic images of the plaintiff's fifteen-month-old infant child removed from public viewing, but instead responded by blocking plaintiff's access to http://facebook.com/stack.jones.

5d1. The blocking of the page was retaliatory in nature and resulted in preventing the plaintiff from obtaining access to more than 1,200 contacts, including phone numbers, email addresses, and physical addresses. The blocking also prevented the plaintiff from being able to communicate with business entities, family members, and friends.

5e1. The blocking of the page in question also prevented the plaintiff from taking the necessary steps to ensure the photographic images of plaintiff's infant child were removed from public viewing, as access to Facebook is blocked for those who do not have a Facebook account.

5f1. Facebook's permitting photographic images of the plaintiff's infant child to be viewed publicly resulted in an invasion of privacy. Facebook has a history of disregard for end-user rights; the company unceasingly invades the right of privacy and is the subject of numerous class action suits.

5g1. The plaintiff contacted Facebook's Appeals Department and was subjected to endless harassment, which included Facebook demanding that the plaintiff "identify" himself by providing two copies of United States government-issued forms of identification, including a passport and other forms of identification, in violation of Title 18, U.S. Code, Part I, Chapter 33, Section 701.

5h1. Other forms of identification Facebook demanded include copies of military identification, social security cards, green cards, voter identification, driver licenses, credit cards, bank statements, medical records, marriage certificates, insurance cards, paycheck stubs, utility bills, yearbook photos, etc. Facebook demanded that photographic images be attached to these forms of identification. These demands were legally impossible, as nearly none of these forms of identification, including social security cards or marriage certificates, have photographic images embedded in them.

5i1. Title 18, U.S. Code, Part I, Chapter 33, Section 701 makes it a crime, punishable by a fine and imprisonment of up to six months for each offense, to photocopy much of the documentation Facebook had demanded the plaintiff turn over to the company. Title 18, U.S. Code, Part I, Chapter 33, Section 701 does not permit government-issued identification to be photocopied and turned over to any third party for any purpose whatsoever. This would especially apply to Facebook, where the company is notoriously known for failing to recognize the right of privacy, and where the company has been providing that private data to marketers and other entities, including governmental agencies, without prior knowledge or consent.

5j1. The plaintiff refused to be coerced by Facebook into submitting the demanded private data, citing Title 18, U.S. Code, Part I, Chapter 33, Section 701 and stating that doing so could result in the plaintiff being convicted of crimes, and that said convictions could result in both fines and imprisonment of up to six months for each offense. Regardless, Facebook continued to demand that the plaintiff turn over copies of federally issued forms of identification in violation of Title 18, U.S. Code, Part I, Chapter 33, Section 701. Facebook continued to refuse to reinstate plaintiff's access to http://facebook.com/stack.jones, and continues to coerce other end users to violate federal law related to Title 18, U.S. Code, Part I, Chapter 33, Section 701.
Recommended publications
  • Deepfakes and Cheap Fakes
    DEEPFAKES AND CHEAP FAKES: THE MANIPULATION OF AUDIO AND VISUAL EVIDENCE. Britt Paris, Joan Donovan. Contents: Executive Summary; Introduction; Cheap Fakes/Deepfakes: A Spectrum; The Politics of Evidence; Cheap Fakes on Social Media; Photoshopping; Lookalikes; Recontextualizing; Speeding and Slowing; Deepfakes Present and Future; Virtual Performances; Face Swapping; Lip-synching and Voice Synthesis; Conclusion; Acknowledgments. Author: Britt Paris, assistant professor of Library and Information Science, Rutgers University; PhD, 2018, Information Studies, University of California, Los Angeles. Author: Joan Donovan, director of the Technology and Social Change Research Project, Harvard Kennedy School; PhD, 2015, Sociology and Science Studies, University of California San Diego. This report is published under Data & Society's Media Manipulation research initiative; for more information on the initiative, including focus areas, researchers, and funders, please visit https://datasociety.net/research/media-manipulation EXECUTIVE SUMMARY: Do deepfakes signal an information apocalypse? Are they the end of evidence as we know it? The answers to these questions require us to understand what is truly new about contemporary AV manipulation and what is simply an old struggle for power in a new guise. The first widely-known examples of amateur, AI-manipulated, face swap videos appeared in November 2017. Since then, the news media, and therefore the general public, have begun to use the term "deepfakes" to refer to this larger genre of videos—videos that use some form of deep or machine learning to hybridize or generate human bodies and faces.
  • Digital Platform As a Double-Edged Sword: How to Interpret Cultural Flows in the Platform Era
    International Journal of Communication 11(2017), 3880–3898 1932–8036/20170005 Digital Platform as a Double-Edged Sword: How to Interpret Cultural Flows in the Platform Era DAL YONG JIN Simon Fraser University, Canada This article critically examines the main characteristics of cultural flows in the era of digital platforms. By focusing on the increasing role of digital platforms during the Korean Wave (referring to the rapid growth of local popular culture and its global penetration starting in the late 1990s), it first analyzes whether digital platforms as new outlets for popular culture have changed traditional notions of cultural flows—the forms of the export and import of popular culture mainly from Western countries to non-Western countries. Second, it maps out whether platform-driven cultural flows have resolved existing global imbalances in cultural flows. Third, it analyzes whether digital platforms themselves have intensified disparities between Western and non-Western countries. In other words, it interprets whether digital platforms have deepened asymmetrical power relations between a few Western countries (in particular, the United States) and non-Western countries. Keywords: digital platforms, cultural flows, globalization, social media, asymmetrical power relations Cultural flows have been some of the most significant issues in globalization and media studies since the early 20th century. From television programs to films, and from popular music to video games, cultural flows as a form of the export and import of cultural materials have been increasing. Global fans of popular culture used to enjoy films, television programs, and music by either purchasing DVDs and CDs or watching them on traditional media, including television and on the big screen.
  • Estimating Age and Gender in Instagram Using Face Recognition: Advantages, Bias and Issues. / Diego Couto De Las Casas
    ESTIMATING AGE AND GENDER IN INSTAGRAM USING FACE RECOGNITION: ADVANTAGES, BIAS AND ISSUES. Diego Couto de Las Casas. Dissertation presented to the Graduate Program in Ciência da Computação of the Universidade Federal de Minas Gerais – Departamento de Ciência da Computação, in partial fulfillment of the requirements for the degree of Master in Ciência da Computação. Advisor: Virgílio Augusto Fernandes de Almeida. Belo Horizonte, February 2016. © 2016, Diego Couto de Las Casas. All rights reserved. Catalog record prepared by the ICEx Library - UFMG: Las Casas, Diego Couto de. L337e Estimating age and gender in Instagram using face recognition: advantages, bias and issues. / Diego Couto de Las Casas. – Belo Horizonte, 2016. xx, 80 f. : il.; 29 cm. Dissertation (master's) - Universidade Federal de Minas Gerais – Departamento de Ciência da Computação. Advisor: Virgílio Augusto Fernandes de Almeida. 1. Computing - Theses. 2. Online social networks. 3. Social computing. 4. Instagram. I. Advisor. II. Title. CDU 519.6*04(043). Acknowledgments: I would like to thank everyone who helped me get this far. To my family, for the advice, the suggestions, and all the support over these years. To my colleagues at CAMPS(-Élysées), for the collaborations, the laughs, and the companionship.
  • Artificial Intelligence: Risks to Privacy and Democracy
    Artificial Intelligence: Risks to Privacy and Democracy Karl Manheim* and Lyric Kaplan** 21 Yale J.L. & Tech. 106 (2019) A "Democracy Index" is published annually by the Economist. For 2017, it reported that half of the world's countries scored lower than the previous year. This included the United States, which was demoted from "full democracy" to "flawed democracy." The principal factor was "erosion of confidence in government and public institutions." Interference by Russia and voter manipulation by Cambridge Analytica in the 2016 presidential election played a large part in that public disaffection. Threats of these kinds will continue, fueled by growing deployment of artificial intelligence (AI) tools to manipulate the preconditions and levers of democracy. Equally destructive is AI's threat to decisional and informational privacy. AI is the engine behind Big Data Analytics and the Internet of Things. While conferring some consumer benefit, their principal function at present is to capture personal information, create detailed behavioral profiles and sell us goods and agendas. Privacy, anonymity and autonomy are the main casualties of AI's ability to manipulate choices in economic and political decisions. The way forward requires greater attention to these risks at the national level, and attendant regulation. In its absence, technology giants, all of whom are heavily investing in and profiting from AI, will dominate not only the public discourse, but also the future of our core values and democratic institutions. * Professor of Law, Loyola Law School, Los Angeles. This article was inspired by a lecture given in April 2018 at Kansai University, Osaka, Japan.
  • Bad Actors: Authenticity, Inauthenticity, Speech, and Capitalism
    ARTICLES BAD ACTORS: AUTHENTICITY, INAUTHENTICITY, SPEECH, AND CAPITALISM Sarah C. Haan* ABSTRACT “Authenticity” has evolved into an important value that guides social media companies’ regulation of online speech. It is enforced through rules and practices that include real-name policies, Terms of Service requiring users to present only accurate information about themselves, community guidelines that prohibit “coordinated inauthentic behavior,” verification practices, product features, and more. This Article critically examines authenticity regulation by the social media industry, including companies’ claims that authenticity is a moral virtue, an expressive value, and a pragmatic necessity for online communication. It explains how authenticity regulation provides economic value to companies engaged in “information capitalism,” “data capitalism,” and “surveillance capitalism.” It also explores how companies’ self-regulatory focus on authenticity shapes users’ views about objectionable speech, upends traditional commitments to pseudonymous political expression, and encourages collaboration between the State and private companies. The Article concludes that “authenticity,” as conceptualized by the industry, is not an important value for users on par with privacy or dignity, but that it offers business value to companies. Authenticity regulation also provides many of the same opportunities for viewpoint discrimination as does garden-variety content moderation. * Associate Professor of Law, Washington and Lee University School of Law. The Author thanks Carliss Chatman, Rebecca Green, Margaret Hu, Lyman Johnson, Thomas Kadri, James Nelson, Elizabeth Pollman, Carla L. Reyes, Christopher B. Seaman, Micah Schwartzman, Morgan Weiland, participants in the W&L Law 2019 Big Data Research Colloquium, and participants in the 2019 Yale Freedom of Expression Scholars Conference (“FESC VII”), for their insightful comments on early drafts of this Article.
  • From Editors to Algorithms: A Values-Based Approach to Understanding Story Selection in the Facebook News Feed
    FROM EDITORS TO ALGORITHMS: A Values-Based Approach to Understanding Story Selection in the Facebook News Feed. Michael A. DeVito, Department of Communication Studies, Northwestern University, Evanston, IL, USA. [email protected] http://www.mikedevito.net/ This is a pre-print version of an article that has been accepted for publication in Digital Journalism. For quotation, please consult the final, published version of the article at http://dx.doi.org/10.1080/21670811.2016.1178592. Please cite as: DeVito, M. A. (2016) From Editors to Algorithms: A values-based approach to understanding story selection in the Facebook news feed. Digital Journalism. Ahead of print. doi: 10.1080/21670811.2016.1178592. ACKNOWLEDGEMENTS: The author thanks Eileen Emerson for her research assistance, as well as Kerric Harvey, David Karpf, and Emily Thorson of The George Washington University for their guidance and support during the initial research on this piece. The author also thanks the anonymous reviewers, William Marler, and the members of Aaron Shaw's "Bring Your Own Research" group at Northwestern University for their feedback on the manuscript. FUNDING: This research was partially supported by a thesis grant from The George Washington University's School of Media and Public Affairs, with which the author was previously affiliated. Facebook's News Feed is an emerging, influential force in our personal information flows, especially where news information is concerned.
  • Founder Mark Zuckerberg, Eduardo Saverin, Dustin Moskovitz, Chris Hughes
    Facebook (Facebook, Inc.)
    Type: Private
    Founded: Cambridge, Massachusetts[1] (2004)
    Founder: Mark Zuckerberg, Eduardo Saverin, Dustin Moskovitz, Chris Hughes
    Headquarters: Palo Alto, California, U.S. (will be moved to Menlo Park, California, U.S. in June 2011)
    Area served: Worldwide
    Key people: Mark Zuckerberg (CEO), Chris Cox (VP of Product), Sheryl Sandberg (COO), Donald E. Graham (Chairman)
    Net income: N/A
    Website: facebook.com
    Type of site: Social network service
    Registration: Required
    Users: 600 million[5][6] (active in January 2011)
    Launched: February 4, 2004
    Current status: Active
    Facebook (stylized facebook) is a social network service and website launched in February 2004, operated and privately owned by Facebook, Inc. As of January 2011, Facebook has more than 600 million active users. Users may create a personal profile, add other users as friends, and exchange messages, including automatic notifications when they update their profile. Additionally, users may join common interest user groups, organized by workplace, school, or college, or other characteristics. The name of the service stems from the colloquial name for the book given to students at the start of the academic year by university administrations in the USA to help students get to know each other better. Facebook allows anyone who declares themselves to be at least 13 years old to become a registered user of the website. Facebook was founded by Mark Zuckerberg with his college roommates and fellow computer science students Eduardo Saverin, Dustin Moskovitz and Chris Hughes. The website's membership was initially limited by the founders to Harvard students, but was expanded to other colleges in the Boston area, the Ivy League, and Stanford University.
  • CS 182: Ethics, Public Policy, and Technological Change
    CS 182: Ethics, Public Policy, and Technological Change. Rob Reich, Mehran Sahami, Jeremy Weinstein, Hilary Cohen.
    Housekeeping: Recording of information session on Public Policy memo available on class website. Also posted on the website is a recent research study on the efficacy of contact tracing apps (if you're interested).
    Today's Agenda: 1. Perspectives on data privacy. 2. Approaches to data privacy (Anonymization, Encryption, Differential Privacy). 3. What can we infer from your digital trails? 4. The information ecosystem. 5. Facial recognition.
    Perspectives on Data Privacy: Data privacy often involves a balance of competing interests. Making data available for meaningful analysis: for public goods (auditing algorithmic decision-making for fairness, medical research and health care improvement, protecting national security) and for private goods (personalized advertising). Protecting individual privacy: personal value of privacy and respect for individual – thanks Rob!; freedom of speech and activity; avoiding discrimination; regulation: FERPA, HIPAA, GDPR, etc. – thanks Jeremy!; preventing access from "adversaries."
    Anonymization: Basic idea: drop personally identifying features from the data. Example record (Name, SS#, Employer, Job Title, Nationality, Gender, D.O.B, Zipcode, Has Condition?): Mehran Sahami, XXX-XX-XXXX, Stanford University, Professor, Iran, Male, May 10, 1970, 94306, No; Claire, XXX-XX-, Google Inc.
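    The "drop personally identifying features" idea on the anonymization slide can be sketched in a few lines of Python. This is a minimal illustration rather than anything from the CS 182 materials; the column names mirror the slide's example table, while the helper name and the choice of which columns count as direct identifiers are assumptions.

    import pandas as pd

    # Columns treated as direct identifiers in this sketch (an assumption,
    # not a definition taken from the course).
    PII_COLUMNS = ["Name", "SS#", "D.O.B", "Zipcode"]

    def naive_anonymize(df: pd.DataFrame) -> pd.DataFrame:
        # Drop the direct identifiers; quasi-identifiers such as employer,
        # job title, nationality, and gender remain and can still re-identify
        # people when joined with outside data.
        return df.drop(columns=[c for c in PII_COLUMNS if c in df.columns])

    records = pd.DataFrame([{
        "Name": "Mehran Sahami", "SS#": "XXX-XX-XXXX",
        "Employer": "Stanford University", "Job Title": "Professor",
        "Nationality": "Iran", "Gender": "Male",
        "D.O.B": "May 10, 1970", "Zipcode": "94306", "Has Condition?": "No",
    }])
    print(naive_anonymize(records))

    The weakness of this naive pass, the quasi-identifiers it leaves behind, is exactly why the agenda moves on to stronger tools such as differential privacy and to what can be inferred from digital trails.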
  • Predicting User Interaction on Social Media Using Machine Learning Chad Crowe University of Nebraska at Omaha
    University of Nebraska at Omaha DigitalCommons@UNO Student Work 11-2018 Predicting User Interaction on Social Media using Machine Learning Chad Crowe University of Nebraska at Omaha Follow this and additional works at: https://digitalcommons.unomaha.edu/studentwork Part of the Computer Sciences Commons Recommended Citation: Crowe, Chad, "Predicting User Interaction on Social Media using Machine Learning" (2018). Student Work. 2920. https://digitalcommons.unomaha.edu/studentwork/2920 This Thesis is brought to you for free and open access by DigitalCommons@UNO. It has been accepted for inclusion in Student Work by an authorized administrator of DigitalCommons@UNO. For more information, please contact [email protected]. Predicting User Interaction on Social Media using Machine Learning. A Thesis Presented to the College of Information Science and Technology and the Faculty of the Graduate College, University of Nebraska at Omaha, In Partial Fulfillment of the Requirements for the Degree Master of Science in Computer Science, by Chad Crowe, November 2018. Supervisory Committee: Dr. Brian Ricks, Dr. Margeret Hall, Dr. Yuliya Lierler. ProQuest Number: 10974767. All rights reserved. INFORMATION TO ALL USERS: The quality of this reproduction is dependent upon the quality of the copy submitted. In the unlikely event that the author did not send a complete manuscript and there are missing pages, these will be noted. Also, if material had to be removed, a note will indicate the deletion. ProQuest 10974767. Published by ProQuest LLC (2019). Copyright of the Dissertation is held by the Author. All rights reserved. This work is protected against unauthorized copying under Title 17, United States Code. Microform Edition © ProQuest LLC.
  • Face Recognition and Privacy in the Age of Augmented Reality
    Journal of Privacy and Confidentiality (2014) 6, Number 2, 1–20. Face Recognition and Privacy in the Age of Augmented Reality. Alessandro Acquisti, Ralph Gross, and Fred Stutzman. 1 Introduction. In 1997, the best computer face recognizer in the US Department of Defense's Face Recognition Technology program scored an error rate of 0.54 (the false reject rate at a false accept rate of 1 in 1,000). By 2006, the best recognizer scored 0.026 [1]. By 2010, the best recognizer scored 0.003 [2], an improvement of more than two orders of magnitude in just over 10 years. In 2000, of the approximately 100 billion photographs shot worldwide [3], only a negligible portion found their way online. By 2010, 2.5 billion digital photos a month were uploaded by members of Facebook alone [4]. Often, those photos showed people's faces, were tagged with their names, and were shared with friends and strangers alike. This manuscript investigates the implications of the convergence of those two trends: the increasing public availability of facial, digital images; and the ever-improving ability of computer programs to recognize individuals in them. In recent years, massive amounts of identified and unidentified facial data have become available, often publicly so, through Web 2.0 applications. So have also the infrastructure and technologies necessary to navigate through those data in real time, matching individuals across online services, independently of their knowledge or consent. In the literature on statistical re-identification [5, 6], an identified database is pinned against an unidentified database in order to recognize individuals in the latter and associate them with information from the former.
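    The error rates quoted in this abstract are false reject rates measured at a fixed false accept rate of 1 in 1,000. As a rough, generic illustration of how such a figure is computed (this is a sketch, not the FRVT evaluation protocol or the authors' code; the helper name and the synthetic score distributions are assumptions), one can threshold the impostor similarity scores at the target false accept rate and then count the genuine comparisons rejected at that threshold.

    import numpy as np

    def frr_at_far(genuine_scores, impostor_scores, target_far=1e-3):
        # Hypothetical helper: choose the similarity threshold at which roughly
        # `target_far` of impostor (different-person) comparisons would be
        # accepted, then report the fraction of genuine (same-person)
        # comparisons that fall below it, i.e. are falsely rejected.
        impostor_scores = np.sort(np.asarray(impostor_scores))
        k = max(1, int(np.ceil(target_far * len(impostor_scores))))
        threshold = impostor_scores[-k]  # k-th largest impostor score
        return float(np.mean(np.asarray(genuine_scores) < threshold))

    # Toy usage with synthetic similarity scores (higher means "more similar").
    rng = np.random.default_rng(0)
    genuine = rng.normal(0.7, 0.1, 10_000)
    impostor = rng.normal(0.3, 0.1, 100_000)
    print(frr_at_far(genuine, impostor, target_far=1e-3))

    Read the result the same way as the 0.54, 0.026, and 0.003 figures above: at the same false accept rate, a lower false reject rate means a better recognizer.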
  • The Edge Rank Algorithm of Facebook: Birth, Development, and Operation from the Perspective of Actor-Network Theory (Algorytm Edge Rank serwisu Facebook: narodziny, rozwój i działanie w ujęciu teorii aktora-sieci)
    Michał Pałasz, Uniwersytet Jagielloński w Krakowie, Wydział Zarządzania i Komunikacji Społecznej, Instytut Kultury. E-mail: [email protected] The Edge Rank Algorithm of Facebook: Birth, Development, and Operation from the Perspective of Actor-Network Theory. Abstract: The article begins by discussing the research methodology (actor-network theory, autoethnography) and then presents the development of Facebook in the years 2004-2018 from the perspective of the birth and transformations of the algorithm shaping the News Feed, identified as the platform's key innovation; in the conclusions it synthesizes the recognized translations and the modus operandi of the main actor. The text is based on a study conducted for the author's presentation at the 2nd Polish Interdisciplinary Academic Conference "TechSpo'18: The Power of Algorithms?", organized by the Faculty of Humanities of the AGH University of Science and Technology in Kraków (Kraków, 20-21 September 2018). Keywords: Edge Rank algorithm, Facebook, social media, News Feed, actor-network theory, media management. Introduction: The article presents the results of an exploration, conducted in the spirit of actor-network theory (ANT), of the transformations of Facebook in the years 2004-2018, in the course of which an algorithm emerged, and takes an active, agentive part, referred to as Edge Rank, EdgeRank, Ranking, or simply the Facebook algorithm. At the time of writing it silently accompanies more than two billion users of the service who use it at least once a month (Facebook Newsroom 2018a), and moreover, among other things, it decides which messages within the service reach which users, creating information bubbles (cf. Pariser 2011), such as Blue Feed, Red Feed ("blue feed, red feed" – unless indicated otherwise, translations