Applying Tort Law to Fabricated Digital Content — Michael Scott Henderson
Utah Law Review, Volume 2018, Number 5, Article 6 (December 2018)

Recommended Citation: Henderson, Michael Scott (2018) “Applying Tort Law to Fabricated Digital Content,” Utah Law Review: Vol. 2018: No. 5, Article 6. Available at: https://dc.law.utah.edu/ulr/vol2018/iss5/6

APPLYING TORT LAW TO FABRICATED DIGITAL CONTENT

Michael Scott Henderson*

INTRODUCTION

Imagine viewing a video of yourself doing and saying things you have never done or said. This “You” could be in a room you have never been in; it could appear younger or older than you currently are and still seem completely realistic. Such a video might sound as if it could exist only through advanced CGI or animatronics. However, new technologies are being developed that allow individuals to fabricate digital media using only a recording and a computer. This technology raises the prospect of “fabricated digital content”1 becoming an easily created and disseminated form of media with the potential to impact an individual’s image and reputation.
On November 2, 2016, software company Adobe demonstrated eleven “experimental technologies” at its “Max 2016” event in San Diego, California.2 One of the technologies demonstrated, titled “Photoshopping Voiceovers,” or “#VoCo,” is designed to give audio editors the ability to alter the dialogue of a recording without the need for the original voiceover artist.3 The technology was demonstrated by manipulating an audio recording of actor and comedian Keegan-Michael Key.4 The demonstrator, Adobe developer Zeyu Jin, manipulated the recording,5 using only a keyboard, by switching the order in which words were said,6 as well as adding new words and phrases into the recording.7 The new recording, though manipulated, sounded mostly organic, even though the added phrases were not part of the original recording.

Researchers at the University of Erlangen-Nuremberg and Stanford University are developing technology, similar to Photoshopping Voiceovers, that allows users to manipulate video recordings.8 The technology, called Face2Face, allows actors “to animate the facial expressions” of individuals in a video and then “re-render the manipulated output video in a photo-realistic fashion.”9 Thus, individuals could take a video recording, manipulate the facial expressions, including the movement of the mouth, and produce a new, visually realistic recording. Computer scientists at the University of Washington are also developing technologies that use artificial intelligence to alter the composition of videos featuring public figures.10 This developing technology would allow users to make it appear that a video recording occurred in a different place and to manipulate the age of the speaker.11

Near the end of the Photoshopping Voiceovers demonstration, Jordan Peele, who was present, remarked, “if this technology gets into the wrong hands . . . .”12 Though Peele’s concerns were assuaged by Jin’s assurances that any audio manipulations could be easily identified,13 technologies like Photoshopping Voiceovers represent genuine cause for concern. For example, an individual using technologies such as Photoshopping Voiceovers and Face2Face in combination could fabricate a recording of an individual, whether a public or private figure, making false, defamatory, or controversial statements. Given the increased use of social media platforms such as Facebook and Twitter,14 along with increased concerns over cyberbullying15 and “fake news,”16 the potential for misuse of these technologies, and the harm they can cause, is very real.

This Note will examine the potential legal implications of the misuse of digital fabrication technologies and the ways in which the existing legal framework should be altered to allow victims harmed by the misuse of these technologies to recover damages under a “reasonable publisher” standard. Part I will analyze the development of technologies that allow individuals to manipulate photographs, as well as video and audio recordings. Part II will discuss how misuse of media editing technologies has been and is currently being litigated. And Part III will analyze how courts and litigants can apply developed tort law to the misuse of new digital fabrication technologies.

I. DEVELOPMENT OF PHOTO, VIDEO, AND AUDIO EDITING TECHNOLOGIES

A. The Development of Photo Editing

Photo editing developed as a practice long before the advent of the computer and the development of photo editing software.17 Increased access to computers in the 1980s led to the development of photo editing software such as Display18 and

* © 2018 Michael Scott Henderson. J.D. Candidate, 2019. I would like to thank the S.J. Quinney College of Law, the Utah Law Review staff for their time, and WNYC’s Radiolab for providing the inspiration behind this Note.
1 The phrase “fabricated digital content” relates to digital media, such as a video or audio recording, edited to have different content, but to appear as if it is original, or non-edited.
2 Adobe Communications Team, Let’s Get Experimental: Behind the Adobe Max Sneaks, ADOBE BLOG (Nov. 4, 2016), https://blogs.adobe.com/conversations/2016/11/lets-get-experimental-behind-the-adobe-max-sneaks.html [https://perma.cc/95T6-DWXD].
3 For example, if a new word or phrase was added into a script, a voiceover artist would have to be recorded saying the new word or phrase. Id.
4 Adobe Creative Cloud, #VoCo. Adobe MAX 2016 (Sneak Peeks), YOUTUBE (Nov. 4, 2016), https://www.youtube.com/watch?v=I3l4XLZ59iw&feature=youtu.be&list=PLD8AMy73ZVxVLnQh5m-qK0efH3rKIYGx2 [https://perma.cc/S5CY-D7SU].
5 In the original recording Keegan-Michael Key says, “I jumped out the bed, and I kissed my dogs and my wife, in that order.” Id.
6 The first manipulation changed the recording to say, “. . . and I kissed my wife and my wife.” Id.
7 The second manipulation changed the recording—meant for comedic effect since Jordan Peele was present for the demonstration—to say, “. . . and I kissed Jordan three times.” Id. The technology requires only twenty minutes of recorded speech to replicate a person’s voice. Id.
8 Justus Thies et al., Face2Face: Real-time Face Capture and Reenactment of RGB Videos, VISUAL COMPUTING GROUP, http://www.niessnerlab.org/projects/thies2016face.html [https://perma.cc/4WCJ-6P2X] (last visited Oct. 9, 2017).
9 Id.
10 See Aarti Shahani, Computer Scientists Demonstrate the Potential for Faking Video, NPR (July 14, 2017, 4:57 AM), http://www.npr.org/sections/alltechconsidered/2017/07/14/537154304/computer-scientists-demonstrate-the-potential-for-faking-video [https://perma.cc/67UC-CK3X] (discussing how “[a] team of computer scientists have figured out how to make words come out of the mouth of former President Barack Obama — on video — by using artificial intelligence”).
11 See id.
12 Adobe Creative Cloud, supra note 4.
13 Zeyu Jin assured Jordan Peele, and the audience, that Adobe had developed means to prevent misuse via a watermark system which would distinguish original recordings from those with manipulations. Id.
14 Combined monthly users among social media platforms number in the billions. Josh Constine, Facebook now has 2 billion monthly users . . . and responsibility, TECHCRUNCH (June 27, 2017), https://techcrunch.com/2017/06/27/facebook-2-billion-users/ [https://perma.cc/E2AB-PAA7].
15 “‘Cyberbullying is when someone repeatedly harasses, mistreats, or makes fun of another person online or while using cell phones or other electronic devices.’ Approximately 34% of the students in our sample report experiencing cyberbullying in their lifetimes.” Justin W. Patchin, 2015 Cyberbullying Data, CYBERBULLYING RES. CTR. (May 1, 2015), https://cyberbullying.org/2015-data [https://perma.cc/7EFN-ANHV].
16 “Fake news is made-up stuff, masterfully manipulated to look like credible journalistic reports that are easily spread online to large audiences willing to believe the fictions and spread the word.” Angie Holan, 2016 Lie of the Year: Fake News, POLITIFACT (Dec. 13, 2016, 5:30 PM), http://www.politifact.com/truth-o-meter/article/2016/dec/13/2016-lie-year-fake-news/ [https://perma.cc/4VV9-ZTVR]. Fake news has become increasingly prevalent since the start of the 2016 presidential election. See id.
17 See What Did We Do Before Photoshop?, PBS NEWSHOUR (Nov. 29, 2012, 10:11 AM), http://www.pbs.org/newshour/art/slide-show-what-did-we-do-before-photoshop/ [https://perma.cc/97MP-HG7N] (“When photography was first introduced in 1839, people wondered how a medium that could render forms and textures with such exquisite detail could fail to register the ever-present element of color. Eager to please potential customers, photographers immediately resorted to manual intervention, enlivening their pictures with powdered pigment, watercolor and oil paint.”). Leaders throughout history have had photos of themselves, or others, edited to enhance their own image, or defame others. See Photo Tampering Throughout History, FOURANDSIX TECH. INC., http://pth.izitru.com/2008_09_00.html