
Face/Off: "DeepFake" Face Swaps and Privacy Laws

By: Erik Gerstner

Erik Gerstner is an associate at David, Kamp & Frank, L.L.C. in Newport News, Virginia. Erik received his JD from William & Mary Law School in 2018. He focuses his practice on civil litigation, including business litigation, real estate, personal injury, and reputational injury. This article has been expanded from one the author published in For The Defense.

In 2018, a curious trend spread rapidly across the Internet: people posting videos of Nicolas Cage's performances in various Hollywood films.1 To the uninitiated viewer, these videos might appear to be nothing special, just various facets of Cage's prolific career. However, closer inspection would reveal a subtler thread running throughout these clips: none of these performances actually involved Cage. Rather, thanks to relatively new artificial intelligence (AI)-powered software programs, colloquially known as DeepFakes, Internet users had seamlessly inserted Cage's face over the faces of the original actors in these scenes, making it appear as though Cage had always portrayed those characters.2

Nicolas Cage's central role in DeepFake videos is fitting, given his starring role alongside John Travolta in 1997's Face/Off, a film in which his and Travolta's characters both end up wearing the other's faces throughout the film. Although it was only a fanciful Hollywood invention in 1997, face swapping technology entered the mainstream in 2017. In August of that year, University of Washington researchers released a video, seemingly of Barack Obama, discussing topics such as terrorism, fatherhood, and job creation, which had been created using machine

1 John Maher, This was the year of the deepfake Nicolas Cage meme, THE DAILY DOT (Dec. 27, 2018), available at https://www.dailydot.com/unclick/nicolas-cage-memes-deepfakes-2018.
2 Id.

learning algorithms.3 By 2018, similar tools became publicly available, with the most popular, called FakeApp, available for free online. FakeApp was developed using Google's open-source TensorFlow deep learning software. In its first two months of being publicly available, it was downloaded more than 120,000 times.4

The many-faceted ramifications stemming from the widespread availability of this and other similar software are staggering. The unprecedented ability to create fabricated messages from politicians and other celebrities, or "fake news" in the parlance of our current political climate, is a major concern – the Pentagon alone has already spent tens of millions of dollars in an effort to research and combat DeepFakes.5 However, although the political and cultural ramifications of DeepFakes are significant, and worthy of considerable attention across the spectrum of areas that they affect, this article will be limited to primarily examining the legal issues likely to arise from these programs, including privacy, the right to one's own likeness, and defamation/false light claims. The first section will discuss the software and the technology behind it, including a brief introduction to how it technically functions. Next, this article will discuss the state of the relevant law and examine how face swaps have and will continue to intersect with applicable statutory and case law. Finally, it will discuss potential judicial and legislative solutions to present and future problems arising from these sorts of AI technologies.

I. FakeApp and Machine Learning

The influx of fake videos stems largely from the widespread availability of simple yet powerful software tools such as FakeApp. Utilizing machine learning to train AI, it condenses what would be an exceedingly complex operation for even the most experienced digital artists into a single button press to create a face swapped video. While having a moderately powerful computer is a slight barrier to the effective usage of the program, it otherwise is relatively uncomplicated to create fake media

3 Jennifer Langston, Lip-syncing Obama: New tools turn audio clips into realistic video, UNIVERSITY OF WASHINGTON NEWS (July 11, 2017), available at http://www.washington.edu/news/2017/07/11/lip-syncing-obama-new-tools-turn-audio-clips-into-realistic-video.
4 Kevin Roose, It Was Only a Matter of Time: Here Comes an App for Fake Videos, N.Y. TIMES (Mar. 4, 2018), at A1, available at https://www.nytimes.com/2018/03/04/technology/fake-videos-deepfakes.html.
5 Dan Robitzski, Pentagon's AI Director Calls for Stronger Deepfake Protections, FUTURISM (Aug. 30, 2019), available at https://futurism.com/the-byte/pentagon-ai-director-deepfake-protections.

with it.6 In its most basic form, all a user needs is a "base" video and a number of source images of the face of the person being pasted into the video. The more source images input into the program, the more seamless the final video will appear.7

After creating the datasets, FakeApp then trains the deep learning algorithm, a process that can take hours or even days, depending on how powerful a computer is used and the quality sought for the final video. Thereafter, the user needs only to click one more button to create the resulting video. A more experienced creator may be able to achieve a higher degree of realism through more involved interaction with the FakeApp software, but by following the basic steps, even a novice can fairly easily create a face swap using the program.8

While this process is straightforward for the front-end user, it is anything but for the computer running FakeApp.9 The software utilizes Google's open source TensorFlow machine learning algorithm to power its operations and requires considerable computing power from any hardware on which it runs. The final video quality is determined by a combination of factors, including the similarity of the faces and poses among the base video and the source images, and the amount of time spent and quality of the AI training. What is not a factor, however, is the software itself – computer-generated faces were once strictly the domain of big-budget studios with deep pockets, proprietary software tools, and considerable amounts of time. For example, the much-discussed appearance of a computer-generated young Carrie Fisher in Star Wars: Rogue One in 2016 was the product of a $200 million production budget, and, according to the visual effects supervisor, "a super high-tech and labor-intensive version of doing makeup."10 Now, private individuals are able to create videos equaling or even surpassing those created by these studios for a tiny fraction of the time and expense.11
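For readers curious about what that single button press conceals, the sketch below illustrates the general idea in TensorFlow/Keras code. It is only an illustration: this article does not examine FakeApp's source code, so the shared-encoder, two-decoder autoencoder shown here reflects the design commonly associated with early face swap tools in general, and every layer size, variable name, and training setting in it is an assumption rather than a description of FakeApp itself.

```python
# Illustrative sketch only - not FakeApp's actual implementation.
# Two autoencoders share one encoder: each decoder learns to redraw one
# person's face, and swapping decoders at inference time produces the fake.
import tensorflow as tf

IMG_SHAPE = (64, 64, 3)   # assumed size of aligned, cropped face images
LATENT_DIM = 256          # assumed size of the shared latent representation

def build_encoder():
    # Compresses a face crop into a latent vector shared by both identities.
    return tf.keras.Sequential([
        tf.keras.layers.Conv2D(64, 5, strides=2, padding="same",
                               activation="relu", input_shape=IMG_SHAPE),
        tf.keras.layers.Conv2D(128, 5, strides=2, padding="same",
                               activation="relu"),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(LATENT_DIM),
    ])

def build_decoder():
    # Reconstructs a 64x64 face crop from the shared latent vector.
    return tf.keras.Sequential([
        tf.keras.layers.Dense(16 * 16 * 128, activation="relu",
                              input_shape=(LATENT_DIM,)),
        tf.keras.layers.Reshape((16, 16, 128)),
        tf.keras.layers.Conv2DTranspose(64, 5, strides=2, padding="same",
                                        activation="relu"),
        tf.keras.layers.Conv2DTranspose(3, 5, strides=2, padding="same",
                                        activation="sigmoid"),
    ])

encoder = build_encoder()
decoder_a = build_decoder()   # person appearing in the "base" video
decoder_b = build_decoder()   # person in the source images

# Each autoencoder is trained only to reconstruct its own person's faces.
auto_a = tf.keras.Model(encoder.input, decoder_a(encoder.output))
auto_b = tf.keras.Model(encoder.input, decoder_b(encoder.output))
auto_a.compile(optimizer="adam", loss="mae")
auto_b.compile(optimizer="adam", loss="mae")

# faces_a / faces_b are hypothetical arrays of face crops scaled to [0, 1];
# the long training run is the "hours or even days" step described above.
# auto_a.fit(faces_a, faces_a, epochs=100, batch_size=32)
# auto_b.fit(faces_b, faces_b, epochs=100, batch_size=32)

# The swap: encode frames of person A, but decode with person B's decoder,
# yielding person B's face in person A's pose, expression, and lighting.
# swapped_faces = decoder_b.predict(encoder.predict(frames_a))
```

The single shared encoder is the crucial design choice in this kind of architecture: because both identities are forced into the same latent space, a face taken from the base video can be decoded as the other person, which is what allows the final composite to appear seamless.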

6 Roose, supra note 4.
7 Id.
8 Id.
9 Note that other face swap AI programs do not necessarily operate in the same way. For example, the University of Washington algorithm is considerably more in-depth, learning what shapes mouths make when vocalizing certain sounds, then creating video from whole cloth to match a given audio track, rather than simply superimposing one face over another in an existing video. See Langston, supra note 3.
10 B.J. Murphy, Reddit user outperforms Disney with AI-generated Princess Leia, GRAY SCOTT (Jan. 25, 2018), available at https://www.grayscott.com/seriouswonder-//reddit-user-outperforms-disney-with-ai-generated-princess-leia.
11 See, e.g., id.

Machine learning is the great equalizer: thanks to the powerful AI algorithms powering FakeApp and similar software, those able to make full use of it have had the power at their fingertips enhanced exponentially, a process of growth that is likely to continue as this technology continues to progress.12

II. Privacy, Defamation, and Fake News: The Present State of the Law

There are potential ramifications flowing from the creation and use of the resulting media that span the legal spectrum, including ramifications in election law, criminal law, evidence, and intellectual property. This article, however, will focus on potential privacy issues, including defamation, false light, and the right of publicity, also sometimes known as misappropriation.13 While case law and statutory law regarding deep fakes currently are scant or nonexistent, there are analogues which may provide some guidance as to how the law in the United States will address issues arising from these new technological advancements.

One potential legal concern flowing from these fake images is defamation. A defamation cause of action could arise from an individual using FakeApp or similar software to create a fake video of an individual saying or doing something that would injure the individual's reputation if it were true. For example, in the aforementioned video the University of Washington created of President Obama, the audio could be any recording the creator wants to use, literally putting words of the creator's choosing into Obama's mouth, including statements that could be highly offensive to an unsuspecting viewer.14 In states that recognize a difference between slander and libel, a face swapped video could easily give rise to both of these causes of action. For example, if someone creates a video purportedly showing Person A saying defamatory things about Person B, then Person B might have a claim for slander (as the defamatory statements were verbal), while Person A might have a cause of action for libel.

12 Joe McKendrick, More artificial intelligence, fewer screens: the future of computing unfolds, ZDNET (Sept. 9, 2017), available at http://www.zdnet.com/article/artificial-intelligence-the-new-user-interface-and-experience.
13 See, e.g., Benny Evangelista, If you think fake news is bad, fake video is coming, SAN FRANCISCO CHRONICLE (March 14, 2018), available at https://www.sfchronicle.com/business/article/If-you-think-fake-news-is-bad-fake-video-is-12751052.php; Ari Breland, Lawmakers worry about rise of fake video technology, THE HILL (Feb. 19, 2018), available at http://thehill.com/policy/technology/374320-lawmakers-worry-about-rise-of-fake-video-technology.
14 Langston, supra note 3.

While fake media of the sort described above would seem to satisfy the requirements for a defamation claim, there is no common law jurisprudence either way for a claim resulting from such a created video. There is some precedent, though: in some states, the tort of defamation explicitly applies to altered still images.15 Generally speaking, videos are treated the same as still images under the law.16 Thus, a defamatory video should be fully actionable if a plaintiff attempts to bring suit. However, there are powerful affirmative defenses for defamation claims that apply in many cases and make it difficult for plaintiffs to win these lawsuits.17 The primary defense for a content creator is a claim that their face swapped video is parody, which may be an absolute defense in defamation suits.18

Defamation is by its nature mutually exclusive of parody. By definition, defamation requires a false statement of fact; parody, to the degree that it is perceived as parody by its intended audience, conveys the message that it is not the original and, therefore, cannot constitute a false statement of fact.19

As the above passage states, for something to be considered parody, it must be perceived as parody (and thus not as a statement of fact) by its audience.20 In other words, it must not "reasonably be understood as describing actual facts . . . or events" (emphasis added).21 This analysis would be applied on a case-by-case basis, and different finders of fact may come out with dramatically different results based on the facts of each individual case.

Face swapped videos are also likely to trigger the common law tort of intentional infliction of emotional

15 See, e.g., Kiesau v. Bantz, 686 N.W.2d 164, 178 (Iowa 2004) (holding that an altered image depicting a female police officer in uniform, standing in front of her official vehicle, with her breasts exposed, was libelous per se); Morsette v. "The Final Call," 764 N.Y.S.2d 416 (N.Y. App. Div. 2003) (upholding jury verdict of libel stemming from altered image of woman making it appear that she was a convict).
16 See, e.g., Arizona v. Steinle, 372 P.3d 939, 945 (Ariz. 2016) (stating that for the purposes of evidence, requirements for admission of video evidence should be the same as for a photo).
17 One common defense in modern defamation cases is the New York Times v. Sullivan standard, which applies in cases involving the defamation of public figures, who must prove that publishers of defamatory content did so with a mens rea of "actual malice." New York Times Co. v. Sullivan, 376 U.S. 254 (1964). However, even for public figures, this defense would offer no sanctuary because, as face swapped videos are intentionally created to be fake representations of real people, their creators have actual knowledge of the falsity, with intent going well beyond the actual malice standard.
18 Hustler v. Falwell, 485 U.S. 46, 46 (1988) (holding that trial court properly dismissed plaintiff's defamation claim because content in question was ruled to be parody).
19 50 AM. JUR.2d, Libel and Slander § 156 (2018).
20 Id.
21 Hustler, 485 U.S. at 46.

distress (IIED) – conduct that causes severe emotional trauma in a victim. Frequently a defamation claim is accompanied by an IIED claim, and they are often adjudicated similarly, though with certain key differences.22 Unlike with defamation, when the victim of an IIED claim is a private figure, there is no need to analyze whether a faked video would qualify as a false statement, for the only concern with regard to IIED is conduct, false or not. Parody is not an absolute defense against IIED either, as parody, even if demonstrably not a false statement of fact, may still rise to the level of being patently offensive.23 This is not a blanket rule, however. Similar to defamation, IIED claims against public figures do require that there be a false statement of fact, made with "actual malice" – that is, with knowledge that it was false or with reckless disregard as to whether or not it was true.24 With that said, any creator of a faked video would by definition have actual knowledge that the statements made in the video and attributed to the depicted individual are false, meeting this mens rea requirement even for public figures. Note too that a faked video could potentially give rise to IIED suits from multiple parties stemming from the same video: both the purported individual depicted in the video, and any recipients who may be shocked by the things the person in the video is saying or doing.25

On the other hand, any defense to an IIED claim should focus first on the intent element. While there will clearly be intent in the creation of the media itself, in many cases it is unlikely that a court will find actual intent to cause emotional distress.26 In states where this is the sole element, this may prove to be a bar to IIED claims. A court could very easily hold, however, that the very creation of a face swapped video, particularly an unflattering one, is by its very nature likely to result in emotional distress if published, and thus it is per se reckless. This uncertainty makes an IIED claim less favorable for a plaintiff in the context of face swaps, but still a valid option in some cases for those victimized by this technology.

22 See, e.g., Hustler, 485 U.S. at 46; Rykowsky v. Dickinson Public School Dist. No. 1, 508 N.W.2d 348 (N.D. 1993); Barker v. Huang, 610 A.2d 1341 (Del. 1992); Lewis v. Benson, 701 P.2d 751 (Nev. 1985).
23 Hustler, 485 U.S. at 46.
24 Id.; the Sullivan standard was enumerated in Sullivan, 376 U.S. at 279-280.
25 See, e.g., Dzamko v. Dossantos, 2013 WL 5969531 (Conn. Super. Oct. 23, 2013) (holding that both subject and recipient of false images had an actionable claim for IIED).
26 This is especially true when it comes to faked pornography. See Emma Grey Ellis, People Can Put Your Face On Porn—And The Law Can't Help You, WIRED (Jan. 26, 2018), available at https://www.wired.com/story/face-swap-porn-legal-limbo.

False light is another tort claim, though many states do not recognize it as a separate cause of action.27 Like defamation, false light claims arise from the spread of falsehoods about a plaintiff that would be considered objectionable by a reasonable person. However, unlike with defamation, false light claims award damages based on the emotional harm the plaintiff suffered from the spread of the falsehoods.28 One of the four causes of action outlined by William Prosser in his seminal 1960 article, false light differs from defamation and IIED primarily because it is a privacy tort, in that it seeks to protect an individual from claims about them which are released to the public.29 Note a key difference from defamation claims: for false light, truth is not an affirmative defense; rather, the burden is on the plaintiff from the outset to establish the false or misleading nature of the statement, making false light a more difficult cause of action compared to defamation, at least in some jurisdictions. Despite this, however, the analysis does not change much compared to the other two torts already discussed – because these videos are, by definition, false, what might otherwise be a difficult hurdle for a plaintiff is met. The other elements are much easier to establish before a jury, and thus a false light claim, in those states which recognize it, would be a powerful potential avenue for a plaintiff harmed by appearing in a fake video.

Defamation, IIED, and false light are quite similar as far as their respective elements and the types of harms they seek to redress. Returning to Prosser's four categories of privacy torts, another of these is likely to give rise to litigation stemming from face swapped videos: "[a]ppropriation, for the defendant's advantage, of the plaintiff's name or likeness," commonly referred to in the United States as the right of publicity.30 As with false light, not every state recognizes the right of publicity, though it is more widespread across the country than false light. Because there is no federal scheme protecting this right, it varies by state. Nonetheless, in its most basic form, the tort applies when one "appropriates the commercial value of a person's identity by using without consent the person's name, likeness, or other indicia of identity

27 Defamation vs. False Light: What is the Difference?, FINDLAW, available at http://injury.findlaw.com/torts-and-personal-injuries/defamation-vs--false-light--what-is-the-difference-.html.
28 See False Light, LEGAL INFORMATION INSTITUTE, available at https://www.law.cornell.edu/wex/false_light.
29 William L. Prosser, Privacy, 48 CALIF. L. REV. 383, 389 (1960).
30 Id. at 389.

for purposes of trade."31 In many states, such as California, this applies not only to an individual's actual image, but also voice, signature or other types of likenesses. In Indiana, these protections are extended to, among other things, distinctive appearance, gestures or mannerisms as well.32

In those states which do recognize the right, there are considerable differences in how they approach it. Some, like California, treat the right of publicity similar to a property right. Others, such as New York, address it as a privacy right, more along the lines of Prosser's original incarnation.33 Thus, depending on where the suit is filed, the results may vary considerably. With that said, the right of publicity is one very likely to be invoked by victims of face swap exploitation, so long as there is some commercial value involved in the end usage of the media.34 It is irrefutable that a face swap, by its very nature, captures the likeness of its subject. If, for example, the video of Obama previously discussed were changed to endorse a particular fast food restaurant, or car model or clothing brand, this would clearly meet the elements of a right of publicity claim. Depending on the level of a person's celebrity, as well as how their likeness is being used, a right of publicity suit can be quite lucrative. For example, in 2015 NBA legend Michael Jordan was awarded $8.9 million after an Illinois court found that a local supermarket chain had violated his right of publicity.35 Because of these potential judgments against face swap creators, the right of publicity is a strong deterrent against media creators with deep pockets, such as companies that might be tempted to use a celebrity's likeness for their own commercial ends. It is a powerful tool in any individual's (and his/her attorney's) arsenal should they become the subject of a face swap gone viral.

The right of publicity has another, even stronger application as well: a potential cause of action against platforms hosting face swapped videos.36 In those states

31 RESTATEMENT (THIRD) OF UNFAIR COMPETITION § 46 (AM. LAW INST. 2006).
32 CAL. CIV. CODE § 3344 (West 1984); IND. CODE ANN. § 32-36-1-1 (West 2012).
33 Jonathan Faber, A Brief History of the Right of Publicity, RIGHT OF PUBLICITY (July 21, 2015), available at http://rightofpublicity.com/brief-history-of-rop.
34 Jesse Lempel, Combatting Deep Fakes through the Right of Publicity, LAWFARE (Mar. 30, 2018), available at https://www.lawfareblog.com/combatting-deep-fakes-through-right-publicity.
35 Darren Rovell, Supermarket chain must pay Michael Jordan $8.9 million for use of name, ESPN (Aug. 21, 2015), available at http://www.espn.com/nba/story/_/id/13486052/supermarket-chain-pay-michael-jordan-89-million-use-name. For the court's discussion of this case, and how the right of publicity applies to the First Amendment, see Jordan v. Jewel Food Stores, Inc., 743 F.3d 509 (7th Cir. 2014).
36 Lempel, supra note 34.

that recognize the right of publicity as an intellectual property right, plaintiffs potentially could sue Facebook, Twitter, YouTube, Reddit, or other websites hosting this content in addition to the creators themselves.37 For the other potential causes of action discussed, any potential liability to platforms would be curtailed by the Communications Decency Act, which states that "no provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider."38 However, the Act also creates an exception: it does not "limit or expand any law pertaining to intellectual property."39 The application of this so-called "intellectual property exception" to the right of publicity has not yet been tested in courts, and there are a number of obstacles preventing it from becoming a viable option for plaintiffs moving forward.40 The multibillion dollar companies that these lawsuits might target could argue persuasively that it is not fair for them to have to police the millions of posts their users create each day, especially given how seamless these fakes can appear.41 A solution might be one similar to the Digital Millennium Copyright Act (DMCA), which allows the owner of intellectual property to request that hosting platforms remove it; if platforms do so, it absolves them of liability.42 However, this would be a solution to be created through legislative means, rather than judicial.

III. FUTURE LEGAL APPROACHES TO FACE SWAPS

For many emerging technologies, the law has no immediate solution to the unique problems they pose. As a result, courts and attorneys must create solutions for these problems as they come across their respective dockets and desks. The efficacy of this process varies considerably, based on the field of law, type of technology, the specific court hearing the case, and a number of other factors as well. With copyright

37 Id.
38 47 U.S.C. § 230(c)(1) (1998).
39 Id. at § 230(e)(2).
40 Lempel, supra note 34. Lempel describes three basic hurdles: (1) fitting the right of publicity into the intellectual property exception to begin with; (2) meeting the "commercial use" requirement for right of publicity claims; and (3) overcoming First Amendment hurdles. He concludes that there is significant precedent for right of publicity claims to succeed given the first two issues, and that while the First Amendment is a more difficult one, case law provides plaintiffs with some arguments here as well.
41 Facebook alone sees 300 million photos uploaded per day. The Top 20 Valuable Facebook Statistics, ZEPHORIA DIGITAL MARKETING, available at https://zephoria.com/top-15-valuable-facebook-statistics.
42 17 U.S.C. § 512 (2010).

law, for example, many statutes are written for specific technologies, and are poorly suited to deal with newer emerging technologies in which copyrights may be held.43 Cable television and other paid TV broadcasts are a useful case study: in the mid-1970s, FCC regulations for cable were based on the traditional transmission of signals, via ground wires or microwaves.44 The advent of satellite transmissions was not addressed by these specifically-written regulations, however, and as they became more prevalent, courts struck down many of the prior regulations. This forced the FCC to scramble to create an entirely new regime for cable companies by the 1980s.45 In many cases, courts may be better-equipped to be on the forefront of emerging technologies, as they are able to examine these technologies on a case-by-case basis, in real-time, as necessary. There is an argument that for face swaps the best solution is to permit courts and attorneys to create common law precedents in order to handle the risks and issues with the technology as it and its harms become more widespread across society and culture.

Courts, however, are not infallible, especially when it comes to dealing with new, unfamiliar subjects. In some cases, the common law approach results in inconsistent rulings across different jurisdictions or even within the same one. Copyright provides a useful example: peer-to-peer (P2P) file sharing sites such as Napster, Grokster, and Kazaa allow users to share files (including but not limited to music, videos, and other media) with other individuals, usually for free, via the internet. In 2001, the Ninth Circuit found Napster liable for contributory and vicarious copyright infringement for enabling this behavior.46 Three years later, however, the same court found other P2P services, including Grokster and Kazaa, not liable for their users doing essentially the same thing as Napster's users had been doing – that is, sharing copyrighted material with each other.47 These issues are exacerbated when different courts come to radically different conclusions about how to handle technology-

43 Jessica D. Litman, Copyright Legislation and Technological Change, 68 OR. L. REV. 275, 277 (1989).
44 Id. at 343.
45 Id. at 343-346.
46 A&M Records, Inc. v. Napster, Inc., 239 F.3d 1004 (9th Cir. 2001).
47 Metro-Goldwyn-Mayer Studios, Inc. v. Grokster, Ltd., 380 F.3d 1154 (9th Cir. 2004). This decision was overturned a year later by the Supreme Court, which held that any individual who promotes their product as a way to infringe copyrights is liable for the resulting acts of infringement. Metro-Goldwyn-Mayer Studios, Inc. v. Grokster, Ltd., 545 U.S. 913 (2005). The comparison, however, holds – the Ninth Circuit was unable to properly resolve this technology issue, even though it had already done so once in the past.

related issues. A recent example of this came from conflicting decisions regarding the Fourth Amendment and its applications to data stored on foreign servers, as the Second Circuit and the Eastern District of Pennsylvania came to opposite conclusions regarding the United States' jurisdiction over this data.48 Practically speaking, this means that for more complex issues, potential plaintiffs and defendants alike may be left wondering exactly what the law is, depending on the venue in which their case is proceeding.

On top of circuit splits, courts can at times have difficulty grasping the implications of newer technologies, and prior precedent may not provide an adequate framework for evaluating new technologies. The Supreme Court has grappled with these issues a number of times in the realm of emerging technologies and the Fourth Amendment. In U.S. v. Jones, for example, the Court ruled that law enforcement officials must obtain a warrant to track a private individual's movements via Global Positioning System (GPS) devices.49 In her concurrence, Justice Sotomayor noted that the majority decision revolved around the constitutional maxim that "when the Government physically invades personal property to gather information, a search occurs."50 However, "physical intrusion is now unnecessary to many forms of surveillance," and therefore the majority's "approach is ill suited to the digital age."51 This is just one example. In cases where technology is advancing rapidly, the judicial solution may not be practical as the traditional methods of dealing with an issue may not be able to keep pace with the problems created by these advances.

While there are arguments for allowing the courts to handle face swaps, Congress and state legislative bodies can also pass proactive laws to deter abuses of the technology and have a process already in place to deal with them if and when they do become a more serious issue. Indeed, bills regulating DeepFakes were introduced into both the U.S. Senate and House in 2019, although neither has been enacted into federal law as of this writing. Multiple states, including Virginia and Texas, have enacted laws addressing DeepFakes,

48 Lucy Bertino, Courts Continue to Split on the Fourth Amendment in Cyberspace, NORTH CAROLINA JOURNAL OF LAW & TECHNOLOGY (Feb. 22, 2017), available at http://ncjolt.org/circuit-split-4th-amendment-cyberspace/. This split was shortly thereafter resolved by the CLOUD Act, H.R. 4943, which grants the United States and foreign parties access to data stored around the globe. David Ruiz, Responsibility Deflected, the CLOUD Act Passes, ELECTRONIC FRONTIER FOUNDATION (Mar. 22, 2018), available at https://www.eff.org/deeplinks/2018/03/responsibility-deflected-cloud-act-passes.
49 U.S. v. Jones, 565 U.S. 400 (2012).
50 Id. at 414.
51 Id. at 414, 417.

albeit in different contexts.52 As discussed briefly earlier, these can target two different but equally important groups: private video creators and the websites hosting the content themselves.

For individual video creators, the easiest step is to simply modify existing laws to explicitly address face swaps and other fake media. Unfortunately, one of the common uses for DeepFakes is the creation of fake pornography. Many states already have "revenge porn" statutes, which criminalize publishing intimate media of another person which they wish to remain private (or, as it is sometimes called, nonconsensual pornography).53 In some states, these offenses are misdemeanors; in others, felonies. However, in most of these cases, the existing statutes do not address face swaps.54 As these statutes demonstrate, there are some basic issues, both with the legal elements themselves and with the First Amendment, stemming from the nature of these types of media: the people being supposedly depicted in a DeepFake are not the ones whose bodies are actually in the videos.55 As discussed in the previous section, there is some question about this; certain types of claims may or may not be viable for plaintiffs victimized by face swap media.

A better approach, regardless of the issue being addressed, might be to draft new laws altogether which specifically regulate face swaps, regardless of whom they depict or the end purpose for which they are created. For such a law to be effective, it may make more sense to adopt a strict liability standard for content creators instead, one that makes it a crime to create any fake media depicting an individual without their consent. This would run into a bevy of First Amendment issues, and ultimately may not stand if

52 Adi Robertson, Virginia's 'revenge porn' laws now officially cover deepfakes, THE VERGE (July 1, 2019), available at https://www.theverge.com/2019/7/1/20677800/virginia-revenge-porn-deepfakes-nonconsensual-photos-videos-ban-goes-into-effect. Virginia's updated law, addressing DeepFakes in the context of nonconsensual pornography, can be found at VA. CODE § 18.2-386.2 (2019). Texas' law, addressing DeepFakes in the context of election law, can be found at TEX. ELECTION CODE § 255.004 (2019). Note that these two exemplars differ dramatically, in both their approaches to the law and the problems they seek to address. This is an excellent demonstration of the shortfalls of legislative solutions: drafting comprehensive laws to address issues with wide-ranging effects is very difficult, especially if the causes and/or effects are not yet well-understood by lawmakers.
53 Liz Crampton, Taking New Steps to Put an End to "Revenge Porn", THE TEXAS TRIBUNE (Aug. 21, 2015), available at https://www.texastribune.org/2015/08/21/texas-law-criminalizing-revenge-porn-goes-into-effect/. 38 states, as well as Washington, DC, currently have revenge porn statutes in effect. 46 States + DC + One Territory have Revenge Porn Laws, CYBER CIVIL RIGHTS INITIATIVE (last accessed Dec. 10, 2019), available at https://www.cybercivilrights.org/revenge-porn-laws.
54 Ellis, supra note 26.
55 Id.

challenged in court.56 However, if properly narrowly tailored, such an approach would be the most effective option to combat creators of face swaps used for offensive purposes.

Another option is to target and make liable platforms hosting falsified media. These platforms are where the vast majority of falsified media would be shared and viewed. While the applications of such liability to the right of publicity have been touched on already, explicit, broad laws opening up these companies to liability for hosting fake media would likely cause a sharp decline in the availability of these videos. There are concerns, however: creating liability for media platforms based on content posted by unaffiliated users would essentially make them "arbiters of truth," a position of power in which legislators and judges alike are understandably wary to place private companies.57 With that said, there are already tools – themselves ironically utilizing machine learning and artificial intelligence – available to detect face swapped media.58 For content hosts like YouTube, which already deploy AI algorithms to detect various prohibited content such as copyright violations, it would not be too complicated to add face swap detection as well to the bots which scan each and every post uploaded to their servers.59

The final potential target of legislation would be the software used to create these videos.60 By holding creators of these apps liable for their misuse, legislators would force the authors of these programs to severely restrict their distribution or monitor their use. This too is potentially problematic – moviegoers, such as those who saw and enjoyed Rogue One, would be the first to acknowledge the positive uses of face swap technology.61 It is, in theory, not fair to restrict the spread of powerful technology simply out of fear of what malefactors may do with it, and it may very well be unconstitutional, due to the same First Amendment concerns that arise from criminalizing face swaps in general.62 As with pursuing content hosts, there would likely be sweeping results. These would need to be balanced against a clearly compelling governmental interest in preventing American citizens from

56 Id.
57 Lempel, supra note 34.
58 Ellis, supra note 26.
59 Id.
60 Id.
61 Murphy, supra note 10. Indeed, in response to a proposed New York law which would criminalize creating "digital replicas" of people without their consent, the Motion Picture Association of America stated that such a law would restrict the ability of filmmakers to depict real people and events. Robertson, supra note 52.
62 Damon Beres and Marcus Gilmer, A guide to 'deepfakes,' the internet's latest moral crisis, MASHABLE (Feb. 2, 2018), available at https://mashable.com/2018/02/02/what-are-deepfakes.

being victimized by face swap software. Whether a law could be narrowly tailored enough to overcome the First Amendment concerns would depend on both the legislative body attempting to do so and the court reviewing the resulting legislation, but it seems like a valid approach for lawmakers to attempt.

IV. CONCLUSION

Face swap software utilizing machine learning, whether FakeApp or otherwise, is a powerful and exciting tool, with valuable applications in entertainment and elsewhere. However, along with its potential for simple entertainment, such as ensuring that Nicolas Cage appears in every film ever created, there are considerable avenues for misuse of this technology as well. As the availability of the software has spread, so too have concerns resulting from its proliferation, especially videos exploiting the images of others. As the technology continues to improve, these issues will only continue to rise in profile and frequency. Ideally, legislatures and the judiciary will attempt to confront these problems before they become major ones, but it is not a matter of if, but when, face swaps will become too big an issue to ignore. And it is only a matter of time before attorneys are called upon to defend clients—whether content creators, content hosts, or others—from claims stemming from this technology.