On March 21, 2023, deepfake images of federal agents “arresting” former President Donald Trump were posted on Twitter the same day Trump predicted his own arrest.[1] The fake photos were viewed by millions after a Twitter user, as a joke, asked an AI art generator to create a photo of “Donald Trump falling down while being arrested.”[2] But to many internet viewers, the photos appeared very real.

In the modern age of misinformation, deepfake technology threatens our perception of reality. Deepfake technology “use[s] a form of artificial intelligence called deep learning to make images of fake events.”[3] As the technology matures, it grows more sophisticated: researchers have developed programs to detect deepfakes, and in turn, the generating AI can be “trained” against those detection programs to evade them, producing even more realistic images.[4] Deepfakes are not limited to photographs; video and audio can be mimicked as well.[5]
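To make that feedback loop concrete, below is a minimal, illustrative sketch, not drawn from any cited source or actual deepfake tool, of adversarial training written in Python with the PyTorch library: a “generator” network learns to produce fakes while a “detector” network learns to flag them, and each round of detector feedback nudges the generator toward more convincing output. The small random vectors here are hypothetical stand-ins for real images.

```python
# Illustrative sketch of the adversarial (generator-vs-detector) dynamic
# described above; assumes PyTorch. Tiny random vectors stand in for images.
import torch
import torch.nn as nn

torch.manual_seed(0)
DIM = 16  # stand-in for image pixels

generator = nn.Sequential(nn.Linear(8, 32), nn.ReLU(), nn.Linear(32, DIM))
detector = nn.Sequential(nn.Linear(DIM, 32), nn.ReLU(), nn.Linear(32, 1))

g_opt = torch.optim.Adam(generator.parameters(), lr=1e-3)
d_opt = torch.optim.Adam(detector.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()

real_data = torch.randn(64, DIM) + 2.0  # stand-in for "real" images

for step in range(200):
    # 1. Train the detector to tell real samples from generated fakes.
    fakes = generator(torch.randn(64, 8)).detach()
    d_loss = loss_fn(detector(real_data), torch.ones(64, 1)) + \
             loss_fn(detector(fakes), torch.zeros(64, 1))
    d_opt.zero_grad(); d_loss.backward(); d_opt.step()

    # 2. Train the generator to fool the current detector -- the feedback
    #    loop the text describes, yielding ever more realistic fakes.
    fakes = generator(torch.randn(64, 8))
    g_loss = loss_fn(detector(fakes), torch.ones(64, 1))
    g_opt.zero_grad(); g_loss.backward(); g_opt.step()
```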

Although deepfakes are often produced as a joke, such as the current viral TikTok trend of AI-generated videos starring President Joe Biden as he plays video games with the likes of Barack Obama, Donald Trump, and Hillary Clinton,[6] deepfakes also serve a more sinister purpose. According to a 2019 study conducted by Deeptrace Labs, 96% of all deepfake videos online are pornographic, and 99% of deepfake pornography targets women.[7] Researchers report that the number of deepfake pornography videos available online has continued to increase exponentially, doubling nearly every year since 2018.[8] “Deepfake pornography is a phenomenon that exclusively targets and harms women,”[9] victimizing them by mapping women’s faces onto pornographic actors’ bodies.[10] Deepfake pornography is inherently nonconsensual: because the media is synthetically generated, the women it depicts never agreed to appear in it. While deepfakers initially targeted celebrity women because of the expansive access to their likenesses, the deepfake pornography sphere has grown to include even those with small online footprints as the technology has advanced.[11] In January 2023, the popular Twitch streamer “Atrioc” was caught purchasing and viewing deepfake pornography of other Twitch streamers.[12] Not only did Atrioc monetarily support this invasive media, but when the scandal went viral, it also gave the website and deepfake pornography a broader platform for engagement.

While deepfake technology is relatively new, women have been dealing with fake pornography for well over a century. Before deepfake pornography, women were targeted with photoshopped images,[13] and before Photoshop, women in the 1800s were warned to pose for photographs or paintings only with men they trusted, to avoid “an ungentlemanly photographer [who] could use the[ir likeness] to . . . mak[e] composites that featured the sitter’s head on a scantily dressed body – or worse.”[14] Although the notion of synthetic pornography is not novel, deepfake technology allows for the creation of far more sophisticated fake pornography that continues to target women.

While women often bear the brunt of targeted deepfake attacks, on a widespread level, deepfakes also pose a threat to our social reality. Deepfake images circulating online can erode public trust in the validity of any media.[15] “When we enter this world where anything can be fake – any image, any audio, any video, any piece of text, nothing has to be real,” says Hany Farid, a professor at the University of California, Berkeley’s School of Information.[16] According to Nasir Memon, an NYU professor of computer science and engineering: “[a]s a consequence of this, even truth will not be believed. The man in front of the tank at Tiananmen Square moved the world. Nixon on the phone cost him his presidency. Images of horror from concentration camps finally moved us into action. If the notion of . . . believing what you see is under attack, that is a huge problem.”[17]

There have been multiple instances of deepfake media being used as a weapon of deception: a high-school cheerleader’s mother was accused of deepfaking inappropriate media to disqualify a cheerleader on an opposing team;[18] cybercriminals used a deepfake audio message mimicking the voice of a parent company’s CEO to trick an energy firm insured by Euler Hermes Insurance Company into wiring them $243,000;[19] and a mother in a custody battle created deepfake audio of her child’s father violently threatening her as courtroom sabotage.[20] Deepfaked media has created, and will continue to create, trust issues in the courtroom. “[E]ven in cases that do not involve fake videos, the very existence of deepfakes will complicate the task of authenticating real evidence.”[21] Our fundamental institutions, such as political elections and the courts, may be undermined by the doubt that deepfake media instills.

Alarmingly, deepfakes are largely beyond the reach of specific legal regulation. Only four states have laws explicitly regulating deepfake technology: New York, Virginia, California, and Georgia.[22] This does not mean that a deepfake victim has no recourse: a victim of deepfake technology can potentially bring a defamation or harassment lawsuit against the privacy infringer.[23] But deepfakes are typically posted and consumed by anonymous users, making redress difficult, if not impossible.[24] Without explicit deepfake laws, it is vital that internet users are aware of the existence of deepfake media and know how to verify that what they are seeing is, in fact, real.


[1] Zachary B. Wolf, Beware Deepfake Reality as Trump Dominates Headlines, CNN (Mar. 25, 2023, 7:49 PM), https://www.cnn.com/2023/03/25/politics/ai-trump-what-matters [https://perma.cc/8S6H-SJN6].

[2] Id.

[3] Ian Sample, What are Deepfakes – and How Can You Spot Them?, The Guardian (Jan. 13, 2020, 5:00 AM), https://www.theguardian.com/technology/2020/jan/13/what-are-deepfakes-and-how-can-you-spot-them [https://perma.cc/EY5A-6KGS].

[4] Frederick Dauer, Law Enforcement in the Era of Deepfakes, Police Chief Mag. (June 22, 2022), https://www.policechiefmagazine.org/law-enforcement-era-deepfakes/#:~:text=Due%20to%20a%20lack%20of,election%20and%20nonconsensual%20deepfake%20pornography [https://perma.cc/UK8E-QYZ2].

[5] Sample, supra note 3.

[6] Allegra Rosenberg, AI-Generated Audio of Joe Biden and Donald Trump Trashtalking While Gaming Is Taking over TikTok, Bus. Insider (Mar. 1, 2023, 1:06 PM), https://www.businessinsider.com/voice-ai-audio-joe-biden-donald-trump-tiktok-2023-3 [https://perma.cc/448W-H3NN].

[7] Henry Ajder et al., The State of Deepfakes, Deeptrace Labs 1 (Sept. 2019).

[8] Bianca Britton, They Appeared in Deepfake Porn Videos Without Their Consent. Few Laws Protect Them., NBC News (Feb. 14, 2023, 2:48 PM), https://www.nbcnews.com/tech/internet/deepfake-twitch-porn-atrioc-qtcinderella-maya-higa-pokimane-rcna69372 [https://perma.cc/H6HF-BZMP].

[9] Ajder et al., supra note 7, at 2.

[10] Id.; Riana Pfefferkorn, Deepfakes in the Courtroom, 29 Pub. Int. L.J. 245, 252 (Sept. 2019).

[11] Britton, supra note 8.

[12] Id.

[13] Ruby Harris, How it Feels to Find Your Face Photoshopped onto Internet Porn, Vice (Apr. 17, 2019, 7:46 PM), https://www.vice.com/en/article/gy4p47/how-it-feels-to-find-your-face-photoshopped-onto-internet-porn [https://perma.cc/LKP9-DR4Q].

[14] Lynn Berger, Photography Distinguishes Itself: Law and the Emerging Profession of Photography in the Nineteenth-Century United States 248 (Feb. 26, 2016) (unpublished Ph.D. dissertation, Columbia University) (on file with the Columbia University Library system).

[15] Dauer, supra note 4.

[16] Wolf, supra note 1.

[17] William A. Galston, Is Seeing Still Believing? The Deepfake Challenge to Truth in Politics, Brookings (Jan. 8, 2020), https://www.brookings.edu/research/is-seeing-still-believing-the-deepfake-challenge-to-truth-in-politics [https://perma.cc/VV5R-QU74].

[18] Dauer, supra note 4.

[19] Ajder et al., supra note 7, at 14.

[20] Patrick Ryan, 'Deepfake' Audio Evidence Used in UK Court to Discredit Dubai Dad, The National (UAE) (Feb. 8, 2020), https://www.thenationalnews.com/uae/courts/deepfake-audio-evidence-used-in-uk-court-to-discredit-dubai-dad-1.975764 [https://perma.cc/E2F3-9PTJ].

[21] Pfefferkorn, supra note 10, at 255.

[22] Britton, supra note 8.

[23] Will Walker, Mission Impossible?: The Legal Implications of Managing Deepfake Celebrity Videos, Harv. J. Sports Ent. L. (Mar. 24, 2021), https://harvardjsel.com/2021/03/mission-impossible-the-legal-implications-of-managing-deepfake-celebrity-videos [https://perma.cc/FSD8-NS3Q].

[24] Id.

Published:
Monday, April 3, 2023