On March 21, 2023, deepfake images of federal agents “arresting” former President Donald Trump were posted on Twitter the same day Trump predicted his own arrest. The fake photos were viewed by millions after a Twitter user, as a joke, asked an AI art generator to create a photo of “Donald Trump falling down while being arrested.” But to internet viewers, the photos appeared startlingly real.
In the modern age of misinformation, deepfake technology threatens our perception of reality. Deepfake technology “use[s] a form of artificial intelligence called deep learning to make images of fake events.” As the technology matures, it continues to get smarter. Researchers have developed programs to detect deepfakes, and in turn, the AI is “trained” against those programs to avoid detection, allowing it to create even more realistic images. Deepfake technology is not limited to photographs: both video and audio can be convincingly mimicked as well.
Although deepfakes are often produced as a joke, such as the viral TikTok trend of AI-generated videos starring President Joe Biden as he plays video games with the likes of Barack Obama, Donald Trump, and Hillary Clinton, deepfakes also harbor a more sinister purpose. According to a 2019 study conducted by Deep Trace Labs, 96% of all deepfake videos online are pornographic, and 99% of deepfake pornography targets women. Researchers report that the number of deepfake pornography videos available online has continued to increase exponentially, doubling nearly every year since 2018. “Deepfake pornography is a phenomenon that exclusively targets and harms women,” victimizing them by mapping women’s faces onto pornographic actors’ bodies. Deepfake pornography is inherently nonconsensual: because the media is synthetically generated, the victims depicted cannot consent to it. While deepfakers initially targeted celebrity women because of the expansive access to their likenesses, the deepfake pornography sphere has grown to include even those with small online footprints as the technology has advanced. In January 2023, the popular Twitch streamer “Atrioc” was caught purchasing and viewing deepfake pornography of other Twitch streamers. Not only did Atrioc monetarily support this invasive media, but by going viral, he gave the website and deepfake pornography a platform for engagement.
While deepfake technology is relatively new, women have been dealing with fake pornography for centuries. Before deepfake pornography, women were targeted with Photoshop, and before Photoshop, women in the 1800s were warned to pose for photographs or paintings only with men they trusted to avoid “an ungentlemanly photographer [who] could use the[ir likeness] to . . . mak[e] composites that featured the sitter’s head on a scantily dressed body – or worse.” Although the notion of synthetic pornography is not novel, deepfake technology allows for the creation of far more sophisticated fake pornography that continues to target women.
While women often bear the brunt of targeted deepfake attacks, on a widespread level, deepfakes also pose a threat to our social reality. Deepfake images circulating online can erode public trust in the validity of any media. “When we enter this world where anything can be fake – any image, any audio, any video, any piece of text, nothing has to be real,” says Hany Farid, a professor at the University of California, Berkeley’s School of Information. According to Nasir Memon, an NYU professor of computer science and engineering: “[a]s a consequence of this, even truth will not be believed. The man in front of the tank at Tiananmen Square moved the world. Nixon on the phone cost him his presidency. Images of horror from concentration camps finally moved us into action. If the notion of . . . believing what you see is under attack, that is a huge problem.”
There have been multiple instances of deepfake media being used as an untruthful weapon: a high-school cheerleader’s mother was accused of deepfaking inappropriate media to disqualify a cheerleader on an opposing team; a deepfake audio message mimicking a company executive convinced employees of a firm insured by Euler Hermes to wire $243,000 to cybercriminals; and a mother in a custody battle created deepfake audio of her child’s father violently threatening her as courtroom sabotage. Deepfaked media has created, and will continue to create, trust issues in the courtroom. “[E]ven in cases that do not involve fake videos, the very existence of deepfakes will complicate the task of authenticating real evidence.” Our fundamental institutions, such as political elections and the courts, may be undermined by the doubt instilled by deepfake media.
Alarmingly, deepfakes are largely free from specific legal regulation. Only four states have laws explicitly regulating deepfake technology: New York, Virginia, California, and Georgia. This does not mean that a deepfake victim has zero recourse: a victim of deepfake technology can potentially bring defamation or harassment claims against the privacy infringer. But deepfakes are typically posted and consumed by anonymous users, making redress difficult, if not impossible. Without explicit deepfake laws, it is vital that internet users are aware of the existence of deepfake media and know how to verify that what they are seeing is, in fact, the truth.
 Zachary B. Wolf, Beware Deepfake Reality as Trump Dominates Headlines, CNN (Mar. 25, 2023, 7:49 PM), https://www.cnn.com/2023/03/25/politics/ai-trump-what-matters [https://perma.cc/8S6H-SJN6].
 Ian Sample, What are Deepfakes – and How Can You Spot Them?, The Guardian (Jan. 13, 2020, 5:00 AM), https://www.theguardian.com/technology/2020/jan/13/what-are-deepfakes-and-how-can-you-spot-them [https://perma.cc/EY5A-6KGS].
 Frederick Dauer, Law Enforcement in the Era of Deepfakes, Police Chief Mag. (June 22, 2022), https://www.policechiefmagazine.org/law-enforcement-era-deepfakes/#:~:text=Due%20to%20a%20lack%20of,election%20and%20nonconsensual%20deepfake%20pornography [https://perma.cc/UK8E-QYZ2].
 Sample, supra note 4.
 Allegra Rosenberg, AI-Generated Audio of Joe Biden and Donald Trump Trashtalking While Gaming Is Taking over TikTok, Bus. Insider (Mar. 1, 2023, 1:06 PM), https://www.businessinsider.com/voice-ai-audio-joe-biden-donald-trump-tiktok-2023-3 [https://perma.cc/448W-H3NN].
 Henry Adjer et al., The State of Deepfakes, Deep Trace Labs 1 (Sept. 2019).
 Bianca Britton, They Appeared in Deepfake Porn Videos Without Their Consent. Few Laws Protect Them., NBC News (Feb. 14, 2023, 2:48 PM), https://www.nbcnews.com/tech/internet/deepfake-twitch-porn-atrioc-qtcinderella-maya-higa-pokimane-rcna69372 [https://perma.cc/H6HF-BZMP].
 Adjer et al., supra note 8, at 2.
 Id.; Riana Pfefferkorn, Deepfakes in the Courtroom, 29 Pub. Int. L.J. 245, 252 (Sept. 2019).
 Britton, supra note 9.
 Ruby Harris, How it Feels to Find Your Face Photoshopped onto Internet Porn, Vice (Apr. 17, 2019, 7:46 PM), https://www.vice.com/en/article/gy4p47/how-it-feels-to-find-your-face-photoshopped-onto-internet-porn [https://perma.cc/LKP9-DR4Q].
 Lynn Berger, Photography Distinguishes Itself: Law and the Emerging Profession of Photography in the Nineteenth-Century United States 248 (Feb. 26, 2016) (unpublished Ph.D. dissertation, Columbia University) (on file with the Columbia University Library system).
 Dauer, supra note 5.
 Wolf, supra note 2.
 William A. Galston, Is Seeing Still Believing? The Deepfake Challenge to Truth in Politics (Jan. 8, 2020), https://www.brookings.edu/research/is-seeing-still-believing-the-deepfake-challenge-to-truth-in-politics [https://perma.cc/VV5R-QU74].
 Dauer, supra note 5.
 Adjer et al., supra note 8, at 14.
 Patrick Ryan, ‘Deepfake’ Audio Evidence Used in UK Court to Discredit Dubai Dad, The National (Feb. 8, 2020), https://www.thenationalnews.com/uae/courts/deepfake-audio-evidence-used-in-uk-court-to-discredit-dubai-dad-1.975764 [https://perma.cc/E2F3-9PTJ].
 Pfefferkorn, supra note 11, at 255.
 Britton, supra note 9.
 Will Walker, Mission Impossible?: The Legal Implications of Managing Deepfake Celebrity Videos, Harv. J. Sports Ent. L. (Mar. 24, 2021), https://harvardjsel.com/2021/03/mission-impossible-the-legal-implications-of-managing-deepfake-celebrity-videos [https://perma.cc/FSD8-NS3Q].