34th Street Magazine

Focus

False Bodies Online

The internet was once incorporeal. Now, the rise of artificial intelligence and deepfakes is generating new forms of cybersexual violence and threatening bodily autonomy online.


Being online used to be a form of escape from the physical body. In chatrooms and early virtual worlds,
identities could be flattened into usernames and molded into customizable avatars. For most people, it was
a space free from the physical conditions that govern the world and our bodies.

“There was this idea among some of the early internet vanguards that the internet would be a disembodied place … a place where it didn’t matter what your body was,” says Sophie Maddocks, a researcher of cybersexual violence at the Annenberg School for Communication. “Once you were on the internet, you would kind of be able to create a new body or create a new digital representation of yourself, and your physical body didn’t matter.”

Early online worlds, like multi–user dungeons and multi–user shared hallucinations, were mostly text–based, allowing users to interact in real time through words and descriptions without the need for images. The primeval internet was supposed to be a great equalizer, taking strides to democratize information and act as an incorporeal escape from the physical world.

But in 1993, the first cybersexual assault occurred on one of these multi–user platforms. Described in Julian Dibbell’s article “A Rape in Cyberspace,” it involved one user’s avatar sexually violating others with a hack that forced them to perform disturbing sexual acts on themselves and others. Suddenly, the issue of the body and its autonomy, once thought to be limited to the physical world, arose online.

As technology has advanced, so has the ability to commit acts of violence in cyberspace. The advent of web 2.0, a more interactive, social network–based version of the internet, was quickly followed by a sharp increase in cyberbullying, stalking, and other forms of online harassment and abuse. Revenge porn has existed as long as photography has, but it has become increasingly common with improvements in image sharing. Now, with only a few brief commands, deepfake technology allows users to undress and create pornographic images of anyone whose photos exist online.

Deepfakes trace their origins back to 2017 on Reddit, when users began using face–swapping technology to create pornographic media of celebrities. They soon infiltrated most spheres of the internet, and in recent years have played significant roles in politics (think, for example, of President Donald Trump’s (W ’68) prolific use of artificial intelligence–generated content on his X feed). Scams, propaganda, and data manipulation have been present throughout history, but never before have they been so convincing or accessible.

In 2019, “nudify” apps emerged: Users could simply insert a photograph of a real person, and the software would instantly undress them. Individuals of any gender identity became targets. In fact, sextortion—blackmail using sexual images, increasingly generated ones—disproportionately affects teenage boys. But the vast majority of deepfakes are used to undress the female body: A 2023 study found that 98% of deepfake videos are pornographic, that 99% of the individuals victimized are women, and that 74% of deepfake pornography users reported no guilt about their use. Many of these women are well–known public figures, but many are not. Increasingly, the victims of deepfaking are classmates, neighbors, and coworkers.

“It’s not that these things don’t happen to people across the gender spectrum,” says Maddocks. “It’s just that they aren’t driven by the same desire to shame and humiliate, and to profit and to exploit that has always been a factor for feminized bodies.”

In many ways, misogyny and the surveillance of female bodies have been intrinsic to the internet and social media since their conception. One of the first photos used to test digital image processing was Lena, a photo of Swedish model Lena Forsén. The picture was ripped out of the centerfold of a Playboy magazine and put onto a scanner to be immortally digitized. Many are already familiar with Facebook’s beginnings as Facemash, a website Mark Zuckerberg built in his Harvard dorm that asked users to rank his female classmates by attractiveness. Sites like Reddit and 4chan have facilitated a renaissance of incel culture, which today permeates much of the internet and big tech. Now, that misogyny stands to do even more damage with deepfake technology at its disposal.

“The capacity to strip feminized bodies, without actually filming someone secretly, without actually stealing someone’s nude images … It’s almost just like giving them an extra tool in their arsenal,” says Maddocks. “[It’s] giving them a really easy, really fast, free, simple way to demean and shame and silence women.”

Cybersexual harassment and abuse, while sharing some similarities with their physical counterparts, are separate issues and often require distinct solutions to address. Compared with physical reality, users in cyberspace have a greater ability to remain anonymous—moreover, images and messages online can be widely and rapidly shared, making them almost impossible to remove once posted.

“We know [from] therapists who work with individuals who’ve experienced cybersexual harms [that they] have a different type of ongoing PTSD,” explains Maddocks. “[If] somebody creates a fake nude image of you, it’s impossible to fully scrub that from the internet. There’s always a chance it could be reposted or screenshotted, and so the harm never ends.”

Victims of deepfaking often experience significant psychological symptoms, from anxiety and insomnia to hypervigilance and forms of reality–identity dissonance. The experience of seeing your likeness fabricated and made a victim of sexual exploitation in this way has the potential to rupture your sense of self, reality, and
bodily autonomy.

Deepfakes are also a concern for online sex work: Sites like OnlyFans are being inundated by AI “models,” taking business away from real sex workers. Those creating deepfakes, mainly men, are not only exploiting bodies without consent, but also monetizing and profiting from them.

“The internet is a deeply unregulated place. It’s decentralized and rapidly evolving, making it difficult for governance to keep up. That doesn’t mean it’s impossible, though,” says Maddocks. “There’s this dominant idea that the internet’s just this hella toxic place, and like, ‘Good luck surviving online.’ I just don’t think that’s really realistic. I think if we had the will and if it was incentivized, there would be more done.”

AI is testing our ability to differentiate between what’s real and what’s been generated in a way many are unprepared for. Deepfakes represent a sinister and unregulated threat to safety and reputations, but also yet another dangerous degree of separation from reality. The internet, which used to be a reprieve from the physical world, is now trying to replicate it. What it has become, however, is a reality completely under the control of those programming it—one in which the bodies of real people are crushed under the power of code.
