When a fake porn video purporting to depict Gal Gadot having sex with her stepbrother surfaced online in December 2017, the reaction was swift. Vice—the outlet that first reported on the video—was quick to highlight the way that face-swap technology could be used to manufacture a wholly new form of “revenge porn,” one in which victims could find themselves featured in explicit sexual media without ever taking off their clothes in front of a camera.
Two years later, a number of powerful entities have taken steps to push back against the threat posed by deceptive videos, especially in the realm of politics. Facebook announced a ban on deepfakes last week, while Congress has lately earmarked millions of dollars for the development of technology to detect them. But the fear of sexual deepfakes still looms large in conversations about online harassment. Politicians, celebrities, even regular, unfamous women with vindictive ex-boyfriends have all been held up as potential victims of this high-tech form of abuse. Less discussed, however, is the impact deepfakes will have on a wholly different group of women: the actors whose bodies and sexual performances are used as the basis for manufactured pornography.
Even as deepfake technology has gotten significantly more convincing, it’s still rare to see a video that’s been created completely from scratch. In most cases, deepfake creators are just grafting one person’s face onto another person’s body—rendering, for instance, the porn performer Pepper XO into the movie star Gal Gadot.
Porn performers whose scenes are manipulated in this fashion aren’t the direct subjects of this abuse. Most of the time, they’ll never even know about the deepfakes that were based on their work. But that doesn’t reduce the sense of violation that many industry members feel when confronted with the phenomenon—or the fears they have about how this technology could be used to harass and abuse them, as well.
Sex worker Sydney Leathers hasn’t come across any deepfakes based on her porn scenes, but she is intimately familiar with what it feels like to have her work used to harass and debase another woman. In January 2019, a bathtub selfie that Leathers had posted to Instagram began circulating online, repackaged as a naked photo of newly elected congressmember Alexandria Ocasio-Cortez.
For Leathers, who only heard about the photo once it had been debunked and correctly identified in the media, the experience was deeply upsetting. It felt, she says, “a little violating,” particularly since Ocasio-Cortez is a politician she likes and respects. “You don’t want to feel like you’re a part of [the harassment of other women],” Leathers continues. But when someone repackages your porn scene as a tool of abuse, you don’t get a choice in the matter.
The idea of being weaponized as a deepfake feels “more objectifying than any of the made-up reasons why people who hate porn think it’s objectifying,” says Ela Darling, an adult film performer and director of marketing at ViRo Club, an erotic VR platform. “That would literally make me into a tool they’re using to harm someone else. I’d feel eviscerated.” The knowledge that she’d play no active role in the creation of the deepfake doesn’t provide Darling with much comfort. “I would feel used,” she says, describing the idea as “skin-crawlingly gross.”
Working as a porn performer already means getting accustomed to having your image repackaged and distributed in ways that you’re not fully comfortable with. Numerous performers have had the experience of seeing a scene that was shot in one context appear online with a completely different framing. A loving scene involving a black performer and a white performer may wind up being advertised with racist keywords and descriptions; a performer in her mid-twenties may be shocked to see herself labeled as a “MILF” or “cougar” when a scene lands online.
As piracy has become rampant over the past 10 years or so, that effect has become even worse. You may trust the production companies you shoot with to respect your boundaries, but once their content has been downloaded and reuploaded across the internet, there’s no telling what language will be used to advertise your work—or how a scene you may have created will be cut up, edited, and utterly divorced from your original understanding of what you were creating.
With deepfake technology, the same dynamic gets taken to a horrifying new level. Instead of merely worrying that your own image will be distributed in a way that feels off-brand or even offensive, performers must now consider the possibility that their bodies will be cut up into parts and reassembled, Frankenstein-style, into a video intended to harass and humiliate someone who never consented to be sexualized in this manner. It is, Leathers tells me, hardly the first thing performers think about when they’re getting ready to shoot a scene. Porn performers aren’t accountable for the abusive campaigns for which their work gets hijacked, but as these violations become more common, the discomfort of seeing their output transformed into harassment will become yet another occupational hazard.
Porn performers have already begun to think about these hazards in the context of another new technology. A handful of adult companies have begun exploring ways to scan performers’ bodies and store them as digital avatars, ones that could then be molded into any conceivable position. The current products don’t quite pass for real performers: Camasutra’s VR adventures look like high-end CGI, and performer Tori Black compared her Holodexxx avatar to a “wax doll.” But it’s not farfetched to imagine that, as technology continues to improve, these avatars might one day be indistinguishable from real-life actors. And worries like those elicited by deepfake videos have already started to emerge: According to Darling, one company’s demo went awry after a performer couple who’d agreed to be scanned were horrified to discover that their digital avatars could be paired off with anyone. In real life, they were adamant about only performing with each other.
Whether these sorts of manipulations are done via avatar or deepfakes, being forced into a scene with an unwanted partner represents the milder end of potential harms. Darling worries that constructed scenarios could also be used in ways that negatively impact the real lives, and livelihoods, of the performers they are based on. A performer who’s been holding out on doing an anal scene until she gets offered a rate she feels comfortable with could find herself scooped by her digital counterpart; a performer who is morally opposed to performing in scenes with racist themes might find that her swapped face or avatar does not share those same compunctions.
At ViRo Club, Darling has advocated for a content policy that would give performers complete control over how their avatars are used, and she hopes that the rest of the industry will follow that model. But even if this were an industry-wide policy, there’s no reason to think that the deepfakes crowd would feel similarly obligated. Worse, if porn performers’ poseable nude bodies could be successfully combined with face-swap technology, performers will have to contend with the possibility that their bodies will become headless marionettes acting out any scenario whatsoever, involving anyone.
We’re not at the point where that’s a real possibility, and it may be a while before creating that kind of content is a trivial matter—if it ever happens at all. Still, one can reasonably assume that these manipulative technologies will improve in the months and years to come, and that we’ll only encounter more and more opportunities for abuse and harassment. It’s natural to focus on the harms that will be done to the direct subjects of these digital creations, and easy to think that the major violation is that of being rendered publicly nude and sexualized without consent. But we ought to expand our lens beyond that obvious abuse to comprehend the full extent of harm from deepfakes—to understand that someone’s unwanted sexualization and exposure is only part of the story. These videos also have the effect of turning performers into digital puppets, manipulated without any concern for their humanity and dignity.