Behind the Faces: The Untold Exploitation of Bodies in Deepfake Porn

Introduction

When Jennifer ran her professional headshot through a facial recognition program in 2023, she expected to find old porn videos from her early 20s. What she didn't expect was a deepfake—one of her videos altered with a different face. The discovery shattered her sense of privacy and control over her own image.

Source: www.technologyreview.com

“At first, I thought it was just a different person,” says Jennifer, now a psychotherapist in New York City, who uses a pseudonym for safety. But then she recognized the garish background from a 2013 video. “Somebody used me in a deepfake.”

The facial recognition tech identified her because traces of her features remained—cheekbones, brow, chin. “It’s like I’m wearing somebody else’s face like a mask,” she explains.

The Forgotten Victims: Bodies in Deepfake Porn

Conversations about sexualized deepfakes, a form of nonconsensual intimate imagery (NCII), often focus on the faces superimposed onto pornographic content—celebrities like Scarlett Johansson, or increasingly, private individuals. The bodies underneath rarely come up. “There’s never any discussion about: Whose body is this?” Jennifer says.

Adult Content Creators as the Default Source

For years, the answer has largely been adult content creators. Deepfakes earned their name in November 2017 when a Reddit user with the handle “deepfakes” posted videos of celebrities’ faces pasted onto porn actors’ bodies. Attorney Corey Silverstein, who specializes in adult industry law, says this nonconsensual use “happens all the time.”

But the exploitation goes beyond isolated uploads. Performers’ bodies are being systematically harvested to train generative AI models, which then create entirely new—yet eerily realistic—bodies and movements. This transforms performers into unwitting training data.

The Shift from Direct Theft to Data Mining

In the past, deepfake creators might directly lift a porn actor’s body from an existing video and attach a different face. Today, advancements in generative AI have made it possible to absorb thousands of bodies into a model, then generate synthetic nudes that bear no single identifiable source—yet are built on the unpaid labor of adult performers.

“It’s not just about one stolen video anymore,” says Silverstein. “It’s about an entire archive of work being used without permission to create new content.” This threatens performers’ livelihoods, as AI-generated nudes can be produced faster, cheaper, and in infinite variety, potentially undercutting their market.

The Role of Nudify Apps

So-called “nudify” apps have exacerbated the problem. These tools allow users to upload a photo of a clothed person and generate a nude version, often trained on real adult content. The apps are marketed as harmless fun, but they strip consent at every level—both the face and the body are exploited without permission.

Implications for Performers’ Rights and Future

Adult content creators are now facing a dual threat: their own images are used as raw material for deepfakes, and the AI trained on their work can generate content that competes directly with them. “It’s like having your body cloned and sold without your permission,” says Jennifer.

Legal frameworks are struggling to keep up. While some states have passed laws against nonconsensual deepfakes, these often focus on the face—the victim whose identity is stolen—rather than the body that provides the intimate performance. Performers have little recourse when their bodies become invisible training data.

A Call for Inclusive Legislation

“We need laws that protect everyone involved in the creation of these images,” Silverstein argues. That means acknowledging adult content creators as victims of NCII when their bodies are used without consent, and requiring clear attribution or compensation for any work used in AI training sets.

Conclusion

Jennifer’s story is a stark reminder that deepfake exploitation is not only about celebrities or private individuals whose faces are stolen. It is also about the thousands of porn actors whose bodies are commodified, anonymized, and erased in the rush to create realistic AI pornography. As the technology advances, society must broaden its understanding of consent—and ensure that every person, whether contributing a face or a body, has the right to say no.
