{"id":22869,"date":"2026-05-14T09:26:35","date_gmt":"2026-05-14T09:26:35","guid":{"rendered":"https:\/\/ideainthebox.com\/index.php\/2026\/05\/14\/ai-porn-nonconsensual-deepfakes-takedown-piracy-copyright\/"},"modified":"2026-05-14T09:26:35","modified_gmt":"2026-05-14T09:26:35","slug":"ai-porn-nonconsensual-deepfakes-takedown-piracy-copyright","status":"publish","type":"post","link":"https:\/\/ideainthebox.com\/index.php\/2026\/05\/14\/ai-porn-nonconsensual-deepfakes-takedown-piracy-copyright\/","title":{"rendered":"The shock of seeing your body used in deepfake porn\u00a0"},"content":{"rendered":"<div>\n<p>When Jennifer got a job doing research for a nonprofit in 2023, she ran her new professional headshot through a facial recognition program. She wanted to see if the tech would pull up the porn videos she\u2019d made more than 10 years before, when she was in her early 20s. It did in fact return some of that content, and also something alarming that she\u2019d never seen before: one of her old videos, but with someone else\u2019s face on her body.<\/p>\n<div class=\"wp-block-group is-layout-constrained wp-block-group-is-layout-constrained\">\n<p>\u201cAt first, I thought it was just a different person,\u201d says Jennifer, who is being identified by a pseudonym to protect her privacy.\u00a0<\/p>\n<p>But then she recognized a distinctly garish background from a video she\u2019d shot around 2013, and she realized: \u201cSomebody used me in a deepfake.\u201d<\/p>\n<p>Eerily, the facial recognition tech had identified her because the image still contained some of Jennifer\u2019s features\u2014her cheekbones, her brow, the shape of her chin. 
\u201cIt\u2019s like I\u2019m wearing somebody else\u2019s face like a mask,\u201d she says.\u00a0<\/p>\n<\/div>\n<figure class=\"wp-block-pullquote alignright\">\n<blockquote>\n<p>\u201cIt\u2019s like I\u2019m wearing somebody else\u2019s face like a mask.\u201d<\/p>\n<\/blockquote>\n<\/figure>\n<p>Conversations about sexualized deepfakes\u2014which fall under the umbrella of nonconsensual intimate imagery, or NCII\u2014most often center on the people whose <em>faces<\/em> are featured doing something they didn\u2019t really do or on bodies that aren\u2019t really theirs. These are often popular celebrities, though over the past few years more people (<a href=\"https:\/\/www.technologyreview.com\/2021\/02\/12\/1018222\/deepfake-revenge-porn-coming-ban\/\">mostly women<\/a> and sometimes <a href=\"https:\/\/www.technologyreview.com\/2023\/12\/01\/1084164\/deepfake-porn-scandal-pushing-us-lawmakers\/\">youths<\/a>) have been targeted, sparking alarm, fear, and even legislation. But these discussions and societal responses usually are not concerned with the <em>bodies<\/em> the faces are attached to in these images and videos.<\/p>\n<p>As Jennifer, now 37 and a psychotherapist working in New York City, says: \u201cThere\u2019s never any discussion about <em>Whose body is this<\/em>?\u201d\u00a0<\/p>\n<p>For years, the <a href=\"https:\/\/www.technologyreview.com\/2021\/09\/13\/1035449\/ai-deepfake-app-face-swaps-women-into-porn\/\">answer<\/a> <a href=\"https:\/\/www.technologyreview.com\/2021\/02\/12\/1018222\/deepfake-revenge-porn-coming-ban\/\">has<\/a> generally been adult content creators. Deepfakes in fact <a href=\"https:\/\/datasociety.net\/wp-content\/uploads\/2019\/09\/DS_Deepfakes_Cheap_FakesFinal-1-1.pdf\">earned their name<\/a> back in November 2017, when someone with the Reddit username \u201cdeepfakes\u201d uploaded videos showing faces of stars like Scarlett Johansson and Gal Gadot pasted onto porn actors\u2019 bodies. 
The nonconsensual use of their bodies \u201chappens all the time\u201d in deepfakes, says Corey Silverstein, an attorney specializing in the adult industry.\u00a0<\/p>\n<p>But more recently, as generative AI has improved, and as \u201cnudify\u201d <a href=\"https:\/\/www.technologyreview.com\/2022\/12\/12\/1064751\/the-viral-ai-avatar-app-lensa-undressed-me-without-my-consent\/?truid=*%7CLINKID%7C*&amp;utm_source=the_download&amp;utm_medium=email&amp;utm_campaign=the_download.unpaid.engagement&amp;utm_term=*%7CSUBCLASS%7C*&amp;utm_content=*%7CDATE:m-d-Y%7C*\">apps<\/a> have begun to proliferate, the issue has grown far more complicated\u2014and, arguably, more dangerous for creators\u2019 futures.\u00a0<\/p>\n<p>Porn actors\u2019 bodies aren\u2019t necessarily being taken directly from sexual images and videos anymore, or at least not in an identifiable way. Instead, they are <a href=\"https:\/\/www.technologyreview.com\/2022\/12\/12\/1064751\/the-viral-ai-avatar-app-lensa-undressed-me-without-my-consent\/\">inevitably being used<\/a> as <a href=\"https:\/\/hai.stanford.edu\/policy\/addressing-ai-generated-child-sexual-abuse-material-opportunities-for-educational-policy\">training data<\/a> to inform how new AI-generated bodies look, move, and perform. This threatens the livelihood and rights of porn actors as their work is used to train AI nudes that in turn could take away their business. And that\u2019s not all: Advancements in AI have also made it possible for people to wholly re-create these performers\u2019 likenesses without their consent, and the AI copycats may do things the performers wouldn\u2019t do in real life. 
This could mean their digital doubles are participating in certain sex acts that they haven\u2019t agreed to do, or even that they\u2019re perpetrating scams against fans.\u00a0<\/p>\n<p>Adult content creators are already marginalized by a society that largely fails to protect their safety and rights, and these developments put them in an even more vulnerable position. After Jennifer found the deepfake featuring her body, she posted on social media about the psychological effects: \u201cI\u2019ve never seen anyone ask whether that might be traumatic for the person whose body was used without consent too. IT IS!\u201d Several other creators I spoke with shared the mental toll that comes with knowing their bodies have been used nonconsensually, as well as the fear that they\u2019ll suffer financially as other people pirate their work. Silverstein says he hears from adult actors every day who \u201care concerned that their content is being exploited via AI, and they\u2019re trying to figure out how to protect it.\u201d\u00a0<\/p>\n<p>One law professor and expert in violence against women calls these creators the \u201cforgotten victims\u201d of NCII deepfakes. And several of the people I spoke with worry that as the US develops a legal framework to combat nonconsensual sexual content online, adult actors are only at risk of further injury; instead of helping them, the crackdown on deepfakes may provide a loophole through which their content and careers could be stripped from the internet altogether.<\/p>\n<h3 class=\"wp-block-heading\">How deepfakes cause \u201cembodied harms\u201d<\/h3>\n<p>During his preteen years in the 1970s, Spike Irons, now a porn actor and president of the adult content platform XChatFans, was \u201cin love\u201d with Farrah Fawcett. Though Fawcett did not pose nude, Irons managed to get his hands on what looked like pictures of her naked. \u201cPeople were cutting out faces and pasting them on bodies,\u201d Irons says. 
\u201cDeepfakes, before AI, had been going around for quite a while. They just weren\u2019t as prolific.\u201d<\/p>\n<p>The early public internet was rife with websites capitalizing on the idea that you could use technology to \u201csee\u201d celebrities naked. \u201cPeople would just use Microsoft Paint,\u201d says Silverstein, the attorney. It was a simple way to mash up celebrities\u2019 faces with porn.\u00a0<\/p>\n<p>People later used software like Adobe After Effects or FakeApp, which was designed to <a href=\"https:\/\/medium.com\/@johnfakersnorth\/fixing-fakeapp-cc04c1d4eb13\">swap two individuals\u2019 faces<\/a> in images or videos. None of these programs required serious expertise to alter content, so there was a low barrier to entry. That, plus the wealth of porn performers\u2019 videos online, helped make face-swap deepfakes that used real bodies prevalent by the 2010s. When, later in the decade, deepfakes of <a href=\"https:\/\/www.vice.com\/en\/article\/deepfake-videos-like-that-gal-gadot-porn-are-only-getting-more-convincing-and-more-dangerous\/\">Gal Gadot<\/a> and <a href=\"https:\/\/www.bbc.com\/bbcthree\/article\/779c940c-c6c3-4d6b-9104-bef9459cc8bd\">Emma Watson<\/a> caused something of a broader panic, their faces were allegedly swapped onto the bodies of the porn actors <a href=\"https:\/\/www.wired.com\/story\/deepfake-porn-harms-adult-performers-too\/\">Pepper XO<\/a> and <a href=\"https:\/\/x.com\/missmarymoody\/status\/2024708860905476400?s=46&amp;t=IKyaznX8LG4sV4E1oHKdGg\">Mary Moody<\/a>, respectively.<\/p>\n<p>But it wasn\u2019t just high-profile actors like them whose bodies were being used. Jennifer was \u201ca very minor performer,\u201d she says. 
\u201cIf it happened to me, I feel like it could happen to anybody who\u2019s shot porn.\u201d Since he started his practice in 2006, Silverstein says, \u201cnumerous clients\u201d have reached out to report \u201c<em>This is my body on so-and-so<\/em>.\u201d\u00a0<\/p>\n<p>Both people whose faces appear in NCII deepfakes and those whose bodies are used this way can feel serious distress. Experts call this type of damage \u201cembodied harms,\u201d says Anne Craanen, who researches gender-based violence at the UK\u2019s Institute for Strategic Dialogue, an organization that analyzes extremist content, disinformation, and online threats.\u00a0<\/p>\n<p>The term reflects the fact that even though the content exists in the virtual realm, it can cause physiological effects, including body dysmorphia. The face-swapped entity occupies the uncanny valley, distorting self-perception. After discovering their faces in sexual deepfakes, many people feel silenced, experts told me; they may \u201cself-censor,\u201d as Craanen puts it, and step back from public-facing life. 
Allison Mahoney, an attorney who works with abuse survivors, says that people whose faces appear in NCII can experience depression, anxiety, and suicidal ideation: \u201cI\u2019ve had multiple clients tell me that they don\u2019t sleep at night, that they\u2019re losing their hair.\u201d\u00a0<\/p>\n<figure class=\"wp-block-pullquote alignright\">\n<blockquote>\n<p>Independent creators aren\u2019t just \u201chaving sex on camera.\u201d For someone to rip off their work \u201cfor their own entertainment or financial gain fucking sucks.\u201d<\/p>\n<\/blockquote>\n<\/figure>\n<p>Though the impact on people whose <em>bodies<\/em> are used hasn\u2019t been discussed or studied as often, Jennifer says that \u201cit\u2019s just a really terrible feeling, knowing that you are part of somebody else\u2019s abuse.\u201d She sees it as akin to \u201ca new form of sexual violence.\u201d<\/p>\n<p>The uncertainty that comes with not being aware of what your body is doing online can be highly unsettling. Like Jennifer, many adult actors don\u2019t really know what\u2019s out there. But some devoted followers know the actors\u2019 bodies well\u2014often recognizing tattoos, scars, or birthmarks\u2014and \u201cvery quickly they bring [deepfakes] to the adult performer\u2019s attention,\u201d says Silverstein. Or performers will stumble upon the content by chance; some 20 years ago, for instance, the first such client to tell Silverstein her body was being used in a deepfake happened to be searching Nicole Kidman online when she found that one of the results showed Kidman\u2019s face on her porn. \u201cShe was devastated, obviously, because they took her body,\u201d he says, \u201cand they were monetizing it.\u201d\u00a0<\/p>\n<p>Otherwise, this imagery may be found by an organization like Takedown Piracy, one of several copyright enforcement companies serving adult content creators. 
US copyright violations can be challenging to prove if someone\u2019s body lacks distinguishing features, says Reba Rocket, Takedown Piracy\u2019s chief operating and marketing officer. But Rocket says her team has added digital fingerprinting technology to clients\u2019 material to help flag and remove problematic videos, often finding them before clients realize they\u2019re online.\u00a0<\/p>\n<p>By capturing \u201ctens of thousands of tiny little visual data points\u201d from videos, digital fingerprinting creates unique corresponding files that can be used to identify them, Rocket says\u2014kind of like an invisible watermark. The prints remain even if pirates alter the videos or replace performers\u2019 faces. Takedown Piracy has digitally fingerprinted more than half a billion videos, and the organization has gotten <a href=\"https:\/\/transparencyreport.google.com\/copyright\/reporters\/1620?hl=en\">130 million copyrighted videos taken down from Google<\/a> alone (though Rocket hasn\u2019t tracked how many of those specifically include someone else\u2019s face on a performer\u2019s body).\u00a0<\/p>\n<p>Besides copyright, a range of legal tools can be used to try to combat NCII, says Eric Goldman, a law professor at Santa Clara University. For example, victims can claim invasion of privacy. But using these tools isn\u2019t particularly straightforward, and they may not even apply when it comes to someone\u2019s body. 
If there aren\u2019t, for instance, unique markers indicating that a body in a deepfake belongs to the person who says it does, US law \u201cdoesn\u2019t really treat [this content] as invasion of privacy,\u201d Goldman says, \u201cbecause we don\u2019t know who to attribute it to.\u201d<\/p>\n<p>In a <a href=\"https:\/\/papers.ssrn.com\/sol3\/papers.cfm?abstract_id=2953747\">2018 study that reviewed<\/a> \u201cjudicial resolution\u201d of cases involving NCII, Goldman found that one way plaintiffs were able to win cases was to assert \u201cintentional infliction of emotional distress.\u201d But again, that hinges on the ability to clearly identify the person in the content. Relevant statutes, he adds, might also require \u201cintent to harm the individual,\u201d which may be hard to show for people whose bodies alone are featured.<\/p>\n<h3 class=\"wp-block-heading\">\u201cAI girls will do whatever you want\u201d<\/h3>\n<p>In the last few years, Silverstein says, it\u2019s become less and less common to see the bodies of real adult content creators in deepfakes, at least in a way that makes them clearly identifiable.\u00a0<\/p>\n<p>Sometimes the bodies have been manipulated using AI or simpler editing tools. This can be as basic as erasing a birthmark or changing the size of a body part\u2014minor edits that make it impossible to identify someone\u2019s image beyond a reasonable doubt, so even porn actors who can tell that an altered image used their body as a base won\u2019t get very far in the legal realm. \u201cA lot of people are like, <em>That looks like my body<\/em>,\u201d says Silverstein, but when he asks them how, they\u2019ll reply, <em>It just does<\/em>.\u00a0<\/p>\n<p>At the same time, other users are now creating NCII with wholly AI-generated bodies. In \u201cnudify\u201d apps, anyone with a minimal grasp of technology can upload a photo of someone\u2019s clothed body and have it replaced with a fake naked one. 
\u201cSo [much] of this content being created is just someone\u2019s face on an AI body,\u201d Silverstein says.<\/p>\n<p>Such apps have drawn a ton of attention recently, from <a href=\"https:\/\/www.techpolicy.press\/tracking-regulator-responses-to-the-grok-undressing-controversy\/\">Grok \u201cnudifying\u201d minors<\/a> to Meta <a href=\"https:\/\/www.404media.co\/instagram-ads-send-this-nudify-site-90-percent-of-its-traffic\/?ref=daily-stories-newsletter\">running ads for<\/a>\u2014and then <a href=\"https:\/\/www.404media.co\/meta-sues-nudify-app-that-keeps-advertising-on-instagram\/?ref=daily-stories-newsletter\">suing<\/a>\u2014the nudify app Crushmate. But there\u2019s been relatively little attention paid to the content being used to train them. They almost certainly draw on the <a href=\"https:\/\/pmc.ncbi.nlm.nih.gov\/articles\/PMC9252028\/\">more than 10,000 terabytes<\/a> of online porn, and performers have virtually zero recourse.\u00a0<\/p>\n<p>One reason is that creators aren\u2019t able to demonstrate with any certainty that their content <em>is<\/em> being used to train AI models like those used by nudify apps. \u201cThese things are all a black box,\u201d says Hany Farid, a professor at the University of California, Berkeley, who specializes in digital forensics. But \u201cgiven the ubiquity\u201d of adult content, he adds, it\u2019s a \u201creasonable assumption\u201d that online porn is being used in AI training.\u00a0<\/p>\n<p>\u201cIt\u2019s just not at all difficult to come up with pornographic data sets on the internet,\u201d says Stephen Casper, a computer science PhD student at MIT who researches deepfakes. 
What\u2019s more, he says, plenty of shadowy online communities provide \u201cuser guides\u201d on how to use this data to train AI, and in particular programs that generate nudes.\u00a0<\/p>\n<p>It\u2019s not certain whether this activity falls within the US legal definition of \u201cfair use\u201d\u2014an issue that\u2019s <a href=\"https:\/\/www.reuters.com\/legal\/government\/ai-copyright-battles-enter-pivotal-year-us-courts-weigh-fair-use-2026-01-05\/\">currently being litigated in several lawsuits<\/a> from other types of content creators\u2014but Casper argues that even if it does, it\u2019s ethically murky for porn created by consenting adults 10 years ago to wind up in those training data sets. When people \u201chave their stuff used in a way that doesn\u2019t respect or reflect reasonable expectations that they had at that time about what they were creating and how it would be used,\u201d he says, there\u2019s \u201ca legitimate sense in which it\u2019s kind of \u2026 nonconsensual.\u201d\u00a0<\/p>\n<p>Adult performers who started working years ago couldn\u2019t possibly have consented to AI anything; Jennifer calls AI-related risks \u201cretroactively placed.\u201d Contracts that porn actors signed before AI, adds Silverstein, might provide that \u201cthe publisher could do anything with the content using technology that now exists or here and after will be discovered.\u201d That felt more innocuous when producers were talking about the shift from VHS to DVD, because that didn\u2019t change the content itself, just the way it was conveyed. 
It\u2019s a far different prospect for someone to use your content to train a program to create <em>new<\/em> content \u2026 content that could replace your work altogether.\u00a0<\/p>\n<p>Of course, this all affects creators\u2019 bottom line\u2014not unlike the way Google\u2019s AI overviews affect revenue for online publishers <a href=\"https:\/\/digiday.com\/media\/google-ai-overviews-linked-to-25-drop-in-publisher-referral-traffic-new-data-shows\/\">who\u2019ve stopped getting clicks<\/a> when people are content with just reading AI-generated summaries. Performers\u2019 \u201cconcern is \u2026 it\u2019s another way to pirate [their] content,\u201d says Rocket.\u00a0<\/p>\n<p>After all, independent creators aren\u2019t just \u201chaving sex on camera,\u201d as the adult content creator Allie Eve Knox says. They\u2019re paying for filming equipment and location rentals, and then spending hours editing and marketing. For someone to then rip off and distort that content \u201cfor their own entertainment or financial gain,\u201d she says, \u201cfucking sucks.\u201d\u00a0<\/p>\n<div class=\"wp-block-image\">\n<figure class=\"wp-block-image size-large\"><img fetchpriority=\"high\" fetchpriority=\"high\" decoding=\"async\" height=\"2000\" width=\"2000\" src=\"https:\/\/wp.technologyreview.com\/wp-content\/uploads\/2026\/05\/Hoeckele-deepfake-02_5e2848.jpg?w=2000\" data-orig-src=\"https:\/\/wp.technologyreview.com\/wp-content\/uploads\/2026\/05\/Hoeckele-deepfake-02_5e2848.jpg?w=2000\" alt=\"\" class=\"lazyload wp-image-1137187\" srcset=\"data:image\/svg+xml,%3Csvg%20xmlns%3D%27http%3A%2F%2Fwww.w3.org%2F2000%2Fsvg%27%20width%3D%272000%27%20height%3D%272000%27%20viewBox%3D%270%200%202000%202000%27%3E%3Crect%20width%3D%272000%27%20height%3D%272000%27%20fill-opacity%3D%220%22%2F%3E%3C%2Fsvg%3E\" data-srcset=\"https:\/\/wp.technologyreview.com\/wp-content\/uploads\/2026\/05\/Hoeckele-deepfake-02_5e2848.jpg 3000w, 
https:\/\/wp.technologyreview.com\/wp-content\/uploads\/2026\/05\/Hoeckele-deepfake-02_5e2848.jpg?resize=150,150 150w, https:\/\/wp.technologyreview.com\/wp-content\/uploads\/2026\/05\/Hoeckele-deepfake-02_5e2848.jpg?resize=300,300 300w, https:\/\/wp.technologyreview.com\/wp-content\/uploads\/2026\/05\/Hoeckele-deepfake-02_5e2848.jpg?resize=768,768 768w, https:\/\/wp.technologyreview.com\/wp-content\/uploads\/2026\/05\/Hoeckele-deepfake-02_5e2848.jpg?resize=2000,2000 2000w, https:\/\/wp.technologyreview.com\/wp-content\/uploads\/2026\/05\/Hoeckele-deepfake-02_5e2848.jpg?resize=1536,1536 1536w, https:\/\/wp.technologyreview.com\/wp-content\/uploads\/2026\/05\/Hoeckele-deepfake-02_5e2848.jpg?resize=2048,2048 2048w\" data-sizes=\"auto\" data-orig-sizes=\"(max-width: 2000px) 100vw, 2000px\"><\/p>\n<div class=\"image-credit\">KIM HOECKELE<\/div>\n<\/figure>\n<\/div>\n<p>Tanya Tate, a longtime adult content creator, tells me about another highly unsettling AI-created situation: She was recently chatting with a fan on Mynx, a sexting app, when he asked her if she knew him. She told him no, and \u201chis eyes just started watering,\u201d Tate says. He was upset because he thought she <em>did<\/em> know him. Turns out he\u2019d sent $20,000 to a scammer who\u2019d used an AI-generated deepfake of Tate to seduce him.\u00a0<\/p>\n<p>Several men, Tate subsequently learned, had been scammed by an AI version of her, and some of them began blaming her for their losses and posting false statements about her online. When she reported one particularly aggressive harasser to the police, they told her he was exercising his \u201cfreedom of speech,\u201d she says. Rocket, too, is familiar with situations where AI is used to take advantage of fans. 
\u201cThe actual content creator will get nasty emails from these people who\u2019ve been scammed,\u201d she says.<\/p>\n<p>Other porn actors say they fear that their likenesses have been used without consent to do other things they wouldn\u2019t do. One, Octavia Red, tells me she doesn\u2019t do anal scenes, \u201cbut I\u2019m sure there\u2019s tons of deepfake anal videos of me that I didn\u2019t consent to.\u201d That could cost her, she fears, if viewers choose to watch those videos instead of subscribing to her websites. And it could cause fans to develop false expectations about what kind of porn she\u2019ll create.<\/p>\n<p>\u201cI saw one AI creator saying, \u2018Well, AI girls will do whatever you want. They don\u2019t say no,\u2019\u201d says Rocket. \u201cThat horrifies me \u2026 especially if they\u2019re training those AI models on real people. I don\u2019t think they understand the damage to mental health or reputation that that can create. And once it\u2019s on the internet, it\u2019s there forever.\u201d\u00a0<\/p>\n<h3 class=\"wp-block-heading\">Efforts to \u201cscrub adult content from the internet\u201d<\/h3>\n<p>As AI technology improves, it\u2019s increasingly difficult for people to discern any type of real video from the best AI-generated ones on their own. In one 2025 <a href=\"https:\/\/www.nature.com\/articles\/s41598-025-94170-3\">study<\/a>, UC Berkeley\u2019s Farid found that participants correctly identified AI-generated voices about 60% of the time (not much better than random chance), while advances like <a href=\"https:\/\/www.ischool.berkeley.edu\/news\/2025\/hany-farid-reflects-new-research-showing-deepfake-makers-can-now-recreate-realistic\">false heartbeats<\/a> make AI-generated humans tougher than ever to spot.<\/p>\n<p>Nevertheless, most lawyers and legal experts I spoke with said copyright laws are still adult performers\u2019 best bet in the US legal system, at least for getting their face-swapped content taken down. 
For his clients, Silverstein says, he tries to figure out the content\u2019s origins and then issue takedown requests under the <a href=\"https:\/\/www.copyright.gov\/dmca\/\">Digital Millennium Copyright Act<\/a>, a 1998 law that adapted copyright law for the internet era. \u201cEven recently, I had a performer who has an insanely well-known tattoo,\u201d he says, and with a DMCA subpoena he managed to identify the poster of the content, who voluntarily removed it.\u00a0<\/p>\n<p>But this way of working is becoming increasingly rare.<\/p>\n<p>These days it\u2019s nearly \u201cimpossible,\u201d Silverstein says, to determine who produced a deepfake, because many platforms that host pirated content operate facelessly. They\u2019re also often based in places that \u201cdon\u2019t really care about US law when it comes to copyrights,\u201d says Rocket\u2014places like Russia, the Seychelles, and the Netherlands.\u00a0<\/p>\n<p>While governments in <a href=\"https:\/\/www.politico.eu\/article\/eu-grok-x-elon-musk-ai-nudification-ban-in-wake-of-scandal\/\">the EU<\/a>, the <a href=\"https:\/\/www.bbc.co.uk\/news\/articles\/cq8dp2y0z7wo\">UK<\/a>, and <a href=\"https:\/\/minister.infrastructure.gov.au\/wells\/media-release\/taking-stand-against-abusive-technology\">Australia<\/a> have said they will <a href=\"https:\/\/www.bbc.com\/news\/articles\/cq8dp2y0z7wo\">ban or restrict access to nudify apps<\/a>, it\u2019s <a href=\"https:\/\/www.macworld.com\/article\/3116379\/report-apple-app-store-fails-to-protect-users-from-nudify-apps.html\">not an easily executed proposition<\/a>. As Craanen notes, when app stores remove these services, they often simply reappear under different names, providing the same services. And social platforms where people share NCII deepfakes, argues Rocket, are slacking in getting them removed. 
\u201cIt\u2019s endless, and it\u2019s ridiculous, because places like Twitter and <a href=\"https:\/\/about.fb.com\/news\/2019\/08\/open-source-photo-video-matching\/#:~:text=Today%2C%20we%20are%20open-sourcing,different%20types%20of%20harmful%20content.\">Facebook have the same technology<\/a> we do,\u201d Rocket says. \u201cThey can identify something as an infringement instantly, but they choose not to.\u201d <\/p>\n<p>(Apple spokesperson Adam Dema emailed, \u201c\u2018nudification\u2019 apps are against our guidelines\u201d in the app store, and it has \u201cproactively rejected many of these apps and removed many others,\u201d flagging a reporting <a href=\"https:\/\/reportaproblem.apple.com\/\">portal<\/a> for users. A Google spokesperson emailed, \u201cGoogle Play <a href=\"https:\/\/support.google.com\/googleplay\/android-developer\/answer\/9878810#:~:text=EXPAND%20ALL-,Sexual%20Content,-and%20Profanity\">does not allow<\/a> apps that contain sexual content,\u201d noting it takes \u201cproactive steps to detect and remove apps with harmful content\u201d and has suspended hundreds of apps for violating its policy. Meta spokesperson <a href=\"https:\/\/about.fb.com\/news\/2025\/06\/taking-action-against-nudify-apps\/\">shared a blog post<\/a> about actions it\u2019s taken against nudify apps, but did not respond to follow-up questions about copyrighted material. X did not respond to a request for comment.)<\/p>\n<p>As porn performers are forced to navigate AI-related threats, the only current federal law to address deepfakes may not help them much\u2014and could even make matters worse. The <a href=\"https:\/\/www.congress.gov\/bill\/119th-congress\/senate-bill\/146\">Take It Down Act<\/a>, which became US law last year, criminalizes publishing NCII and requires websites to remove it within 48 hours. 
But, as Farid notes, people could weaponize the measure by reporting porn that was made legally and with consent, claiming that it\u2019s NCII. This could result in the content\u2019s removal, which would hurt the performers who made it. Santa Clara\u2019s Goldman points to Project 2025, the Heritage Foundation\u2019s policy blueprint for the second Trump administration, which aims <a href=\"https:\/\/static.heritage.org\/project2025\/2025_MandateForLeadership_FULL.pdf\">to wipe porn from the web<\/a>. The Take It Down Act, he argues, \u201callows for the coordinated effort to scrub adult content from the internet.\u201d\u00a0<\/p>\n<p>US lawmakers have a history of hurting sex workers in their attempts to regulate explicit content online. State-level age verification laws are an example; visitors can pretty easily get around these measures, but they can still result in reduced revenue for adult performers (because of <a href=\"https:\/\/19thnews.org\/2025\/09\/age-verification-queer-adult-industry-workers\/\">lower traffic<\/a> to those sites and the high price of age-checking services they have to purchase).\u00a0<\/p>\n<p>\u201cThey\u2019re always doing something to fuck with the porn industry, but not in a way that actually helps sex workers,\u201d says Jennifer. \u201cIf they do something, they\u2019re taking away your income again\u2014as opposed to something like giving you more rights to your image, [which] would be tremendously helpful.\u201d\u00a0<\/p>\n<p>But as generative AI plays an increasingly large role in NCII deepfakes, the types of images to which adult performers have rights move deeper into a gray area. Can actors lay claim to AI images likely trained on their bodies? 
How about AI-generated videos that impersonate them, like the one that tricked Tanya Tate\u2019s fan?<\/p>\n<p>The biggest challenge will be creating \u201clegitimate, effective laws that will absolutely protect content creators from abusing their likeness to train and create AI,\u201d Rocket says. \u201cAbsent that, we\u2019re just going to have to keep pulling content down from the internet that\u2019s fake.\u201d<\/p>\n<p>In the meantime, a few porn actors tell me, they\u2019re trying to take advantage of copyright laws that weren\u2019t really made for them; they\u2019ve signed with platforms that host their AI-generated duplicates, with whom fans pay to chat, in part so they\u2019ll have contracts that protect ownership of their AI likenesses. When I spoke with the actor Kiki Daire in September 2025 for <a href=\"https:\/\/www.pcmag.com\/articles\/the-ai-adult-chatbot-boom-came-early-and-finished-fast\">a story on adult creators\u2019 \u201cAI twins<\/a>,\u201d she said she \u201cown[ed] her AI\u201d because she\u2019d signed a contract with Spicey AI, a site that hosted AI duplicates of adult performers. If another company or person created her AI-generated likeness, she added, \u201cI have a leg to stand on, as far as being able to shut that down.\u201d\u00a0\u00a0<\/p>\n<p>Even this, though, is not a sure thing; Spicey AI, for instance, shut down several months after I spoke with Daire, so it\u2019s unlikely that her contract would hold. 
And when I spoke in October with Rachael Cavalli, another adult actor who had signed with an AI duplicate site in hopes it\u2019d help protect her AI image, she admitted, \u201cI don\u2019t have time to sit around and look for companies that have used my image or turned something into a video that I didn\u2019t actually do \u2026 it\u2019s a lot of work.\u201d In other words, having rights to your AI image on paper doesn\u2019t make it easier to track down all the potentially infinite breaches of those rights online.<\/p>\n<p>If she\u2019d known what she knows about technology today, Jennifer says she doesn\u2019t think she would have done porn. The risks have increased too much, and too unpredictably. She now does in-person sex work; it\u2019s \u201cnot necessarily safer,\u201d she says, \u201cbut it\u2019s a different risk profile that I feel more equipped to manage.\u201d\u00a0<\/p>\n<p>Plus, she figures AI is unlikely to replace in-person sex workers the way it could porn actors: \u201cI don\u2019t think there\u2019s going to be stripper robots.\u201d\u00a0<\/p>\n<p><em>Jessica Klein is a Philadelphia-based freelance journalist covering intimate partner violence, cryptocurrency, and other topics.<\/em><\/p>\n<\/div>\n","protected":false},"excerpt":{"rendered":"<p>When Jennifer got a job doing research for a nonprofit  
[&#8230;]<\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"content-type":"","footnotes":""},"categories":[226],"tags":[],"class_list":["post-22869","post","type-post","status-publish","format-standard","hentry","category-technology"],"acf":[],"_links":{"self":[{"href":"https:\/\/ideainthebox.com\/index.php\/wp-json\/wp\/v2\/posts\/22869","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/ideainthebox.com\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/ideainthebox.com\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/ideainthebox.com\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/ideainthebox.com\/index.php\/wp-json\/wp\/v2\/comments?post=22869"}],"version-history":[{"count":0,"href":"https:\/\/ideainthebox.com\/index.php\/wp-json\/wp\/v2\/posts\/22869\/revisions"}],"wp:attachment":[{"href":"https:\/\/ideainthebox.com\/index.php\/wp-json\/wp\/v2\/media?parent=22869"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/ideainthebox.com\/index.php\/wp-json\/wp\/v2\/categories?post=22869"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/ideainthebox.com\/index.php\/wp-json\/wp\/v2\/tags?post=22869"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}