AI-generated pornography is fuelling an unprecedented wave of sexual abuse, with Elon Musk’s chatbot, Grok, having been accused of generating child sexual abuse material (CSAM).
Sitting in a Year 9 classroom in North London, Layla*, 14, told The Londoners that last month a boy in her class used generative AI to create nude images of her.
“When I saw it [the photos], I felt physically sick.
“I cannot describe the feeling, it’s beyond humiliating.
“We reported it to the police, but those images still exist. And I don’t think they’ll ever truly disappear.”
Layla’s experience is not an isolated incident. Globally, the creation of CSAM has rocketed, with the US-based National Center for Missing & Exploited Children reporting a 9,000% rise in AI-generated CSAM since 2023.
Whilst new measures are being introduced in the UK to criminalise AI models optimised to produce CSAM, legislation continues to lag behind the rapid development of such technology.
A key challenge for policy-makers, however, lies in defining which models have been optimised to produce adult content.
Whilst some tools, such as Undress AI and DeepSwap, explicitly cater to digital pornography creation, no clear-cut line has yet been drawn.
With the makers of large language models (LLMs) like ChatGPT having announced plans to launch an ‘adult mode’ with the capacity to generate erotica later this year, delineating the boundaries of what makes an AI tool ‘optimised’ for pornographic content continues to prove a near impossible task.
Earlier this week, Grok, the brainchild of Elon Musk, issued an apology for producing a sexualised image of two adolescent girls. It read: “I deeply regret an incident on Dec 28, 2025, where I generated and shared an AI image of two young girls (estimated ages 12-16) in sexualised attire based on a user’s prompt.”
Despite the apology, a search for @grok on X surfaced trending prompts including ‘replace her clothes with clear plastic wrap’, ‘cover her in donut glaze’ and ‘put her in a bikini and make her bend over’.
The images Grok produced in response to these prompts were all based on photographs of real women supplied by the users making the requests, and rarely appear to have been consensual.
Evie*, 46, a mother-of-two from Hammersmith, said her ex-partner created AI-generated sexualised images of her daughter, Hannah*, 16, and shared them with family and colleagues on X last year.
Hannah explained: “It’s so disturbing because those photographs have my face, my exact likeness, but it’s not me in them.
“But who’s going to believe me?
“Most people I knew didn’t. And it was heart-breaking. I lost so much – all over a picture that wasn’t even really me.”
The latest available data from the UK’s Revenge Porn Helpline recorded over 20,000 reports of intimate image abuse, with a growing number involving deepfakes.
However, prosecuting the use of AI to create such images continues to be a complex legal battle in the UK.
Ofcom has declared it illegal to ‘create or share non-consensual intimate images or child sexual abuse material’, and has confirmed this includes sexual deepfakes created with AI.
However, as Ofcom cannot regulate the internal operations of private companies or platforms based overseas, little action can be taken.
Samantha*, 39, a civil litigator in London whose own images were digitally manipulated by her ex-partner to resemble ‘revenge porn’, said: “In cases like these, Ofcom has almost no power, because it can only try to control public distribution, not private uses of AI.
“Instead, the responsibilities for creating proper safeguards are delegated to the platforms which allow this to happen in the first place.
“It’s a massive policy failure, and I can confidently call it a failure because we’re watching legal cases on AI porn – particularly depicting children – skyrocket.
“A few years ago, I’d never have even thought such a thing could be such a common issue.
“It’s just not good enough – something has to be done.”
xAI was approached for comment.
*Names have been changed to protect privacy.
Featured image credit: Salvador Rios via Unsplash