This year’s 11% spike in sexual offences is primarily due to image-based abuse and cyberflashing, according to the latest figures from the ONS.
Police have recorded over 13,000 cases of revenge porn and digital sexual abuse since 2024, but experts warn that the true scale of the problem is being obscured by laws that have failed to keep pace with the technology.
Cyberflashing and revenge porn were criminalised under the Online Safety Act 2023.
Services such as the Revenge Porn Hotline provide practical support for victims of digital image abuse, including reporting non-consensual content to platforms and carrying out further searches.
Jessica Yelland, a revenge porn practitioner at the hotline, feels the law is not up to date with the current technology.
While it is illegal to share, or threaten to share, intimate images without consent, nothing prevents the creation of such images using AI.
She said: “We have had a few cases where someone has said they’ve found thousands of AI-generated images of them on a friend’s computer, but it wouldn’t fall under intimate image abuse, which is very scary.
“While it may fall under harassment, there is little we can do in these situations.”
The hotline has a 90% removal rate. When a victim comes to them with a link, they contact the website or host on their behalf, explaining that the law has been broken and requesting removal.
Large pornography sites like Pornhub have been widely criticised for being a hotbed of revenge porn, underage pornography and sex trafficking.
Part of the issue lies in the fact that content can be directly downloaded and reuploaded to the site, following victims, even if the content is initially removed.
A 2022 lawsuit over abuse claims led Visa and Mastercard to withdraw from processing payments for Pornhub's advertising arm.
This came after an investigation by the New York Times found that the website had for years been hosting non-consensual videos, including those featuring children.
An investigation in 2020 found that the site employed only 80 content moderators, compared to Facebook's 15,000.
However, Yelland stressed that it is not mainstream pornography websites that pose the greatest threat.
The greater issue is collector culture: users who trade pictures with each other on forums or on smaller, foreign sites, sometimes purpose-built for the material. In these cases, there is no incentive to remove content.
She said: “Of course, image-based abuse does not exist in a vacuum; it goes hand in hand with domestic abuse, honour-based violence and other forms of sexual abuse, which disproportionately affect women and girls.
“A lot of sexual abuse is recorded without consent, so it’s all intertwined.”
Rebecca Hitchen from End Violence Against Women pointed out the justice system’s poor track record on other forms of violence against women and girls.
She questioned the extent to which the police, CPS and courts have the preparation and resources to respond to these forms of online offending.
The organisation is calling on the government to introduce civil redress for survivors: a system by which a court can order the removal and deletion of abusive content and award damages.
Yesterday, the government published its long-awaited strategy on violence against women and girls (VAWG), the first of its kind under the Labour government.
Backed by £1 billion in funding, one of its key highlights is an innovative approach to prevention, including the introduction of a new curriculum that tackles misogyny in schools.
This will include sending experts to secondary schools to educate children about consent and the dangers of sharing explicit images online.
The strategy also includes the banning of nudification apps.
The Department for Science, Innovation and Technology (DSIT) has been approached for comment.
Featured image credit: Towfiqu Barbhuiya via Unsplash