Reed News

AI child abuse imagery surged 14% in 2025, IWF says

Key Points
  • AI-generated child sexual abuse imagery surged 14% in 2025, with over 8,000 AI-generated images and videos identified by the IWF.
  • Websites hosting child sexual abuse material doubled to 15,031, and sextortion cases rose to 397.
  • Perpetrators are sharing custom AI models on the dark web; one creator was thanked over 3,000 times for a 30-minute AI-generated abuse video.

The Internet Watch Foundation (IWF) reported a 14% surge in AI-generated child sexual abuse imagery in 2025, identifying more than 8,000 AI-generated images and videos from user reports. Over 3,400 of these were hyper-realistic full-motion videos, and more than 65% depicted the most severe forms of abuse, including rape, sexual torture, and bestiality. By contrast, only 43% of non-AI-generated sexual content fell under the most extreme categories.

Kerry Smith, CEO of the IWF, said: "We now face a technological landscape that can generate infinite violations with unprecedented ease." An IWF report noted that "single applications can now generate abusive imagery with minimal effort, removing the need for technical expertise and significantly lowering barriers to entry."

The number of websites hosting child sexual abuse material doubled in a year, from 7,028 in 2024 to 15,031 in 2025, according to the IWF. The organization identified and digitally marked 317,101 images of child abuse in 2025. Around 16% of the sites (2,458) were disguised to look legitimate or inactive but had hidden access for paedophiles.

Sextortion cases also rose sharply, with the IWF recording 397 cases in 2025, up from 175 in 2024. Perpetrators are developing and sharing custom AI models and databases on the dark web. One creator was thanked over 3,000 times for creating a 30-minute AI-generated sexual abuse video.

Smith added: "It is clear criminals are exploiting systemic failures and are finding it far too easy to reap huge profits from children's sexual exploitation. At every stage, we need to disrupt this system. It is an industry." She called for mandatory measures requiring financial services to proactively detect, take down, and report digital payment links used to sell child sexual abuse material, and for companies using end-to-end encryption to adopt established safety tools to prevent criminals from using their platforms as safe havens.

The IWF cautioned that its data provides only a partial view, and the actual amount of AI-generated CSAM is likely significantly greater. The total number of AI-generated items online beyond what the IWF has identified remains unknown, as does the number of children victimized in the content. The effectiveness of current detection tools in distinguishing AI-generated from real CSAM is also unclear.

Recent convictions highlight the ongoing threat. Joao-Carlos Jardim Dos Santos Teixeira was jailed for 11 years and 4 months for running online child abuse chat groups, and William Yates was jailed for 5 years and 4 months for operating a membership-only forum for child sexual abuse.

Chris Sherwood, CEO of the NSPCC, said: "The growing number of commercial child sexual abuse sites uncovered by the Internet Watch Foundation lays bare a severe problem."
