Youth Justice Board data indicates that the number of children committing sexual offences is rising, with proven offences up 6% in 2025, the third consecutive year of increase. According to HuffPost UK, Jessica Wilson, managing director at Eventum Legal, described this data as painting a troubling picture about today’s youth. The specific factors driving the increase remain unclear, as do the numbers of children arrested or cautioned for these offences in 2025.
In the year ending March 2025, 54% of proven sexual offences, around 800 cases, related to indecent or extreme pornographic images or videos, including the creation and sharing of such material. It is illegal to create, possess, or share sexual images of anyone under 18, even when the person sharing is also a child, and the same applies to using AI to create, possess, or distribute indecent images of children.
The percentage of these offences involving AI-generated content has not been confirmed. Internet Matters research found that almost half of teenagers aged 13-16 have heard about abuse of another young person’s sexual image. According to HuffPost UK, Ghislaine Bombusa, content and digital director at Internet Matters, described child sexual abuse material as including any content that shows or causes the sexual abuse or exploitation of a child, such as sexual images a young person creates of themselves, which can spread uncontrollably and cause long-lasting harm.
A staggering 99% of nude deepfakes feature women and girls, highlighting a disproportionate impact. According to HuffPost UK, Jessica Wilson also noted that most young people do not understand that sharing explicit content of peers is a sexual offence, often leading to serious consequences. The data does not specify what legal actions or interventions are being taken to address the trend, nor does it offer comparisons with adult sexual offence statistics.