The Home Office data reveals that this is part of a larger problem across England and Wales, with 38,685 such offences recorded in 2023/24, averaging more than 100 every day.
The NSPCC has raised concerns about the use of private messaging platforms by criminals to share child sexual abuse material.
A separate Freedom of Information request by the NSPCC found that half of these crimes took place on Snapchat, with Meta products accounting for a quarter – 11 per cent on Instagram, 7 per cent on Facebook, and 6 per cent on WhatsApp.
In response to this data, the NSPCC, along with other charities including the Marie Collins Foundation, Lucy Faithfull Foundation, Centre of Expertise on Child Sexual Abuse, and Barnardo’s, has sent a joint letter to Home Secretary Yvette Cooper and Secretary of State for Science, Innovation, and Technology Peter Kyle.
The letter expresses concern over Ofcom’s final Illegal Harms Code of Practice, published in December 2024.
The charities argue that the current code does not adequately protect children from abuse on private messaging services, despite this being a core aim of the Online Safety Act.
Ofcom has stated that user-to-user services are only required to remove illegal content where it is ‘technically feasible’, which the charities believe creates a loophole allowing some services to avoid providing basic protections for children.
The NSPCC is calling on the UK Government to push Ofcom to review and strengthen its codes of practice to tackle this threat to children’s safety online.
The charity is also urging private messaging services, including those using end-to-end encryption, to ensure robust safeguards are in place to prevent their platforms from becoming ‘safe havens’ for perpetrators of child sexual abuse.
End-to-end encryption is a secure communication system in which only the communicating users can read the messages, meaning service providers can be unaware of child sexual abuse material being shared through their platforms.
Childline has provided further evidence of how young people are being targeted or blackmailed to share child abuse images via private messaging apps.
In 2023/24, Childline delivered 903 counselling sessions to children and young people relating to blackmail or threats to expose or share sexual images online, a 7 per cent increase compared to 2022/23.
One 13-year-old girl said: “I sent nude pics and videos to a stranger I met on Snapchat.
“I think he’s in his thirties.
“I don’t know what to do next.
“I told him I didn’t want to send him any more pictures and he started threatening me, telling me that he’ll post the pictures online.
“I’m feeling really angry with myself and lonely.
“I would like support from my friends, but I don’t want to talk to them about it as I’m worried about being judged.”
Chris Sherwood, NSPCC Chief Executive, said: “It is deeply alarming to see thousands of child sexual abuse image crimes continue to be recorded by the Metropolitan Police in London.
“These offences cause tremendous harm and distress to children, with much of this illegal material being repeatedly shared and viewed online.
“It is an outrage that in 2025 we are still seeing a blatant disregard from tech companies to prevent this illegal content from proliferating on their sites.
“Having separate rules for private messaging services lets tech bosses off the hook from putting robust protections for children in place.
“This enables crimes to continue to flourish on their platforms even though we now have the Online Safety Act.
“The Government must set out how they will take a bold stand against abuse on private messaging services and hold tech companies accountable for keeping children safe, even if it requires changes to the platform’s design – there can be no excuse for inaction or delay.”