Child sex abuse image crimes logged by UK police forces have increased by almost nine per cent over the past year, sparking renewed calls for technology companies to take decisive action in blocking the capture and sharing of nude images on children’s devices.
The NSPCC warned that young people continue to face significant exposure to the risks of grooming, extortion, online abuse, and the non-consensual sharing of intimate images.
The charity’s recent research underscores the persistent threat.
Between April 1 2024 and March 31 2025, a total of 36,829 offences involving indecent and prohibited images of children were recorded across the UK, a rise of almost nine per cent on the 33,886 offences documented the previous year.
The figures were gathered from responses by 42 of the 45 UK police forces to a Freedom of Information request.
The government’s strategy to tackle Violence Against Women and Girls (VAWG), published in December, stated an aim to “make it impossible for children in the UK to take, share or view a nude image” and said it was “working constructively with companies to make this a reality”.
But the NSPCC said this must be made mandatory, urging the Government to take action against tech companies that fail to embed existing technology on children’s phones to block nude images from being created, shared or viewed.
The charity said these “device‑level protections” should be embedded by default, meaning children are automatically protected and adult users could go through a process to opt out.
Such technology can block a nude image from being taken, sent or received on a device, and the NSPCC said that because the image is never created or sent in the first place, there is nothing to encrypt, meaning the method can stop abuse at source.
The NSPCC said that of the 10,811 crimes where police forces recorded the platform used by perpetrators, 43 per cent (4,615 offences) took place on Snapchat.
Overall, Meta platforms accounted for almost a quarter of all offences (24 per cent), with 8 per cent on Instagram, 7 per cent on WhatsApp, 5 per cent on Facebook and 4 per cent on Messenger, the charity said.
But the NSPCC said that because of end-to-end encryption, the true scale of abuse children are experiencing online remains “hidden”.
NSPCC chief executive Chris Sherwood said: “Children across the UK are being completely failed by tech companies that should be protecting them online. We cannot keep letting them off the hook when they can do more to prevent this from happening in the first place.”
He added: “Technology already exists that could be deployed today to stop children from taking, sharing or receiving nude images. So, the real question is: what’s stopping them? If they continue to drag their feet, government must show their might by stepping in and compelling them to act.”
Kerry Smith, chief executive of the Internet Watch Foundation, said the data “should be yet another wake-up call”, adding: “Mandatory introduction of on-device protections will protect children from unsolicited nude imagery, and from being coerced into sending sexually explicit material.
“We must see these measures applied across the board.”
Safeguarding minister Jess Phillips said the data uncovered by the NSPCC was “nothing short of deeply shocking”.
She added: “Predators cannot continue like this – unstopped and unchecked. We plan to stop them.
“We have committed to making it impossible for children in the UK to take, share or view nude images, and have already announced a ban on so‑called ‘nudification’ apps to stop abusive images being created and spread in the first place.
“We will not hesitate to go further until our children are safe from sexual abuse online.”
Earlier this year it was announced that nudification apps would be criminalised as part of the Crime and Policing Bill, which is currently going through Parliament.
The data comes after two watchdogs last week warned big tech firms they must do more to protect young people online.
Communications regulator Ofcom wrote to Facebook, Instagram, Snapchat and others, giving them until the end of April to explain what actions they are taking on age checks and grooming protections.
Alongside Ofcom’s demands, the Information Commissioner’s Office (ICO) also wrote to Snapchat, Facebook, Instagram and others asking them to set out how their age assurance policies keep children safe.
The NSPCC said the Police Service of Northern Ireland and Police Scotland were included in the data, while Gloucestershire, Hampshire and Thames Valley were the forces that did not respond.