Tech firms should provide phones that prevent children from being able to take nude photographs of themselves, the Metropolitan Police’s deputy commissioner said.
Matt Jukes, the former head of UK counterterrorism policing, has called for the tech industry to take control of children’s online safety, rather than leaving it in the hands of individual platforms.
He believes phone firms can go “further and faster” to protect children from the increasing number of nude images of young people online, which he said were often taken on the children’s own phones.
Police recorded 35,388 indecent images of children in 2024, accounting for 29 per cent of all child sexual abuse and exploitation offences.
“It’s a sad reality that increasing amounts of indecent imagery of children online result from images taken on their own devices, often through exploitation,” Mr Jukes told The Times.
“I think every parent wants to feel they’re doing the best by their children and providing phones as a lifeline and a means to keep safe.”
He added: “There’s real potential in having devices that give parents and carers greater confidence to control the images taken, and reduce the amount of self-generated imagery entering messaging apps and online spaces … we should be looking at every level of opportunity to disrupt all of these harms.”
Mr Jukes told The Times that the Met Police was considering using AI to improve the process of grading child sexual abuse images, which are labelled as category A, B, or C, with A being the most serious kind.
He said it could “significantly reduce the amount of time officers and staff are exposed to the most distressing material, while ensuring that human judgement, strong oversight and victim care remain at the heart of every investigation”.
Figures published earlier this month showed that a record number of British children had reported being victims of sextortion, in which they were blackmailed over sexual images of themselves online.
The Report Remove helpline, which allows children to report nude or sexual imagery of themselves that is circulating on the internet, received 1,894 notifications last year. The charity behind the service confirmed the presence of child sexual abuse imagery in 1,175 of those reports, an 83 per cent rise on 2024.
Hannah Swirsky, the head of policy at the Internet Watch Foundation (IWF), said: “Survivors have spoken about the fear they have that this imagery will continue to circulate, and the lack of control they have once that imagery is out there.”
Mumsnet, an online forum that has advocated for improved online safety features for children, launched its own smartphone, The Other Phone, last year.
The group said it wanted to help parents take control of their children’s safety on the internet and protect them from harmful and dangerous content.
The phone has built-in software which monitors and filters what children see and scans for harmful content or inappropriate images.
Justine Roberts, the group’s founder, said: “Parents feel trapped between wanting their kids to stay connected and knowing most devices are designed with profit, not child safety, in mind.”

