UK Times

Keeping children safe online changes to the Online Safety Act explained

By uk-times.com | 1 August 2025

Keeping children safe

The way children experience the internet has fundamentally changed, as new laws under the Online Safety Act come into force to protect under-18s from harmful online content. This includes content relating to:

  • pornography
  • self-harm
  • suicide
  • eating disorders

Ofcom figures show that children as young as 8 have accessed pornography online, while 16% of teenagers have seen material that stigmatises body types or promotes disordered eating in the last 4 weeks.   

To protect the next generation from the devastating impact of this content, people now have to prove their age to access pornography or this other harmful material on social media and other sites.    

Platforms are required to use secure methods such as facial scans, photo ID and credit card checks to verify the age of their users. This means it will be much harder for under-18s to accidentally or intentionally access harmful content.

Ofcom’s codes are clear that platforms must ensure strangers have no way of messaging children. This includes preventing children from receiving direct messages from strangers and ensuring children are not recommended accounts to connect with.

Data privacy

While people might see more steps to prove their age when signing up or browsing age-restricted content, they won’t be compromising their privacy.    

The measures platforms have to put in place must confirm your age without collecting or storing personal data, unless absolutely necessary. For example, facial estimation tools can estimate your age from an image without saving that image or identifying who you are. Many third-party solutions can confirm to a platform that a user is over 18 without sharing any additional data about the user’s identity.

The government and the regulator, Ofcom, are clear that platforms must use safe, proportionate and secure methods, and any company that misuses personal data or fails to protect users could face heavy penalties.

Services must also comply with the UK’s data protection laws. The Information Commissioner’s Office (ICO) has set out the main data protection principles that services must take into account in the context of age assurance, including minimising personal data which is collected for these purposes.  

Virtual Private Networks

While Virtual Private Networks (VPNs) are legal in the UK, the law gives platforms a clear responsibility to prevent children from bypassing safety protections. This includes blocking content that promotes VPNs or other workarounds specifically aimed at young users.

This means that where platforms deliberately target UK children and promote VPN use, they could face enforcement action, including significant financial penalties.  

The Age Verification Providers Association (AVPA) reports an additional 5 million age checks per day as UK-based internet users seek to access age-restricted sites.

Legal adult content

Online Safety laws do not ban any legal adult content. Instead, the laws protect children from viewing material that causes real harm in the offline world, devastating young lives and families.    

Under the Act, platforms should not arbitrarily block or remove content and instead must take a risk-based, proportionate approach to child safety duties.

Technology Secretary Peter Kyle said:

This marks the most significant step forward in child safety since the internet was created.

The reality is that most children aren’t actively seeking out harmful, dangerous, or pornographic content – unfortunately it finds them. That’s why we’ve taken decisive action.

Age verification keeps children safe. Rather than looking for ways around it, let’s help make the internet a safer, more positive space for children – and a better experience for everyone. That’s something we should all aspire to.

Support for the Online Safety Act

NSPCC Chief Executive Chris Sherwood said:

We regularly hear from children who have suffered sexual and emotional abuse online, or who have been exposed to harmful and dangerous content.

These experiences can have devastating impacts both immediately and long into the future. While the Online Safety Act can’t erase this pain and anger, it can be a vehicle for significant and lasting change.

Thanks to this piece of ground-breaking regulation, algorithms are now being redesigned. Age checks are now in place. Harmful material that promotes eating disorders and suicide should no longer proliferate on social media platforms.

This will – without a doubt – create safer, more age-appropriate online experiences for young users across the UK.

Barnardo’s CEO Lynne Perry said:

These new protections are an important stepping stone towards making sure that children are safer online. They must be robustly enforced.

Internet Matters said:

Today marks an important milestone for children’s online safety […] towards ensuring that online services are designed with children’s safety in mind – from limiting children’s exposure to harmful content to creating age-appropriate experiences. 

This milestone matters because the risks children face online remain high. Our latest Internet Matters Pulse shows that 3 in 4 children aged 9-17 experience harm online, from exposure to violent content to unwanted contact from strangers. With the Codes now enforceable, Ofcom must hold platforms accountable for meeting their obligations under the law.




© 2025 UK Times. All Rights Reserved.