UK Times

NHS urges young people not to use ‘harmful’ AI chatbots as therapist – UK Times

By uk-times.com | 5 September 2025 | 3 Mins Read


The NHS has urged young people to stop using AI chatbots as a substitute for therapy, warning that they can provide “harmful and dangerous” mental health advice.

Millions are turning to artificial intelligence for support with anxiety, depression, and other mental health concerns, often using chatbots daily to request coping strategies or seek emotional reassurance.

But NHS leaders have said the rise in so-called “AI therapy” is a worrying trend, particularly among teenagers and young adults, with experts warning that these tools are not equipped to handle serious mental health conditions and could worsen symptoms.

“We are hearing some alarming reports of AI chatbots giving potentially harmful and dangerous advice to people seeking mental health treatment, particularly among teens and younger adults,” Claire Murdoch, NHS England’s national mental health director, told The Times.

She said AI platforms should “not be relied upon” for sound mental health advice and “should never replace trusted sources” of information from registered therapists.

“The information provided by these chatbots can be hit and miss, with AI known to make mistakes,” she added, noting that chatbots cannot take into account body language or visual cues to better understand a patient’s state.

She urged people not to “roll the dice” with what support they seek for their mental health, saying patients should only use “digital tools that are proven to be clinically safe and effective”.

In the wake of the coronavirus pandemic, demand for therapy is high, particularly among young people. More than 1.2 million people in England began NHS therapy for depression and anxiety last year alone.

But with appointments with therapists often difficult to secure, researchers have found more than 17 million TikTok posts about using ChatGPT as a substitute.

A YouGov poll also found that nearly a third (31 per cent) of 18 to 24-year-olds in the UK said they would be comfortable discussing mental health issues with an AI chatbot instead of a human therapist.

But users have reported that AI responses often validate negative or delusional thoughts, reinforcing them instead of offering constructive guidance.

One of the major concerns among clinicians is that chatbots are unable to challenge distorted thinking or harmful behaviours in the way a trained therapist would.

Experts warn that replacing real-life human interaction with screen time may further isolate people and deepen feelings of loneliness, a known risk factor for worsening mental health.

NHS England is continuing to develop its own AI and digital tools, such as Beating the Blues, an online cognitive behavioural therapy programme, but it points out that these are evidence-based and clinically approved, unlike ChatGPT.

In August, OpenAI CEO Sam Altman acknowledged the issue, saying: “If a user is in a mentally fragile state and prone to delusion, we do not want the AI to reinforce that.” He also admitted the company was aware that some people were using the tool in “self-destructive ways”.

In an article published on OpenAI’s website last month, entitled ‘Helping people when they need it most’, the company said it was “continuing to improve how our models recognize and respond to signs of mental and emotional distress and connect people with care, guided by expert input”.

OpenAI has been approached by The Independent for comment.

© 2025 UK Times. All Rights Reserved.