UK Times
News

A man asked ChatGPT how to remove sodium chloride from his diet. It landed him in the hospital – UK Times

By uk-times.com · 7 August 2025 · 3 min read

A 60-year-old man landed in the hospital after asking ChatGPT how to remove sodium chloride from his diet.

As humans interact more with artificial intelligence, stories continue to emerge of how a conversation with a chatbot can be dangerous, sometimes even deadly.

While part of the focus has been on mental health and concerns that chatbots are not equipped to handle these types of struggles, there are also implications for people’s physical health.

People are often told not to Google their symptoms, as medical advice should come from a health professional who knows their medical history and can actually examine them.

According to a new case report published in the American College of Physicians Journals on Tuesday, you should also be careful when considering asking a chatbot health questions.

A 60-year-old man landed in the hospital after asking ChatGPT how to remove sodium chloride from his diet (Kirill Kudryavtsev/AFP via Getty Images)

The report looked at a man who developed bromism after asking ChatGPT for advice on his diet.

Bromism, or bromide toxicity, was well known in the early 1900s but is far less common now. At the time, bromide salts were found in many over-the-counter medications used to treat insomnia, hysteria and anxiety. Ingesting too much bromide can cause neuropsychiatric and dermatologic symptoms.

The man in this case report had no past psychiatric or medical history, but during the first 24 hours of his hospitalization, he expressed increased paranoia and auditory and visual hallucinations.

“He was noted to be very thirsty but paranoid about water he was offered,” the case report read.

The man was treated with fluids and electrolytes and became medically stable, allowing him to be admitted to the hospital’s inpatient psychiatry unit.

As his condition improved, he was able to share some symptoms he had noticed, including new facial acne and cherry angiomas, which further suggested he was experiencing bromism.

He also said he had been swapping sodium chloride, or table salt, for sodium bromide for three months after reading about the negative health effects of table salt.

“Inspired by his history of studying nutrition in college, he decided to conduct a personal experiment to eliminate chloride from his diet,” the case report read.

The man had been swapping sodium chloride, or table salt, for sodium bromide for three months (PA Archive)

He had replaced table salt with “sodium bromide obtained from the internet after consultation with ChatGPT, in which he had read that chloride can be swapped with bromide, though likely for other purposes, such as cleaning.”

The man spent three weeks in the hospital before he was well enough to be discharged.

“It is important to consider that ChatGPT and other AI systems can generate scientific inaccuracies, lack the ability to critically discuss results, and ultimately fuel the spread of misinformation,” the authors of the report warned.

OpenAI, the developer of ChatGPT, acknowledges in its Terms of Use that the chatbot’s output “may not always be accurate.”

“You should not rely on Output from our Services as a sole source of truth or factual information, or as a substitute for professional advice,” the Terms of Use say.

The company’s Service Terms also explicitly state: “Our Services are not intended for use in the diagnosis or treatment of any health condition.”


© 2025 UK Times. All Rights Reserved.