A 60-year-old man landed in the hospital after asking ChatGPT how to remove sodium chloride from his diet.
As humans interact more with artificial intelligence, stories continue to emerge of how a conversation with a chatbot can be dangerous, sometimes even deadly.
While part of the focus has been on mental health and concerns that chatbots are not equipped to handle these types of struggles, there are also implications for people’s physical health.
People often hear that you shouldn’t Google your symptoms, since medical advice should come from a health professional who knows your medical history and can actually examine you.
According to a new case report published Tuesday in the American College of Physicians Journals, you should also be careful about asking a chatbot health questions.
The report looked at a man who developed bromism after asking ChatGPT for advice on his diet.
Bromism, or bromide toxicity, was well known in the early 1900s but is far less common now. At the time, bromide salts were found in many over-the-counter medications used to treat insomnia, hysteria and anxiety. Ingesting too much bromide can cause neuropsychiatric and dermatologic symptoms.
The man in this case report had no prior psychiatric or medical history, but during the first 24 hours of his hospitalization he experienced increasing paranoia and auditory and visual hallucinations.
“He was noted to be very thirsty but paranoid about water he was offered,” the case report read.
The man was treated with fluids and electrolytes and became medically stable, allowing him to be admitted to the hospital’s inpatient psychiatry unit.
As his condition improved, he was able to describe other symptoms he had noticed, including new facial acne and cherry angiomas, which further suggested he was experiencing bromism.
He also said he had been swapping sodium chloride, or table salt, for sodium bromide for three months after reading about the negative health effects of table salt.
“Inspired by his history of studying nutrition in college, he decided to conduct a personal experiment to eliminate chloride from his diet,” the case report read.
He had replaced table salt with “sodium bromide obtained from the internet after consultation with ChatGPT, in which he had read that chloride can be swapped with bromide, though likely for other purposes, such as cleaning.”
The man spent three weeks in the hospital before he was well enough to be discharged.
“It is important to consider that ChatGPT and other AI systems can generate scientific inaccuracies, lack the ability to critically discuss results, and ultimately fuel the spread of misinformation,” the authors of the report warned.
OpenAI, the developer of ChatGPT, acknowledges in its Terms of Use that the chatbot’s output “may not always be accurate.”
“You should not rely on Output from our Services as a sole source of truth or factual information, or as a substitute for professional advice,” the Terms of Use say.
The company’s Service Terms also explicitly state: “Our Services are not intended for use in the diagnosis or treatment of any health condition.”