Man’s Shocking Medical Mishap After Trusting AI for Life-Saving Advice Raises Urgent Questions
Ever wondered if asking an AI chatbot about your health could land you more trouble than relief? Well, turns out the internet’s golden rule of “don’t Google your symptoms” might just have a high-tech cousin: “don’t trust ChatGPT for medical advice.” Take the case of a 60-year-old fellow who, aiming to cut down on salt, replaced good old sodium chloride with bromide on ChatGPT’s recommendation—only to find himself battling paranoia, hallucinations, and a hospital stay. It’s a wild reminder that while AI can seem like a genius sidekick, it’s not quite the doctor we wish it was. So before you swap your table salt for anything your chatbot suggests, maybe give your physician a call instead.
They say the worst thing you can do is Google your symptoms when you’re unwell, but turning to ChatGPT for medical advice could also have some pretty dire consequences.
A 60-year-old man discovered this the hard way when he ended up in hospital after poisoning himself on the AI chatbot’s advice.
The man, whose case is detailed in the American College of Physicians Journals, was concerned about the amount of salt in his diet and the negative impact it could be having on his health, so he decided to consult ChatGPT about cutting out sodium chloride.
The AI bot suggested he start consuming bromide instead, which can be found in small amounts in seawater and in certain minerals. It was previously used as an ingredient in a number of pharmaceutical products; however, it has since been discovered to be toxic to humans in larger quantities.

The man replaced sodium chloride with sodium bromide on the advice of ChatGPT (Getty Stock Images)
Unaware of this, the man began replacing salt with bromide he ordered from the internet, and after about three months he started experiencing severe paranoia and hallucinations, which led to him being hospitalised.
The man, who had no previous history of poor mental or physical health, initially suspected his neighbour of poisoning him. However, after being treated with fluids and electrolytes, he reported other symptoms, including new acne and cherry angiomas, leading doctors to conclude he was experiencing bromism.
Bromism, which is caused by excessive exposure to bromide, can cause neurological symptoms like seizures, tremors, confusion and even comas. It can also cause anxiety, depression, psychosis, fatigue and anorexia, among other symptoms.
“Inspired by his history of studying nutrition in college, he decided to conduct a personal experiment to eliminate chloride from his diet,” the case report explained.
He replaced table salt with ‘sodium bromide obtained from the internet after consultation with ChatGPT, in which he had read that chloride can be swapped with bromide, though likely for other purposes, such as cleaning’.

ChatGPT’s Terms of Use state that information is not always correct (Cheng Xin/Getty Images)
After three weeks in hospital, the man was discharged, and the author of the case report has warned others not to make the same mistake of taking medical advice from AI tools such as ChatGPT.
They wrote: “It is important to consider that ChatGPT and other AI systems can generate scientific inaccuracies, lack the ability to critically discuss results, and ultimately fuel the spread of misinformation.”
Meanwhile, OpenAI, the developer behind ChatGPT, notes in its Terms of Use that information ‘may not always be accurate’.
The terms state: “You should not rely on Output from our Services as a sole source of truth or factual information, or as a substitute for professional advice.”
The company’s Service Terms also say: “Our Services are not intended for use in the diagnosis or treatment of any health condition.”
LADbible has contacted OpenAI for further comment.