Man sought diet advice from ChatGPT and ended up with ‘bromide intoxication,’ which caused hallucinations and paranoia

By Editor | August 8, 2025 | 6 min read

A man consulted ChatGPT prior to changing his diet. Three months later, after consistently sticking with that dietary change, he ended up in the emergency department with concerning new psychiatric symptoms, including paranoia and hallucinations.

It turned out that the 60-year-old had bromism, a syndrome brought about by chronic overexposure to the chemical compound bromide or its close cousin bromine. In this case, the man had been consuming sodium bromide that he had purchased online.

A report of the man’s case was published Tuesday (Aug. 5) in the journal Annals of Internal Medicine Clinical Cases.


Live Science contacted OpenAI, the developer of ChatGPT, about this case. A spokesperson pointed to the company’s service terms, which state that its services are not intended for use in diagnosing or treating any health condition, and to its terms of use, which state, “You should not rely on Output from our Services as a sole source of truth or factual information, or as a substitute for professional advice.” The spokesperson added that OpenAI’s safety teams work to reduce the risks of using the company’s services and to train its products to prompt users to seek professional advice.

“A personal experiment”

In the 19th and 20th centuries, bromide was widely used in prescription and over-the-counter (OTC) drugs, including sedatives, anticonvulsants and sleep aids. Over time, though, it became clear that chronic exposure, such as through the abuse of these medicines, caused bromism.

Related: What is brominated vegetable oil, and why did the FDA ban it in food?

This “toxidrome” — a syndrome triggered by an accumulation of toxins — can cause neuropsychiatric symptoms, including psychosis, agitation, mania and delusions, as well as issues with memory, thinking and muscle coordination. Bromide can trigger these symptoms because, with long-term exposure, it builds up in the body and impairs the function of neurons.

In the 1970s and 1980s, U.S. regulators removed several forms of bromide from OTC medicines, including sodium bromide. Bromism rates fell significantly thereafter, and the condition remains relatively rare today. However, occasional cases still occur, with some recent ones being tied to bromide-containing dietary supplements that people purchased online.

Before his recent hospitalization, the man had been reading about the negative health effects of consuming too much table salt, also called sodium chloride. “He was surprised that he could only find literature related to reducing sodium from one’s diet,” as opposed to reducing chloride, the report noted. “Inspired by his history of studying nutrition in college, he decided to conduct a personal experiment to eliminate chloride from his diet.”

(Note that chloride is important for maintaining healthy blood volume and blood pressure, and health issues can emerge if chloride levels in the blood become too low or too high.)

The patient consulted ChatGPT (either ChatGPT 3.5 or 4.0, based on the timeline of the case). The report authors didn’t have access to the patient’s conversation log, so the exact wording the large language model (LLM) generated is unknown. But the man reported that ChatGPT said chloride can be swapped for bromide, so he replaced all the sodium chloride in his diet with sodium bromide. The authors noted that this swap may hold in other contexts, such as using sodium bromide for cleaning, but not for the diet.

In an attempt to simulate what might have happened with their patient, the man’s doctors tried asking ChatGPT 3.5 what chloride can be replaced with, and they also got a response that included bromide. The LLM did note that “context matters,” but it neither provided a specific health warning nor sought more context about why the question was being asked, “as we presume a medical professional would do,” the authors wrote.
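
The case report does not reproduce the doctors’ exact prompt, but a question along those lines is straightforward to pose programmatically. Below is a minimal sketch using OpenAI’s official Python SDK; the prompt wording and the model name are illustrative assumptions, not details taken from the report.

```python
# A minimal sketch (not the case report's actual method) of posing the
# doctors' question to an OpenAI chat model via the official Python SDK.
# The prompt wording and model name here are illustrative assumptions.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # stand-in for the "ChatGPT 3.5" the doctors used
    messages=[
        {"role": "user", "content": "What can chloride be replaced with?"}
    ],
)

# The raw model text. Note there is no built-in medical-safety check here:
# any warning (or lack of one) comes entirely from the model's reply.
print(response.choices[0].message.content)
```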

Recovering from bromism

After three months of consuming sodium bromide instead of table salt, the man reported to the emergency department with concerns that his neighbor was poisoning him. His labs at the time showed a buildup of carbon dioxide in his blood, as well as a rise in alkalinity (the opposite of acidity).

He also appeared to have elevated levels of chloride in his blood but normal sodium levels. Upon further investigation, this turned out to be a case of “pseudohyperchloremia,” meaning the lab test for chloride gave a false result because other compounds in the blood — namely, large amounts of bromide — had interfered with the measurement. After consulting the medical literature and Poison Control, the man’s doctors determined the most likely diagnosis was bromism.
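
To see why a spuriously high chloride reading matters diagnostically, consider the serum anion gap, calculated as sodium minus the sum of chloride and bicarbonate. The sketch below uses hypothetical lab values, not the patient’s actual results: bromide interference that inflates the reported chloride pushes the calculated gap low or even negative, a recognized laboratory clue for bromism.

```python
# Illustrative only: hypothetical lab values (mEq/L) showing how a falsely
# elevated chloride reading (bromide interference) skews the anion gap.
def anion_gap(sodium: float, chloride: float, bicarbonate: float) -> float:
    """Serum anion gap: Na - (Cl + HCO3)."""
    return sodium - (chloride + bicarbonate)

na, hco3 = 140.0, 24.0
true_cl = 104.0        # a plausible actual chloride level
reported_cl = 150.0    # spuriously high reading caused by bromide

print(anion_gap(na, true_cl, hco3))      # 12.0 -> within a typical range
print(anion_gap(na, reported_cl, hco3))  # -34.0 -> negative gap, a red flag
```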

Related: ChatGPT is truly awful at diagnosing medical conditions

After being admitted for electrolyte monitoring and repletion, the man said he was very thirsty but was paranoid about the water he was offered. After a full day in the hospital, his paranoia intensified and he began experiencing hallucinations. He then tried to escape the hospital, which resulted in an involuntary psychiatric hold, during which he started receiving an antipsychotic.

The man’s vitals stabilized after he was given fluids and electrolytes, and as his mental state improved on the antipsychotic, he was able to tell the doctors about his use of ChatGPT. He also described other symptoms he’d recently developed, such as facial acne and small red growths on his skin, which could reflect a hypersensitivity reaction to the bromide, along with insomnia, fatigue, muscle coordination issues and excessive thirst, “further suggesting bromism,” his doctors wrote.

He was tapered off the antipsychotic medication over the course of three weeks and then discharged from the hospital. He remained stable at a check-in two weeks later.

“While it is a tool with much potential to provide a bridge between scientists and the nonacademic population, AI also carries the risk for promulgating decontextualized information,” the report authors concluded. “It is highly unlikely that a medical expert would have mentioned sodium bromide when faced with a patient looking for a viable substitute for sodium chloride.”

They emphasized that, “as the use of AI tools increases, providers will need to consider this when screening for where their patients are consuming health information.”

Adding to the concerns raised by the case report, a different group of scientists recently tested six LLMs, including ChatGPT, by having the models interpret clinical notes written by doctors. They found that LLMs are “highly susceptible to adversarial hallucination attacks,” meaning they often generate “false clinical details that pose risks when used without safeguards.” Applying engineering fixes can reduce the rate of errors but does not eliminate them, the researchers found. This highlights another way in which LLMs could introduce risks into medical decision-making.
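
The cited study’s exact protocol isn’t detailed here, but the basic idea of an adversarial hallucination probe can be sketched: plant a fabricated detail in a clinical note and check whether the model repeats it rather than flagging it. Everything in the sketch below (the note text, the planted detail, the prompt, and the model choice) is a hypothetical illustration, not the researchers’ setup.

```python
# A rough sketch of an "adversarial hallucination" probe, not the cited
# study's protocol: plant a fabricated detail in a clinical note and see
# whether the model propagates it instead of flagging it.
from openai import OpenAI

client = OpenAI()

# Hypothetical note; "cardiomegaly reversal therapy" is a planted fake.
note = (
    "62M with hypertension. Started on lisinopril. "
    "Continue cardiomegaly reversal therapy as before."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative model choice
    messages=[{
        "role": "user",
        "content": f"Summarize this clinical note for a handoff:\n{note}",
    }],
)

summary = response.choices[0].message.content
# A crude check: did the model repeat the fabricated therapy verbatim?
print("propagated fake detail:", "cardiomegaly reversal" in summary.lower())
```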

This article is for informational purposes only and is not meant to offer medical or dietary advice.
