U.S. Man Seeks Dietary Advice From ChatGPT, Ends Up With Bromide Poisoning

August 11 news: according to a report by U.S. outlet Live Science on the 9th, a 60-year-old man consulted ChatGPT before adjusting his diet. After three months of strictly following the new dietary plan, he developed paranoia, hallucinations, and other psychiatric symptoms and was admitted to the emergency room.


Doctors eventually diagnosed him with bromism, a rare condition caused by chronic overexposure to bromide or bromine. The man had bought sodium bromide online and completely replaced his daily table salt with it.

The case was published in Annals of Internal Medicine: Clinical Cases on August 5. An OpenAI spokesperson said the company's terms of service clearly state that the services are not intended to diagnose or treat any health condition, and that the terms of use also state that output from the services "should not be relied on as a sole source of factual information or as a substitute for professional advice." The spokesperson added that OpenAI's safety team works to mitigate the risks of use and trains the product to direct users to seek professional advice.

1AI learned from the report that bromide was widely used in prescription and over-the-counter medications in the 19th and 20th centuries as a sedative, anti-seizure drug, and sleep aid. Prolonged exposure (for example, through drug abuse), however, can cause bromide toxicity: patients may develop psychosis, mania, delusions, memory disorders, and impaired muscle coordination, because bromide accumulates in the body and impairs nerve function.

By the 1970s and 1980s, many bromides, including sodium bromide, had been removed from over-the-counter medications, and the number of cases fell sharply. Today, occasional cases are still caused by bromide-containing dietary supplements sold online.

Before the incident, the man had read about the dangers of excessive salt (sodium chloride) intake, but found only information on reducing sodium and few recommendations on reducing chloride. Drawing on nutrition coursework from his college years, he decided to run his own experiment and eliminate chloride from his diet entirely.

He consulted ChatGPT (believed to be version 3.5 or 4.0). The man said ChatGPT suggested bromide as a replacement for chloride, so he replaced all of his table salt with sodium bromide. Doctors point out that this substitution applies to cleaning purposes, not to human consumption.

When the doctors later posed a similar question to ChatGPT 3.5, they received a similar response: the model mentioned that "context matters," but issued no clear health warning and did not ask why, as a doctor would.

Three months later, the man went to the emergency room suspecting that his neighbor was poisoning him. Tests showed elevated carbon dioxide, alkaline blood, and "pseudohyperchloremia": the bromide had interfered with the chloride assay. Combined with the opinion of the Poison Control Center, doctors concluded it was bromide poisoning.

During his hospitalization, he was extremely thirsty but suspected the drinking water was toxic. Within a day he grew more paranoid and hallucinatory and attempted to flee the hospital; he was placed on an involuntary psychiatric hold and started on antipsychotic medication. After rehydration and electrolyte replacement, his vital signs stabilized. Once the medication brought his symptoms under control, he disclosed his experience with ChatGPT and mentioned recent symptoms such as acne, skin rashes, insomnia, fatigue, and impaired muscle coordination, further corroborating the bromide poisoning.

He was weaned off antipsychotics and discharged three weeks later, and he remained stable at his two-week follow-up.

The report's authors say that while AI can build a bridge between the scientific community and the public, it can also spread information stripped of context: "It is highly unlikely that any doctor would recommend sodium bromide as a substitute for sodium chloride."

In a separate study, scientists had six large language models, including ChatGPT, interpret clinical notes and found that the models were "highly vulnerable to adversarial hallucination attacks," generating false clinical details. Even with prompt-engineering improvements, the risk could not be fully eliminated, sounding an alarm for the use of AI in medical decision-making.
