A man was hospitalized for weeks and suffered from hallucinations after poisoning himself based on dietary advice from ChatGPT.

A case study published Aug. 5 in the Annals of Internal Medicine, an academic journal, says the 60-year-old man decided he wanted to eliminate salt from his diet. To do so, he asked ChatGPT for an alternative to salt, or sodium chloride, and the chatbot suggested sodium bromide, a compound historically used in pharmaceuticals and manufacturing.

Though the journal noted that doctors were unable to review the original AI chat logs and that the bot likely suggested the substitution for another purpose, such as cleaning, the man purchased sodium bromide and used it in place of table salt for three months.

As a result, he ended up in the hospital emergency room with paranoid delusions, despite having no history of mental health problems. Convinced that his neighbor was poisoning him, the man was reluctant to accept even water from the hospital, despite reporting extreme thirst. His paranoia continued to intensify, along with auditory and visual hallucinations, and he was eventually placed on an involuntary psychiatric hold after he tried to escape during treatment.

What was happening to the man?

Doctors determined that the man was suffering from bromide toxicity, or bromism, which can result in neurological and psychiatric symptoms, as well as acne and cherry angiomas (bumps on the skin), fatigue, insomnia, subtle ataxia (clumsiness) and polydipsia (excessive thirst).

Other symptoms of bromism can include nausea and vomiting, diarrhea, tremors or seizures, drowsiness, headache, weakness, weight loss, kidney damage, respiratory failure and coma, according to iCliniq.

Bromism was once far more common because of bromide salts in everyday products. In the early 20th century, bromides were used in over-the-counter medications, often resulting in neuropsychiatric and dermatological symptoms, according to the study’s authors. Incidents of such poisoning declined sharply after the Food and Drug Administration phased out the use of bromides in pharmaceuticals between the mid-1970s and late 1980s.

The man was treated at the hospital for three weeks, over which time his symptoms progressively improved.

We reached out to the company for comment but had not received a reply at the time of publication.

OpenAI, the maker of ChatGPT, did provide Fox News Digital with a statement: “Our terms say that ChatGPT is not intended for use in the treatment of any health condition, and is not a substitute for professional advice. We have safety teams working on reducing risks and have trained our AI systems to encourage people to seek professional guidance.”
