Man Hospitalized After Following AI Advice on Salt Substitute

A 60-year-old man ended up in the hospital after seeking advice from ChatGPT on how to replace table salt. The case, detailed in a report published in the *Annals of Internal Medicine*, highlights the potential dangers of relying on artificial intelligence for medical guidance. The man experienced hallucinations and paranoia, leading to an emergency room visit at an unspecified hospital.
Upon admission, the patient expressed concern that his neighbor might be poisoning him. Within the first 24 hours, his condition worsened, with intensifying paranoia and visual and auditory hallucinations, prompting psychiatric care. Once his symptoms had stabilized, the man disclosed that he had been researching the health risks associated with sodium chloride, commonly known as table salt.
Rather than merely reducing his sodium intake, he sought to eliminate chloride from his diet entirely and turned to ChatGPT for advice on suitable substitutes. The AI recommended sodium bromide, a chemical typically used in water treatment and film photography that is not intended for human consumption.
For the next three months, the man replaced sodium chloride with sodium bromide, which he acquired online based on his interactions with the AI. The case report notes that while ChatGPT indicated that chloride could be substituted with bromide, it failed to provide a health warning or inquire about the man’s intentions, which a medical professional would typically do.
Bromide ingestion can lead to a condition known as bromism, characterized by symptoms such as hallucinations, fatigue, and insomnia. The man’s blood tests revealed bromide levels at an alarming 1,700 mg/L, far exceeding the normal range of 0.9 to 7.3 mg/L. This condition was rarely reported after the U.S. Food and Drug Administration (FDA) restricted bromide use in the 1980s, but has resurfaced with the sale of unregulated dietary supplements containing bromide.
Risks of AI in Medical Advice
The authors of the case report, doctors from the University of Washington, including Audrey Eichenberger, Stephen Thielke, and Adam Van Buskirk, caution against using AI tools for medical queries. They emphasize that AI can perpetuate misinformation and lacks the contextual understanding that healthcare professionals possess.
“AI systems can generate scientific inaccuracies and lack the ability to critically discuss results,” the authors stated. They also noted that while AI can bridge the gap between scientific knowledge and the public, it poses risks when users receive information out of context.
In response to concerns raised by this incident, OpenAI, the organization behind ChatGPT, has reiterated that their AI is not meant for health advice. The company has implemented measures to guide users towards consulting healthcare professionals. “Our terms state that ChatGPT is not intended for use in the treatment of any health condition,” OpenAI stated. They also mentioned ongoing efforts to enhance the safety of health-related queries.
The case serves as a reminder of the potential pitfalls associated with using AI for health information. As reliance on such technology grows, it becomes increasingly important for healthcare providers to be aware of where their patients are obtaining health-related advice.
The man received treatment in the hospital for three weeks and was stable during a follow-up visit two weeks after discharge. While cases of bromism may remain rare, the availability of bromide through online sources highlights the need for caution and informed decision-making regarding health and dietary changes.