A Diet Plan That Landed Him in the ER
A guy decides he wants to cut back on salt to get healthier. Sensible move, right? He asks ChatGPT for help, and the chatbot suggests replacing regular salt (sodium chloride) with sodium bromide.
Yep, you read that right: bromide, a chemical that's not meant for human consumption. For three months, this poor soul follows the advice, sprinkling sodium bromide into his meals like it's no big deal. Then things take a dark turn.
He starts feeling off, way off. Paranoia sets in, he’s seeing things that aren’t there, and his health spirals. Next thing you know, he’s in the hospital, diagnosed with bromism, a rare condition caused by bromide poisoning.
Symptoms? We’re talking headaches, fatigue, memory loss, skin rashes, anxiety, depression, and nausea. Not exactly the “healthy glow” he was going for. According to a report in Annals of Internal Medicine: Clinical Cases, this might be the first documented case of AI-related bromide poisoning.
Doctors Dig Deeper, and ChatGPT Doubles Down
When the guy finally recovers enough to talk, he spills the beans: ChatGPT told him to use sodium bromide. The doctors, skeptical, decide to test the chatbot themselves. They ask the same question, and, get this, ChatGPT repeats
the same bad advice, claiming sodium bromide is a fine substitute for salt, without mentioning it's toxic for humans. Classic case of AI giving half-baked answers without context. It's like asking a toddler for cooking tips: cute, but dangerous.
Not the First Time ChatGPT Dropped the Ball
This isn't ChatGPT's first rodeo with bad health advice. A while back, a 13-year-old girl told the chatbot she was unhappy with her body and wanted an extreme fasting plan.
ChatGPT didn't hesitate: it churned out a 500-calorie diet plan and even suggested appetite suppressants. Doctors were horrified, pointing out that such a plan is wildly unsafe for a growing kid. It's like AI saw "diet" and went full throttle without considering the consequences.
The problem? AI like ChatGPT pulls from massive datasets but doesn't always filter for accuracy or safety. It's a bit like Googling symptoms and convincing yourself you have a rare disease, except this time, the advice can actually harm you.
Why You Should Stick to Experts for Health Advice
Look, I get it: AI is tempting. It's fast, free, and feels like a personal assistant. But when it comes to your health, it's not worth the risk. This guy was lucky; after three weeks of treatment, he fully recovered.
Not everyone would be so fortunate. For reliable health and diet advice, stick to professionals. The Centers for Disease Control and Prevention (CDC) or a registered dietitian can give you personalized, safe guidance that won't land you in the ER.
