A man in the US decided to eliminate salt from his daily diet and sought advice from ChatGPT, OpenAI's Artificial Intelligence (AI) tool. The experiment nearly ended fatally when he landed in the hospital with poisoning, raising fresh questions about the use of this technology for such purposes.
The journal Annals of Internal Medicine recently published an article describing the case. According to the report, the man, a 60-year-old with no prior psychiatric or medical history, went to the emergency room convinced that his neighbor was poisoning him. Despite his unease, the real explanation was much simpler.
Laboratory tests came back within the normal range for his age, but the patient revealed that he maintained several dietary restrictions and even distilled his own water at home. When he began suffering paranoid hallucinations, the doctors contacted Poison Control, and it was then discovered that he was suffering from bromide poisoning.
According to the report, the patient wanted to remove salt from his diet, so for three months he replaced sodium chloride with sodium bromide after consulting the AI. He had read on ChatGPT that chloride can be exchanged for bromide, which is true in certain non-dietary contexts, but not for human consumption.