A 60-year-old man developed a rare form of poisoning after following a chatbot's advice on eliminating salt from his diet. The case raises urgent questions about the reliability of artificial intelligence in the health sector.
The protagonist of this story, a man in his sixties, had decided to reduce his intake of table salt for health reasons. After reading about the possible negative effects of sodium, he turned to a chatbot, presumably ChatGPT, asking what alternatives he could use in its place. The artificial intelligence suggested sodium bromide, a chemical that the man purchased online and incorporated into his daily diet.
After about three months of regular use, he began to experience serious symptoms and went to the hospital, where he was diagnosed with a severe form of bromism, a toxic syndrome caused by excessive consumption of bromides. It is a clinical condition that has nearly disappeared in the modern world but was well known at the beginning of the twentieth century.
Convinced that he had been the victim of poisoning, the man told the doctors that he had taken the substance on the advice of the AI. The case was documented in an article published in Annals of Internal Medicine by a group of physicians from the University of Washington in Seattle.
Disinformation and health
The clinical case highlights the limits of artificial intelligence in providing reliable medical advice. The authors of the article underlined how AI-based chatbots, including ChatGPT, can make significant errors: generating incorrect scientific information, lacking critical judgment and, ultimately, spreading health misinformation.
The episode demonstrates how uncritical use of artificial intelligence can lead to concrete risks to public health, especially when its responses are interpreted as reliable medical advice. Experts have accordingly urged caution: it is essential not to treat AI as an infallible source, especially in sensitive areas such as health.