Man who used ChatGPT for dietary advice poisoned himself and ended up in the hospital
A 60-year-old man trying to eliminate table salt from his diet for health reasons used a large language model (LLM) to get suggestions for a substitute, according to a case study published in the Annals of Internal Medicine.
After ChatGPT suggested swapping sodium chloride (table salt) for sodium bromide, the man made the substitution for three months, although the journal article notes the recommendation was likely referring to other uses for bromide, such as cleaning.
Sodium bromide is a salt-like compound, but it is toxic for human consumption.
It was once used as an anticonvulsant and sedative, but today it is mainly used for cleaning, manufacturing and agricultural purposes, according to the National Institutes of Health.
When the man arrived at the hospital, he reported experiencing fatigue, insomnia, poor coordination, facial acne, cherry angiomas (red bumps on the skin) and excessive thirst, all symptoms of bromism, a condition caused by long-term exposure to sodium bromide.
The man also showed signs of paranoia, claiming that his neighbor was trying to poison him.
He was also found to be experiencing auditory and visual hallucinations, and he was placed on a psychiatric hold after attempting to escape.
The man was treated with intravenous fluids and electrolytes and was also given antipsychotics. He was released from the hospital after three weeks of monitoring.
“This case also highlights how the use of artificial intelligence (AI) can potentially contribute to the development of preventable adverse health outcomes,” the researchers wrote in the case study.
“Unfortunately, we do not have access to his ChatGPT conversation log, and we will never be able to know with certainty what exactly the output he received was, since individual responses are unique and build from previous inputs.”
It is “very unlikely,” they noted, that a human doctor would have mentioned sodium bromide when speaking with a patient looking for a substitute for sodium chloride.
“It is important to consider that ChatGPT and other AI systems can generate scientific inaccuracies, lack the ability to critically discuss results, and ultimately fuel the spread of misinformation,” the researchers concluded.
Dr. Jacob Glanville, CEO of Centivax, a San Francisco biotechnology company, emphasized that people should not use ChatGPT as a substitute for a doctor.
“These are language prediction tools,” he said. “They lack common sense and will generate terrible outcomes if the human user doesn’t apply their own common sense when deciding what to ask these systems and whether to heed their recommendations.”
“This is a classic example of the problem,” Glanville continued. “The system essentially responded, ‘You want a salt alternative? Sodium bromide is the highest-scoring replacement here, as it is often listed as a sodium chloride alternative in chemical reactions.’”
Dr. Harvey Castro, a board-certified emergency medicine physician in Dallas and a national speaker on artificial intelligence, confirmed that AI is a tool, not a doctor.
“Large language models generate text by predicting the most statistically likely sequence of words rather than fact-checking,” he told Fox News Digital.
“ChatGPT’s bromide blunder shows why context is king in health advice,” Castro continued. “AI is not a replacement for professional medical judgment, consistent with OpenAI’s disclaimers.”
Castro also warned that there is a “regulatory gap” when it comes to using LLMs for medical information.
“The FDA’s ban on bromide does not extend to AI advice, and global health AI oversight remains undefined,” he said.
There is also the risk that LLMs can carry data biases and lack validation, leading to hallucinated information.
“If training data contains outdated, rare or chemically focused references, the model may surface them in inappropriate contexts, such as recommending bromide as a salt substitute,” Castro said.
“In addition, current LLMs don’t incorporate cross-checking against up-to-date medical databases unless explicitly integrated,” he added.
To prevent cases like this one, Castro called for more safeguards for LLMs, including integrated medical knowledge bases, automated risk flags, contextual prompting and a combination of human and AI oversight.
“Without regulation and monitoring, rare cases like this will likely recur,” the expert added.
OpenAI, the San Francisco-based maker of ChatGPT, provided the following statement to Fox News Digital:
“Our terms state that ChatGPT is not intended for use in the treatment of any health condition and is not a substitute for professional advice. We have safety teams working to reduce risks, and we have trained our AI systems to encourage people to seek professional guidance.”
