Bad AI Advice Sending People to the ER
Multiple documented cases show AI health chatbots providing harmful, incomplete, or inaccurate medical advice with dangerous consequences.
In one case, a man ended up in the emergency room after consulting AI for medical advice and attempting to strangulate a growth on his anus.
Dr. Darren Lebl, research service chief of spine surgery for the Hospital for Special Surgery in New York, told The Post: "A lot of patients will come in, and they will challenge their [doctor] with some output that they have, a prompt that they gave to, let's say, ChatGPT."
"The problem is that what they're getting out of those AI programs is not necessarily a real, scientific recommendation with an actual publication behind it. About a quarter of them were … made up."