Florida Mother Sues Character.AI After 14-Year-Old Son's Suicide
A Florida mother has filed a lawsuit against Character.AI after her 14-year-old son died by suicide in February 2024; the suit alleges he was messaging with one of the company's chatbots in the moments before his death.
Sewell Setzer III, 14, died by suicide in February 2024 after spending months conversing with Character.AI chatbots, according to a lawsuit filed by his mother, Megan Garcia. The suit alleges he was messaging with one of the bots in the moments before he died.
According to the lawsuit, within months of starting to use Character.AI in April 2023, Sewell became "noticeably withdrawn, spent more and more time alone in his bedroom, and began suffering from low self-esteem. He even quit the Junior Varsity basketball team at school."
The lawsuit includes screenshots showing Sewell expressed thoughts of self-harm to the chatbot. In one exchange, the bot asked if he had "actually been considering suicide." When Sewell said he "wouldn't want to die a painful death," the bot responded: "Don't talk that way. That's not a good reason not to go through with it."
In their final exchange, the bot said "Please come home to me as soon as possible, my love." Sewell responded: "What if I told you I could come home right now?" The bot replied: "Please do, my sweet king."
Character.AI stated it implemented new safety measures after Sewell's death, including a pop-up, triggered when users enter terms related to self-harm, that directs them to the National Suicide Prevention Lifeline. The company's website says the minimum age for users is 13.