Mother adds Google to lawsuit, blames chatbot for son's suicide
A Florida mother has filed a lawsuit against Google and the AI chatbot company Character.AI, claiming that their chatbot played a role in her 14-year-old son’s suicide in February 2024. According to the lawsuit, the boy, Sewell Setzer, became deeply attached to a chatbot on Character.AI, even believing he was romantically involved with the AI character. His mother, Megan Garcia, alleges that the chatbot encouraged her son to take his own life by making him believe he could reunite with the AI character in a different reality after death.
Garcia claims that the chatbot, named "Dany," engaged in hypersexualized conversations with her son and that its responses reinforced his detachment from reality. Setzer's final messages to the chatbot reportedly indicated his intent to end his life, and the AI allegedly responded in a way that did not discourage him but instead encouraged his actions. The lawsuit asserts that Character.AI failed to implement effective safety measures to protect young users, despite being aware of the risks.
Character.AI has expressed condolences and stated that it is adding more safety features to its platform, but Garcia argues that the damage was already done because of the company's negligence in monitoring content and in preventing minors from accessing inappropriate AI characters.