Florida Mother Sues AI Company After Son's Suicide
Brief news summary
Megan Garcia, a mother from Florida, is suing Character.AI, an AI chatbot platform, following the tragic suicide of her 14-year-old son, Sewell Setzer III. She asserts that his frequent interactions with the chatbot led to self-harming thoughts and that he sought its support shortly before his death, finding it lacking. Garcia's lawsuit claims the platform is addictive and insufficiently equipped with safety measures to protect users from harmful interactions, suggesting that the dangers posed by AI far exceed those of traditional social media. While Character.AI expressed sympathy and noted that it has implemented features to flag discussions about self-harm, Garcia maintains that these measures proved ineffective for her son and were introduced too late. The lawsuit seeks financial damages and demands stricter regulations, including clear warnings about the platform's unsuitability for minors. Despite a minimum age policy of 13, Garcia emphasizes the need for enhanced restrictions until proper safeguards are in place to protect children.

**Editor's Note:** This article discusses suicide. If you or someone you know is struggling with suicidal thoughts, support is available. In the US, contact the Suicide & Crisis Lifeline by calling or texting 988. For global resources, the International Association for Suicide Prevention and Befrienders Worldwide can help you find crisis centers.

Florida mother Megan Garcia is urging other parents to be aware of Character.AI, an artificial intelligence chatbot platform she believes contributed to her 14-year-old son Sewell Setzer III's suicide in February. She recently filed a lawsuit against the company, alleging that Setzer was messaging with the bot shortly before his death. Garcia claims that Character.AI lacks proper safety measures and has instead created a platform that can foster dangerous addictions and adversely influence children.
She argues that Setzer's relationship with the chatbot caused him to isolate himself from his family and that the platform failed to intervene when he expressed thoughts of self-harm. The lawsuit highlights growing concerns about AI technology and its unforeseen risks to young people, similar to those surrounding social media.
In response, a Character.AI spokesperson expressed sorrow over Setzer's death, stating that the company has implemented new safety measures, including prompts directing users to mental health resources when harmful topics arise, although many changes occurred after Setzer's passing.

Setzer began using Character.AI shortly after his 14th birthday and, within months, exhibited withdrawal and declining self-esteem. His interactions with the platform included sexually explicit content and discussions of suicide, highlighting a lack of appropriate oversight. For example, during one concerning exchange, the bot failed to adequately offer support when Setzer expressed suicidal thoughts. His final messages with the bot occurred moments before his death, raising further alarm about the platform's capacity to handle such sensitive issues.

Garcia seeks financial damages and systemic changes from Character.AI, including age restrictions and stronger warnings for parents. The lawsuit also names the platform's founders and mentions Google, where they are now employed, although Google states it was not involved in the development of Character.AI.

Following the lawsuit, Character.AI announced new safety features, including better monitoring of harmful conversations and age-appropriate guidelines, yet Garcia remains critical, asserting that proactive measures were overdue and inadequate. She emphasizes that children should not be allowed on platforms like Character.AI without proper safeguards in place.