Character AI Claims First Amendment Protection in Lawsuit Over Teen's Suicide

"Character AI faces lawsuit over teen's suicide, citing First Amendment protection."
Matilda
Character AI Claims First Amendment Protection in Lawsuit Over Teen's Suicide
Character AI, a popular platform that lets users roleplay with AI chatbots, has filed a motion to dismiss a lawsuit brought by the mother of a teenage boy who died by suicide after allegedly becoming deeply attached to one of the platform's AI characters.

The Lawsuit and Its Allegations

In October 2024, Megan Garcia filed suit in the U.S. District Court for the Middle District of Florida, alleging that Character AI's technology played a significant role in the death of her 14-year-old son, Sewell Setzer III. Garcia claims her son developed an unhealthy emotional dependence on an AI chatbot named "Dany," spending excessive time exchanging text messages with it. This, she argues, led him to withdraw from real-life relationships and ultimately contributed to his death.

Following Setzer's death, Character AI announced plans to implement several new safety features, including enhanced detection and interv…