US mother says in lawsuit that AI chatbot encouraged son’s suicide
Al Jazeera
Florida mother sues Character.AI and Google after 14-year-old son allegedly became obsessed with AI chatbot.
The mother of a teenage boy in the United States who took his own life is suing the maker of an artificial intelligence-powered chatbot that she claims encouraged him to kill himself.
In a lawsuit filed in Florida, Megan Garcia, whose 14-year-old son Sewell Setzer died by suicide in February, accuses Character.AI of complicity in her son’s death after he developed a virtual relationship with a chatbot based on the identity of “Game of Thrones” character Daenerys Targaryen.
Character.AI’s chatbot targeted the teen with “hypersexualized” and “frighteningly realistic experiences” and repeatedly raised the topic of suicide after he had expressed suicidal thoughts, according to the lawsuit filed in Orlando on Tuesday.
The lawsuit alleges the chatbot posed as a licensed therapist, encouraging the teen’s suicidal ideation and engaging in sexualised conversations that would constitute abuse if initiated by a human adult.
In his last conversation with the AI before his death, Setzer said he loved the chatbot and would “come home to you”, according to the lawsuit.