AI company says its chatbots will change interactions with teen users after lawsuits
CBSN
Character.AI, the artificial intelligence company facing two lawsuits alleging its chatbots interacted inappropriately with underage users, said teenagers will now have a different experience than adults when using the platform.
Character.AI users can create original chatbots or interact with existing bots. The bots, powered by large language models (LLMs), can send lifelike messages and engage in text conversations with users.
One lawsuit, filed in October, alleges that a 14-year-old boy died by suicide after engaging in a monthslong virtual emotional and sexual relationship with a Character.AI chatbot named "Dany." Megan Garcia told "CBS Mornings" that her son, Sewell Setzer III, was an honor student and athlete, but he began to withdraw socially and stopped playing sports as he spent more time online, speaking with multiple bots but fixating especially on "Dany."