Two families sue Character.AI over youth safety concerns, seek to shut down platform
CNN
Two families have sued artificial intelligence chatbot company Character.AI, accusing it of providing sexual content to their children and encouraging self-harm and violence. Theirs is the second lawsuit filed against Character.AI over youth safety concerns since October.
The lawsuit asks a court to shut down the platform until its alleged dangers can be fixed. Brought by the parents of two young people who used the platform, the lawsuit alleges that Character.AI “poses a clear and present danger to American youth causing serious harms to thousands of kids, including suicide, self-mutilation, sexual solicitation, isolation, depression, anxiety, and harm towards others,” according to a complaint filed Monday in federal court in Texas. For example, it alleges that a Character.AI bot implied to a teen user that he could kill his parents for limiting his screentime.

Character.AI markets its technology as “personalized AI for every moment of your day” and allows users to chat with a variety of AI bots, including some created by other users or that users can customize for themselves. The bots can give book recommendations, practice foreign languages with users, and take on the personas of fictional characters, like Edward Cullen from Twilight. One bot listed on the platform’s homepage Monday, called “Step Dad,” described itself as an “aggressive, abusive, ex military, mafia leader.”

The filing comes after a Florida mother filed a separate lawsuit against Character.AI in October, claiming that the platform was to blame for her 14-year-old son’s death after it allegedly encouraged his suicide. And it comes amid broader concerns about relationships between people and increasingly human-like AI tools.