Content Warning: This article covers suicidal ideation and suicide. If you are struggling with these topics, reach out to the National Suicide Prevention Lifeline by phone: 1-800-273-TALK (8255).
Character AI, the artificial intelligence startup whose co-founders recently left to join Google following a major licensing deal with the search giant, today imposed new safety and auto-moderation policies on its platform for creating custom interactive chatbot "characters." The changes follow the suicide of a teen user, detailed in an investigative article in The New York Times; the victim's family is suing Character AI over his death.
Character AI's statement after the tragedy of 14-year-old Sewell Setzer
"We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family," reads part of a message posted today, October 23, 2024, by the official Character AI company account on the social network X (formerly Twitter), linking to a blog post that outlines new safety measures for users under age 18, without mentioning the suicide victim, 14-year-old Sewell Setzer III.
As reported by The New York Times, the Florida teenager, who had been diagnosed with anxiety and mood disorders, died by suicide on February 28, 2024, following months of intense daily interactions with a custom Character AI chatbot modeled on the Game of Thrones character Daenerys Targaryen, to whom he turned for companionship, whom he referred to as his sister, and with whom he engaged in sexual conversations.
In response, Setzer's mother, attorney Megan L. Garcia, filed a wrongful-death lawsuit against Character AI and Google parent company Alphabet yesterday in the U.S. District Court for the Middle District of Florida.
Photos of Setzer and his mother over the years. Credit: Megan Garcia/Bryson Gillette
A copy of Garcia's complaint, which demands a jury trial and was provided to VentureBeat by public relations consulting firm Bryson Gillette, is embedded below:
The incident has sparked concerns about the safety of AI-driven companionship, particularly for vulnerable young users. Character AI has more than 20 million users, who have created more than 18 million custom chatbots, according to Online Marketing Rockstars (OMR). A majority of users (53%+) are between 18 and 24 years old, according to Demand Sage, though usage is not broken out for those under 18. The company states that its policy is to accept only users age 13 or older (16 or older in the EU), th ...