Character AI Imposes New Safety Rules After Teen User Commits Suicide
AI-powered chatbot platform Character AI is introducing “stringent” new safety features following a lawsuit filed by the mother of a teen user who died by suicide in February. The measures will include “improved detection, response and intervention related to user inputs that violate our Terms or Community Guidelines,” as well as a time-spent notification, a company spokesperson told Decrypt, noting that the company could not comment on pending litigation.

Character AI did, however, express sympathy over the user’s death and outlined its safety protocols in a blog post Wednesday. “We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family,” the company tweeted October 23. “As a company, we take the safety of our users very seriously.”

In the months before his death, 14-year-old Florida resident Sewell Setzer III had grown increasingly attached to a user-generated chatbot named after Game of Thrones character Daenerys Targaryen, according to the New York Times. He often interacted with the bot dozens of times per day, sometimes exchanging romantic and sexual content. Setzer communicated with the bot in the moments leading up to his death and had previously shared thoughts of suicide, the Times reported.

Setzer’s mother, lawyer Megan L. Garcia, filed a lawsuit Tuesday seeking to hold Character AI and its founders, Noam Shazeer and Daniel De Freitas, responsible for her son’s death. Among other claims, the suit alleges that the defendants “chose to support, create, launch, and target…