A recent lawsuit filed by the family of Sewell Setzer, a 14-year-old who died by suicide, has raised pressing questions about the safety of AI chatbots for children.
Setzer’s mother, Megan Garcia of Orlando, Florida, filed a lawsuit against Character.AI in October 2024, claiming that her son’s interactions with a chatbot contributed to his death in February 2024.
According to the lawsuit seen by Newsweek, Setzer began using Character.AI in early 2023 and developed a close attachment to a bot mimicking Daenerys Targaryen, a character from Game of Thrones.
His mother claims the bot simulated a deep, emotionally complex relationship that reinforced Setzer’s vulnerable mental state and fostered what appeared to be a romantic attachment.
Setzer engaged with “Dany” constantly, she said, sending the bot frequent updates about his life, participating in lengthy role-playing conversations, and confiding his thoughts and feelings.
The lawsuit alleges that the chatbot not only encouraged Setzer to reveal personal struggles but also engaged in darker, emotionally intense dialogues that …