In the months after Sewell Setzer III's suicide in February, his mother, Megan Garcia, was at a loss over whether she should speak out about the events that she believes led to his death.
Her 14-year-old son, she learned shortly after he died, had fallen in “love” with an eerily lifelike, sexualized, AI-powered chatbot modeled after the Game of Thrones character Daenerys Targaryen.
Garcia claims their pseudo-relationship, through the app Character.AI, eventually drove Sewell to fatally shoot himself in his bathroom at the family’s Orlando, Fla., home.
In October, Garcia, a 40-year-old attorney and mom of three boys, filed a wrongful death lawsuit against Character.AI, arguing its technology is “defective and/or inherently dangerous.” The company has not yet responded in court but insists user safety is a top priority.
“I deliberated for months if I should share his story,” Garcia tells PEOPLE in this week’s issue. “I’m still his mother and I want to protect him, even in death.”
“But the …