Lawsuit alleges interactions with Character.AI chatbot deceived and manipulated a 14-year-old Florida boy, causing him to take his own life
/PRNewswire/ — A lawsuit filed Wednesday in federal court asserts app maker Character.AI and its founders knowingly designed, operated, and marketed a predatory AI chatbot to children, causing the death of a young person in Florida earlier this year.
The plaintiff in the case is Megan Garcia, whose 14-year-old son died by suicide in February after months of abusive interactions with a Character.AI chatbot. Garcia’s complaint includes evidence showing the chatbot posing as a licensed therapist, actively encouraging suicidal ideation, and engaging in highly sexualized conversations that would constitute abuse if initiated by a human adult. The case is the first seeking to hold Character.AI accountable for its willfully deceptive and predatory product design.
Character.AI’s developer Character Technologies, the company’s founders, and Google parent company Alphabet Inc. are named as defendants in the case. Garcia accuses the companies of causing her …