Warning: This story contains discussion of suicide.
A Florida mother has sued artificial intelligence chatbot startup Character.AI, accusing it of causing her 14-year-old son’s suicide in February. She says he became addicted to the company’s service and deeply attached to a chatbot it created.
In a lawsuit filed Tuesday in Orlando federal court, Megan Garcia said Character.AI targeted her son, Sewell Setzer, with “anthropomorphic, hypersexualized, and frighteningly realistic experiences.”
She said the company programmed its chatbot to “misrepresent itself as a real person, a licensed psychotherapist, and an adult lover, ultimately resulting in Sewell’s desire to no longer live outside” of the world created by the service.
The lawsuit also said Sewell expressed thoughts of suicide to the chatbot, which repeatedly brought the subject up again.
“We are heartbroken by the tragic loss of one of our users and want to express …