If you or someone you know needs help, you can find resources or someone to talk to at the National Suicide Prevention Lifeline website or by calling 1-800-273-8255. People are available to talk 24/7.
(NewsNation) — A Florida woman is suing an AI chatbot creator, claiming her 14-year-old son died by suicide after he became consumed by a relationship with a computer-generated girlfriend.
The mother, Meg Garcia, filed the lawsuit Wednesday in Florida federal court. She says Character Technologies Inc. — the creator of the Character.AI chatbot — should have known the damage the tool could cause.
The 138-page document accuses Character Technologies Inc. of liability, negligence, wrongful death and survivorship, unlawful enrichment, violations of Florida’s Deceptive and Unfair Trade Practices Act, and intentional infliction of emotional distress, among other claims.
The lawsuit requests that Character.AI limit the collection and use of minors’ data, introduce filters …