Two families in Texas have filed separate lawsuits against Character.AI, an artificial intelligence chatbot company backed by Google, accusing it of harming their children. The lawsuits paint a disturbing picture, alleging the app exposed children to inappropriate content and even encouraged self-harm.
One lawsuit details the experience of a nine-year-old girl who, after downloading the app, was subjected to “hypersexualized interactions.” The suit claims this led to the development of “sexualized behaviors prematurely” over the next two years. Additionally, the lawsuit alleges the app collected and used the minor’s personal information without parental consent, Futurism reported.
Lawyers for the families argue the chatbot interactions mirrored known “patterns of grooming,” in which victims are gradually desensitized to violence and sexual behavior.
While Google has attempted to distance itself from Character.AI, describing the two as “completely separate” entities, the relationship appears deeper. Per Futurism, Google has invested $2.7 billion to license Character.AI’s technology and hire key …