New York (CNN) — Two families have sued artificial intelligence chatbot company Character.AI, accusing it of providing sexual content to their children and encouraging self-harm and violence. The lawsuit asks a court to shut down the platform until its alleged dangers can be fixed.
Brought by the parents of two young people who used the platform, the lawsuit alleges that Character.AI “poses a clear and present danger to American youth causing serious harms to thousands of kids, including suicide, self-mutilation, sexual solicitation, isolation, depression, anxiety, and harm towards others,” according to a complaint filed Monday in federal court in Texas.
For example, it alleges that a Character.AI bot implied to a teen user that he could kill his parents for limiting his screen time.
Character.AI markets its technology as “personalized AI for every moment of your day” and allows users to chat with a variety of AI bots, including some created by other users or that users …