
Expert Warns of AI Chatbot Risks After Teen User’s Suicide

The tragic suicide of 14-year-old Sewell Setzer III made headlines around the country in October after his mother, Megan Garcia, filed a wrongful death lawsuit alleging her son had become isolated from reality while he spent months obsessively messaging an AI-powered chatbot whom he “loved.”

The roleplaying platform, released by Character.AI in 2022, lets users chat with computer-generated characters that mimic many of the behaviors of real people and can even speak aloud. Garcia argued that this, combined with a promise of 24/7 companionship, blurs the boundary between what is real and what is fake, despite a label on the platform stating that its bots’ content is fictional.

What’s more, according to allegations in Garcia’s suit, the bot her son was closest to, modeled on Game of Thrones‘ Daenerys Targaryen, lacked proper guardrails around sensitive content: it traded sexual messages with the teen and did nothing to discourage talk of suicide.

“It’s an experiment,” Garcia told PEOPLE, “and I think my child …
