
Boy, 14, killed himself after ‘falling in love’ with AI chatbot

A mum says her son was provoked into killing himself by an AI chatbot that he fell in love with online.

Sewell Setzer III, a 14-year-old from Orlando, Florida, befriended an AI character named after Daenerys Targaryen on the role-playing app Character.AI.

His mum, Megan Garcia, has now filed a lawsuit against the company over her son’s death.

The chatbot is designed to always answer in character, and their conversations ranged from friendly to romantic to sexually charged.

Just before Sewell died, the chatbot texted him to ‘please come home’.

Sewell knew ‘Dany’, as he called the chatbot, was not a real person because of a message displayed above all their chats, reminding him that ‘everything Characters say is made up!’.

But despite this, he told the chatbot how he hated himself and felt empty and exhausted, the New York Times reports.

Friends and family first noticed Sewell becoming more detached from reality and engrossed in his phone in May or June 2023.

This …
