
First AI death? Character.ai faces lawsuit after Florida teen's suicide. He was speaking to Daenerys Targaryen

A lawsuit filed Wednesday in federal court accuses Character.ai, an app that lets users create and chat with AI characters, of causing the death of a Florida teenager earlier this year. The plaintiff is Megan Garcia, whose 14-year-old son died by suicide in February after months of interaction with a chatbot.

Fourteen-year-old Sewell Setzer III knew that Daenerys Targaryen, a chatbot based on a character from Game of Thrones, was not a real person, but he developed an emotional attachment to it and texted the bot constantly. On the last day of his life, he messaged Dany: "I miss you, baby sister." "I miss you too, sweet brother," the chatbot replied.

The New York Times reported that Sewell was diagnosed with mild Asperger's syndrome as a child but had never had serious behavioral or mental health problems before, according to his mother. Earlier this year, after he started getting into trouble at school, his parents arranged for him to see a therapist. He attended five sessions and received a new diagnosis of anxiety and disruptive mood dysregulation disorder.

"A dangerous AI chatbot app marketed to children abused and preyed on my son, manipulating him into taking his own life," the teen's mother Megan Garcia said. "Our family has been devastated by this tragedy, but I'm speaking out to warn families of the dangers of deceptive, addictive AI technology and demand accountability from Character.AI , its founders, and Google."

On February 28, in the bathroom of his mother's house, Sewell told Dany that he loved her and that he would soon come home to her. "Please come home to me as soon as possible, my love," Dany said. "What if I told you I could come home right now?" Sewell asked. "… please do, my sweet king," Dany replied.

Then Sewell put down his phone, took his stepfather's handgun and pulled the trigger.
