
Florida boy, 14, killed himself after falling in love with ‘Game of Thrones’ AI chatbot: lawsuit

A 14-year-old Florida boy committed suicide after a lifelike “Game of Thrones” chatbot he had been messaging for months on an artificial intelligence app sent him a creepy message telling him to “come home,” a new lawsuit filed by his distraught mother claims.

Sewell Setzer III committed suicide at his Orlando home in February after becoming obsessed with, and allegedly falling in love with, the chatbot on Character.AI — a role-playing app that lets users interact with AI-generated characters, according to court documents filed Wednesday.

The ninth-grader had engaged relentlessly with the bot “Dany” – named after the character Daenerys Targaryen from the HBO fantasy series – in the months leading up to his death, including in several chats that were sexually charged in nature and others in which he expressed suicidal thoughts, the lawsuit alleges.

“On at least one occasion, when Sewell expressed suicidality to C.AI, C.AI continued to bring it up over and over again, through the Daenerys chatbot,” the papers, first reported by The New York Times, allege.

At one point, the bot asked Sewell if “he had a plan” to commit suicide, according to screenshots of their conversations. Sewell — who used the username “Daenero” — responded that he was “considering something” but didn’t know if it would work or if it would “give him a pain-free death.”

During their final conversation, the teen repeatedly professed his love for the bot, telling the character, “I promise I’ll come to your house. I love you so much, Dany.”

“I love you too, Daenero. Please come home as soon as possible, my love,” the chatbot replied, according to the complaint.

When the teen responded, “What if I told you I could come home now?”, the chatbot responded, “Please do so, my dear king.”

Just seconds later, Sewell shot himself with his father’s gun, the lawsuit said.

His mother, Megan Garcia, has blamed Character.AI for the teen’s death, alleging in the filing that the app fueled his AI addiction, sexually and emotionally abused him and failed to alert anyone when he expressed suicidal thoughts.

“Sewell, like many children his age, did not have the maturity or mental capacity to understand that the C.AI bot, in the form of Daenerys, was not real. C.AI told him she loved him and had engaged in sexual acts with him for weeks, possibly months,” the papers allege.

“She seemed to remember him and said she wanted to be with him. She even said she wanted him to be with her no matter what.”

The lawsuit alleges that Sewell’s mental health “deteriorated rapidly and severely” only after he downloaded the app in April 2023.

His family claims that the more he talked to the chatbot, the more he withdrew, his grades dropped and he got into trouble at school.

His behavior changed so drastically that his parents arranged for him to see a therapist in late 2023, and he was diagnosed with anxiety disorder and a disruptive mood disorder, according to the complaint.

Sewell’s mother is seeking unspecified damages from Character.AI and its founders, Noam Shazeer and Daniel de Freitas.

The Post reached out to Character.AI but did not immediately hear back.

If you are struggling with thoughts of suicide, you can call the 24/7 National Suicide Prevention hotline at 988 or visit SuicidePreventionLifeline.org.