‘Promise to come home to you’: US teen commits suicide after falling in love with GOT AI chatbot
A 14-year-old from the US took his own life after falling in love with an AI chatbot; his last words to the Game of Thrones character Daenerys Targaryen, 'Dany', were that he would 'come home' to her.
Sewell Setzer shot himself with his stepfather's gun after spending time with "Dany." As Setzer's relationship with the chatbot intensified, he began to withdraw from the real world, neglecting his former interests and struggling academically, the Telegraph reported.
His parents have filed a lawsuit claiming that Character AI lured their son into intimate and sexual conversations, which ultimately led to his death.
In November, he saw a therapist – at his parents’ insistence – who diagnosed him with an anxiety disorder and a disruptive mood disorder. Even without knowing about Sewell’s “addiction” to Character AI, the therapist recommended he spend less time on social media, the lawsuit said.
The following February he got in trouble for talking back to a teacher and saying he wanted to be kicked out of school. Later that day, he wrote in his diary that he was "in pain" — he couldn't stop thinking about Daenerys, the Game of Thrones-themed chatbot he believed he had fallen in love with, the Independent reported.
In his final moments, Setzer typed a message to the chatbot, expressing his love and his intention to "come home" to "Dany": "I love you so much Dany. I will come home to you. I promise."
Megan Garcia, Setzer’s mother, accused Character AI of using her son as “collateral damage” in a “major experiment.” She claimed that her son had fallen victim to a company that lured users with sexual and intimate conversations.
The company’s founders have previously claimed that the platform could be useful for people struggling with loneliness or depression. However, in light of this tragedy, Character AI has stated that they will implement additional safety features for young users and reiterated their commitment to user safety.
The company's head of trust and safety, Jerry Ruoti, expressed his condolences to the family and stressed that Character AI prohibits content that promotes or depicts self-harm or suicide. Nevertheless, the incident raises concerns about the potential risks associated with AI chatbots and their impact on vulnerable people, especially minors.