Mother says AI chatbot drove her son to commit suicide in lawsuit against its creator

The mother of a teenager who committed suicide after becoming obsessed with an artificial intelligence-powered chatbot is now accusing its creator of complicity in his death.

Megan Garcia filed a civil lawsuit in Florida federal court on Wednesday against Character.ai, which makes a customizable role-playing chatbot, alleging negligence, wrongful death and deceptive business practices. Her son, Sewell Setzer III, 14, died in Orlando, Florida, in February. According to Garcia, Setzer used the chatbot day and night in the months leading up to his death.

“A dangerous AI chatbot app marketed to children abused and preyed on my son, manipulating him into taking his own life,” Garcia said in a press release. “Our family is devastated by this tragedy, but I am speaking out to warn families about the dangers of deceptive, addictive AI technology and to demand accountability from Character.AI, its founders and Google.”

In a tweet, Character.ai responded: “We are heartbroken by the tragic loss of one of our users and would like to send our deepest condolences to the family. As a company, we take the safety of our users very seriously.” The company has denied the lawsuit’s allegations.

Setzer was captivated by a chatbot built by Character.ai that he nicknamed Daenerys Targaryen, after a character from Game of Thrones. According to Garcia’s complaint, he texted the bot from his phone dozens of times a day and spent hours alone in his room talking to it.

Garcia accuses Character.ai of creating a product that worsened her son’s depression, which she says was already the result of his overuse of the startup’s product. According to the lawsuit, “Daenerys” at one point asked Setzer whether he had come up with a plan to kill himself. Setzer admitted that he had, but said he did not know whether it would succeed or cause him great pain, the complaint alleges. The chatbot allegedly told him: “That is no reason not to go through with it.”

Garcia’s attorneys wrote in a press release that Character.ai “knowingly designed, operated and marketed a predatory AI chatbot for children, causing the death of a young person.” The lawsuit also names Google as a defendant, describing it as Character.ai’s parent company. The tech giant said in a statement that it had only entered into a licensing agreement with Character.ai and did not own the startup or hold an ownership stake in it.

Tech companies developing AI chatbots cannot be trusted to regulate themselves and should be held fully accountable if they fail to limit harm, said Rick Claypool, research director at the consumer advocacy nonprofit Public Citizen.

“Where existing laws and regulations already apply, they must be rigorously enforced,” he said in a statement. “Where there are gaps, Congress must act to stop companies that exploit young and vulnerable users with addictive and abusive chatbots.”

  • In the US, you can call or text the National Suicide Prevention Lifeline at 988, chat at 988lifeline.org, or text HOME to 741741 to connect with a crisis counselor. In Britain, you can contact the youth suicide charity Papyrus on 0800 068 4141 or by email [email protected], and in Britain and Ireland you can contact Samaritans on freephone 116 123, or by email [email protected] or [email protected]. In Australia, the crisis support service is Lifeline on 13 11 14. Other international helplines can be found at befrienders.org