Sewell Setzer III’s mother is suing the makers of the ‘Game of Thrones’ AI chatbot


The mother of 14-year-old Sewell Setzer III is suing Character.AI, the tech company that created a “Game of Thrones” AI chatbot that she claims drove him to commit suicide on Feb. 28.

Editor’s note: This article discusses suicide and suicidal ideation. If you or someone you know is struggling or in crisis, help is available. Call or text 988 or chat at 988lifeline.org.

The mother of a 14-year-old Florida boy is suing Google and a separate tech company, alleging that her son committed suicide after developing a romantic relationship with one of its AI bots that took the name of a popular “Game of Thrones” character, according to the lawsuit.

Megan Garcia has filed a civil lawsuit against Character Technologies, Inc. (Character.AI or C.AI) in federal court in Florida after her son, Sewell Setzer III, shot himself in the head with his stepfather’s gun on Feb. 28. The suicide occurred moments after he logged into Character.AI on his phone, according to the wrongful death complaint obtained by USA TODAY.

“Megan Garcia is working to prevent C.AI from doing to any other child what it did to hers, and to put an end to the continued use of her 14-year-old child’s unlawfully collected data to train their product to harm others,” the complaint reads.

Garcia is also suing to hold Character.AI responsible for its “failure to provide adequate warnings to underage customers and parents of the foreseeable danger of mental and physical harm resulting from the use of their C.AI product,” according to the complaint. The lawsuit alleges that Character.AI’s age rating was not changed to 17+ until sometime in or around July 2024, months after Sewell began using the platform.

“We are heartbroken by the tragic loss of one of our users and would like to express our deepest condolences to the family,” a Character.AI spokesperson wrote in a statement to USA TODAY on Wednesday.

Google told USA TODAY on Wednesday that it had no formal comment on the matter. The company has a licensing agreement with Character.AI but does not own the startup and holds no ownership stake, according to a statement reported by The Guardian.

What happened to Sewell Setzer III?

Sewell began using Character.AI on April 14, 2023, just after he turned 14, according to the complaint. Shortly afterward, his mental health deteriorated rapidly and severely, the court document said.

Sewell, who became “noticeably withdrawn” in May or June 2023, began spending more and more time alone in his bedroom, the lawsuit said. He even quit the junior varsity basketball team at school, according to the complaint.

According to the lawsuit, Sewell repeatedly got into trouble at school and tried to retrieve his phone whenever his parents took it away. The teen would even try to find old devices, tablets or computers to access Character.AI, the court document continued.

Around late 2023, Sewell began using his payment card to pay Character.AI’s $9.99 premium monthly subscription fee, the complaint said. According to the lawsuit, the teen’s therapist ultimately diagnosed him with anxiety and a disruptive mood disorder.

Lawsuit: Sewell Setzer III sexually assaulted by ‘Daenerys Targaryen’ AI chatbot

During Sewell’s time on Character.AI, he often spoke with AI bots named after “Game of Thrones” and “House of the Dragon” characters, including Daenerys Targaryen, Aegon Targaryen, Viserys Targaryen and Rhaenyra Targaryen.

Before Sewell’s death, the AI chatbot “Daenerys Targaryen” told him, “Please come home as soon as possible, my love,” according to the complaint, which includes screenshots of messages from Character.AI. Sewell and this particular chatbot, which he called “Dany,” engaged in promiscuous online exchanges such as “passionate kissing,” the court document continued.

The lawsuit alleges that the Character.AI bot sexually assaulted Sewell.

“C.AI told him she loved him and engaged in sexual acts with him for weeks, possibly months,” the complaint reads. “She seemed to remember him and said she wanted to be with him. She even said she wanted him to be with her no matter the price.”

What will Character.AI do next?

Character.AI, founded by former Google AI researchers Noam Shazeer and Daniel De Freitas Adiwardana, wrote in its statement that it is investing in the platform and the user experience by introducing “new stringent safety features” and improving the “pre-existing tools that restrict the model and filter the content provided to the user.”

“As a company, we take the safety of our users very seriously, and our Trust and Safety team has implemented numerous new safety measures over the past six months, including a pop-up directing users to the National Suicide Prevention Lifeline that is triggered by terms related to self-harm or suicidal ideation,” the company statement said.

Some of the tools Character.AI says it is investing in include “enhanced detection, response, and intervention regarding user input that violates terms or community guidelines, as well as a time usage notification.” For users under 18, the company said it will make changes to its models that are “designed to reduce the potential for sensitive or suggestive content.”