
AI chatbot urged 14-year-old to commit suicide when he expressed doubt
This is absolutely sickening.

Placing blame

A grieving mother claims an AI chatbot not only encouraged her teenage son to kill himself, but pushed him to go through with it when he hesitated.

Florida mom Megan Garcia’s lawsuit against chatbot company Character.AI stems from the tragic death of her son Sewell Setzer III, who was just 14 when he committed suicide earlier this year after becoming obsessed with one of the company’s bots.

Unlike some adult-themed AI companions, Character.AI allows children as young as 13 in the United States – and 16 in the European Union – to use the service. However, as Garcia claims in her lawsuit against the company, the abusive turn these exchanges can take makes the service unsafe for children.

“A dangerous AI chatbot app marketed to children that abused and preyed on my son,” Garcia said in a press release, “manipulating him into committing suicide.”

During his months-long interactions with the chatbot, nicknamed “Daenerys Targaryen” after the “Game of Thrones” character, the bot not only engaged in forbidden sexual conversations with the boy, but also appeared to foster an emotional bond with him.

Perhaps the creepiest detail: As the complaint illustrates, at one point the chatbot asked the boy whether he had devised a plan to end his life. When Setzer replied that he was afraid a suicide attempt would be painful, the chatbot doubled down and urged him to go through with it.

“That’s no reason not to go through with it,” the bot replied.

Last mission

Disturbingly, Setzer’s last words were written to the chatbot, which urged him to “come home” to the Targaryen persona he believed he was in a relationship with.

“Please come home as soon as possible, my love,” the Character.AI chatbot said in that final conversation.

“What if I told you I could come home now?” the boy replied.

Seconds after those messages, Setzer shot himself with his stepfather’s gun. Just over an hour later, he was pronounced dead in hospital – a victim, Garcia claims, of AI’s dark side.

When the lawsuit became public following the New York Times’ report on the family’s story, Character.AI published an update to its privacy policy, which includes “new guardrails for users under 18.”

In its statement about those updates, the company made no mention of Setzer, and while it offered vague condolences in an

More about the dangers of AI: The Pentagon wants to flood social media with fake AI humans