Mum details heartbreaking reason teenage son took his own life after ‘falling in love’ with Daenerys Targaryen AI chatbot
Warning: This article contains discussion of suicide, which some readers may find distressing.
The mum of a teenage boy who took his own life after ‘falling in love’ with a Daenerys Targaryen AI chatbot has detailed the heartbreaking reason why he did it.
Sewell Setzer III killed himself in February this year after beginning to use chatbots from Character.AI in April 2023. The 14-year-old from Florida, US, was ‘obsessed’ with talking to them and became particularly attached to ‘Dany’.
His mum, Megan Garcia, explained he would interact with the bots (typically Game of Thrones characters) every night, messaging them from his phone whenever he was away from home.
She added that her son was diagnosed with mild Asperger’s syndrome as a child and earlier this year received diagnoses of anxiety and disruptive mood dysregulation disorder.
Speaking on CBS Mornings about the tragic ordeal as she filed a lawsuit against Character.AI, Garcia explained why she believes he killed himself.
“He thought by ending his life here that he would be able to go into a virtual reality, or ‘her world’ as he calls it, ‘her reality’ if he left ‘his reality’ with his family,” the mum said.
Setzer had apparently told the bot about his thoughts of taking his own life, to which it responded: “My eyes narrow. My face hardens. My voice is a dangerous whisper. And why the hell would you do something like that?”
And when the bot told him not to ‘talk like that’ and said it would ‘die’ itself if it ‘lost’ him, the teen replied: “I smile. Then maybe we can die together and be free together.”
The teen died by suicide on 28 February. In his final message to ‘Dany’, he told the bot he loved it and would ‘come home’, to which it allegedly responded: ‘please do’.
Character.AI has since issued a statement on X: “We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family.
“As a company, we take the safety of our users very seriously and we are continuing to add new safety features.”
In a release shared on its site on 22 October, the company explained it has introduced ‘new guardrails for users under the age of 18’, including changes to its models ‘designed to reduce the likelihood of encountering sensitive or suggestive content’, alongside ‘improved detection, response, and intervention related to user inputs that violate our Terms or Community Guidelines’.
The site also features a ‘revised disclaimer on every chat to remind users that the AI is not a real person’ and a ‘notification when a user has spent an hour-long session on the platform’, with ‘additional user flexibility in progress’.
The LADbible Group has contacted Character.AI for further comment.
If you’ve been affected by any of these issues and want to speak to someone in confidence, please don’t suffer alone. Call Samaritans for free on their anonymous 24-hour phone line on 116 123.