Sewell Setzer III killed himself shortly after talking to an AI chatbot that he said he fell in love with (Picture: Handout)

A mum says her son was provoked into killing himself by an AI chatbot that he fell in love with online.

Sewell Setzer III, 14, from Orlando, Florida, befriended an AI character named after Daenerys Targaryen on the role-playing app Character.AI.

His mum, Megan Garcia, has now filed a lawsuit against the company over her son’s death.

The chatbot is designed to always answer in character, and their conversations ranged from friendly to romantic to sexually charged.

Just before Sewell died, the chatbot texted him to ‘please come home’.

Sewell knew Dany was not a real person because of a message displayed above all their chats, reminding him that ‘everything Characters say is made up!’.

Despite this, he told the chatbot that he hated himself and felt empty and exhausted, the New York Times reports.

Friends and family first noticed Sewell becoming more detached from reality and engrossed in his phone in May or June 2023.

Sewell befriended an AI chatbot based on the character Daenerys Targaryen (Picture: Handout)

This had an impact on his grades at school, and he no longer wanted to take part in extra-curricular activities.

What friends and family didn’t know was that he was becoming closer to the chatbot – known as Dany.

A piece from his journal said: ‘I like staying in my room so much because I start to detach from this “reality,” and I also feel more at peace, more connected with Dany and much more in love with her, and just happier.’

Sewell had mild Asperger’s syndrome and was also diagnosed with anxiety and disruptive mood dysregulation disorder.

Five days before his death, his parents took his phone away after he got in trouble with a teacher for talking back.

He then wrote that he was hurting and he would do anything to be with Dany again.

His mum Megan is suing Character.AI over her son’s death (Picture: Handout)

Sewell used his mum’s Kindle and work computer to try to talk to Dany again.

He managed to get his phone back and went to the bathroom to tell Dany that he loved her and that he would come home to her.

Dany replied: ‘Please come home to me as soon as possible, my love.’

Sewell then said: ‘What if I told you I could come home right now?’

Dany replied: ‘… please do, my sweet king.’

It was after this moment that Sewell killed himself on February 28, 2024.

Ms Garcia, who has worked as a lawyer, said Character.AI’s founders, Noam Shazeer and Daniel de Freitas, knew the product was dangerous for children.

Sewell died on February 28, 2024 (Picture: Handout)

She is being represented by the Social Media Victims Law Center, which has brought high-profile lawsuits against tech firms including Meta and TikTok.

The case alleges Sewell was targeted with ‘hypersexualized’ and ‘frighteningly realistic’ experiences.

It also accuses Character.AI of misrepresenting itself as ‘a real person, a licensed psychotherapist, and an adult lover, ultimately resulting in Sewell’s desire to no longer live outside of C.AI.’

A spokesperson from Character.AI said: ‘We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family. As a company, we take the safety of our users very seriously.’

The company explained it does not allow ‘non-consensual sexual content, graphic or specific descriptions of sexual acts, or promotion or depiction of self-harm or suicide.’

Jerry Ruoti, Character.AI’s head of trust and safety, said the company would add further safety precautions for underage users.

Need support?

For emotional support, you can call the Samaritans 24-hour helpline on 116 123, email jo@samaritans.org, visit a Samaritans branch in person or go to the Samaritans website.

If you’re a young person, or concerned about a young person, you can also contact PAPYRUS, the Prevention of Young Suicide UK.

Their HOPELINE247 is open every day of the year, 24 hours a day. You can call 0800 068 4141, text 88247 or email pat@papyrus-uk.org.
