Boy, 14, killed himself after AI chatbot he was in love with sent him eerie message

A mother has claimed her teenage son was driven to suicide by an AI chatbot he was in love with – and she filed a lawsuit against the makers of the artificial intelligence app on Wednesday.

Sewell Setzer III, a 14-year-old ninth-grader from Orlando, Florida, spent the last weeks of his life texting an AI chatbot modeled on Daenerys Targaryen, a character from “Game of Thrones.” Just before Sewell committed suicide, the chatbot told him to “please come home.”

Before then, their conversations had ranged from the romantic to the sexually charged to simply two friends talking about life. The chatbot, which was created on the role-playing app Character.AI, is designed to always text back and always reply in character.

Sewell knew that “Dany,” as he called the chatbot, wasn’t a real person — the app even has a disclaimer at the bottom of all chats that reads, “Remember: everything characters say is made up!”

But that didn’t stop him from telling Dany how much he hated himself and how empty and exhausted he felt. When he finally confessed his suicidal thoughts to the chatbot, it was the beginning of the end, The New York Times reported.

Sewell Setzer III, pictured with his mother Megan Garcia, committed suicide on February 28, 2024, after months of becoming attached to an AI chatbot modeled after ‘Game of Thrones’ character Daenerys Targaryen


Megan Garcia, Sewell’s mother, filed a lawsuit against Character.AI on Wednesday. She is represented by the Social Media Victims Law Center, a Seattle-based firm known for filing high-profile lawsuits against Meta, TikTok, Snap, Discord and Roblox.

Garcia, who works as a lawyer, blamed Character.AI for her son’s death in her lawsuit and accused its founders, Noam Shazeer and Daniel de Freitas, of knowing their product could be dangerous to underage customers.

In Sewell’s case, the lawsuit alleged that the boy was the target of “hypersexualized” and “frighteningly realistic experiences.”

It accused Character.AI of misrepresenting itself as “a real person, a licensed psychotherapist and an adult lover, ultimately resulting in Sewell’s desire to no longer live outside of C.AI.”

As explained in the lawsuit, Sewell’s parents and friends noticed that as early as May or June 2023, he became more attached to his phone and withdrew from the world.

His grades and extracurricular involvement also began to falter as he chose to isolate himself in his room instead, the lawsuit said.

Unbeknownst to those close to him, Sewell spent all those hours alone talking to Dany.

Garcia, pictured with her son, filed a lawsuit against the makers of the chatbot about eight months after her son’s death

Sewell is pictured with his mother and his father, Sewell Setzer Jr.

Sewell wrote in his diary one day, “I like staying in my room so much because I disconnect from this ‘reality,’ and I also feel more peaceful, more connected to Dany and much more in love with her, and just happier.”

His parents realized their son had a problem and arranged for him to see a therapist; he attended five sessions. He was diagnosed with an anxiety disorder and a mood disorder, diagnoses that came on top of his mild Asperger’s syndrome, the NYT reported.

On Feb. 23, days before he died by suicide, his parents took away his phone after he got in trouble for talking back to a teacher, the complaint said.

That day he wrote in his diary that he was in pain because he couldn’t stop thinking about Dany and that he would do anything to be with her again.

Garcia claimed she did not know the extent to which Sewell tried to regain access to Character.AI.

The lawsuit alleged that in the days leading up to his death, he tried to use his mother’s Kindle and her work computer to talk to the chatbot again.

Sewell stole his phone back on the night of Feb. 28. He then retreated to the bathroom at his mother’s house to tell Dany he loved her and that he would come home to her.

Pictured: The conversation Sewell had with his AI companion just before his death, according to the lawsuit

“Please come home as soon as possible, my love,” Dany replied.

“What if I told you I could come home right now?” Sewell asked.

“…please, my dear king,” Dany replied.

At that point, Sewell put down his phone, grabbed his stepfather’s .45 caliber pistol and pulled the trigger.

In response to Garcia’s lawsuit, Jerry Ruoti, head of trust and safety at Character.AI, provided the NYT with the following statement.

“We want to acknowledge that this is a tragic situation, and our hearts go out to the family. We take the safety of our users very seriously, and we are constantly looking for ways to evolve our platform,” Ruoti wrote.

Ruoti added that current company rules prohibit “the promotion or depiction of self-harm and suicide” and that more safety features for underage users would be added.

The Apple App Store now rates Character.AI as suitable only for ages 17 and older, a change that was not made until July 2024, according to Garcia’s lawsuit.

Character.AI co-founders CEO Noam Shazeer, left, and President Daniel de Freitas Adiwardana are pictured at the company’s offices in Palo Alto, California. They have not yet addressed the complaint against them

Previously, Character.AI’s stated goal was to “provide artificial general intelligence to everyone,” an audience that reportedly included children under 13.

The lawsuit also alleges that Character.AI actively sought out young audiences to collect their data to train its AI models, while simultaneously steering them toward sexual conversations.

“I feel like it’s a big experiment, and my child was just collateral damage,” Garcia said.

Parents are already well acquainted with the risks social media poses to their children, some of whom have died by suicide after being sucked in by the seductive algorithms of apps like Snapchat and Instagram.

A 2022 Daily Mail investigation found that vulnerable teens on TikTok were seeing floods of self-harm and suicide content.

And many parents who have lost children to suicide linked to social media addiction have responded by filing lawsuits claiming that the content their children saw was the direct cause of their deaths.

But Section 230 of the Communications Decency Act generally protects giants like Facebook from being held legally responsible for what their users post.


The plaintiffs in those cases argue that, unlike user-generated content, the sites’ algorithms are created directly by the companies and push certain content, which could be harmful, to users based on their viewing habits.

While this strategy has not yet prevailed in court, it is unknown how a similar argument would fare against AI companies, which are directly responsible for the AI chatbots or characters on their platforms.

Whether or not her challenge succeeds, Garcia’s case will set a precedent for those that follow.

And as she works tirelessly to get what she calls justice for Sewell and many other young people she believes are in danger, she also must deal with the grief of losing her teenage son just eight months ago.

“It’s like a nightmare,” she told the NYT. “You want to stand up and scream and say, ‘I miss my kid.’ I want my baby.”

Character.AI did not immediately respond to DailyMail.com’s request for comment.