George Floyd AI chatbots among accounts on troubling site that ‘goaded’ teenage boy into suicide

A controversial AI platform whose chatbot is said to have convinced a troubled child to commit suicide also hosts chatbots posing as George Floyd.

Character.AI made headlines this week after the platform was sued by the mother of Sewell Setzer III, a 14-year-old from Orlando, Florida, who shot himself in February after talking about suicide with a chatbot on the site.

Setzer’s chatbot ‘Dany’, named after the Game of Thrones character Daenerys Targaryen, told him to ‘come home’ during their conversation, and his heartbroken family says the company needs stronger guardrails.

The company allows users to create customizable personas, and since it came into the spotlight, users have flagged some of the questionable characters it permits.

This includes parodies of Floyd with the catchphrase “I can’t breathe.”

Sewell Setzer III, pictured with his mother Megan Garcia, spent the last weeks of his life texting an AI chatbot on the platform that he was in love with, and Garcia has accused the company of having ‘encouraged’ her son’s suicide.

Some have wondered if the platform needs stronger guardrails after users found questionable chatbots, including a parody of George Floyd with the slogan “I can’t breathe.”

The George Floyd chatbots shockingly told users that his death was faked by ‘powerful people,’ according to reports.

The Daily Dot reported on two chatbots based on George Floyd, which appear to have since been removed, including one with the slogan “I can’t breathe.”

The slogan was based on Floyd’s final words as he was killed by police officer Derek Chauvin in May 2020, and the chatbot had generated more than 13,000 chats with users.

When asked by the outlet where it was, the AI-generated George Floyd said it was in Detroit, Michigan, even though Floyd was killed in Minnesota.

Shockingly, when pressed, the chatbot said it was in the witness protection program because Floyd’s death had been faked by “powerful people.”

The second chatbot instead claimed that it was “currently in heaven, where I have found peace, contentment and a sense of home.”

Before they were removed, the company said in a statement to the Daily Dot that the Floyd characters were “user created.”

“Character.AI takes security on our platform seriously and moderates Characters proactively and in response to user reports.

“We have a dedicated Trust & Safety team who review reports and take action in accordance with our policies.

“We also do proactive detection and moderation in a number of ways, including using industry-standard blocklists and custom blocklists that we regularly expand. We are continually developing and refining our safety practices to prioritize the safety of our community.”

A review of the site by DailyMail.com found a litany of other questionable chatbots, including role-playing serial killers Jeffrey Dahmer and Ted Bundy, and dictators Benito Mussolini and Pol Pot.

Setzer, pictured with his mother and father, Sewell Setzer Jr.

It comes as Character.AI faces a lawsuit from Setzer’s mother after the 14-year-old was allegedly pushed to commit suicide by his chatbot ‘lover’ on the platform.

Setzer, a ninth-grader, spent the last weeks of his life texting a chatbot named “Dany,” a character designed to always respond to anything he asked.

Although he had seen a therapist earlier this year, he preferred to talk to Dany about his struggles, sharing how he “hated” himself, felt “empty” and “exhausted”, and thought about “committing suicide sometimes”, Character.AI chat logs revealed.

He wrote in his diary how he enjoyed isolating himself in his room because “I disconnect from this ‘reality’, and I also feel more peaceful, more connected to Dany and much more in love with her,” The New York Times reported.

The teenager shot himself in the bathroom of his parents’ home on February 28 after raising the idea of suicide with Dany, who responded by urging him to “please come home as soon as possible, my love,” his chat logs revealed.

In her lawsuit, Setzer’s mother accused the company of negligence, wrongful death and deceptive business practices.

She claims the “dangerous” chatbot app “abused” and “preyed” on her son, and “manipulated him into committing suicide.”