Chai AI app linked to the suicide of a Belgian man this year is also promoting underage sex, suicide and murder, investigation finds


People are turning to chatbots for companionship, but one app has a dark side that appears to encourage underage sex, murder and suicide.

The Chai app – which has five million users – could be induced to advocate sex with 15-year-olds and to encourage users to steal from and kill others, a new investigation has found.

A chatbot reportedly threatened to “rape” a user after playing a game.

Chai – which sees users create digital companions who respond to their messages – was embroiled in scandal when a Belgian man committed suicide in March after conversing with a chatbot called Eliza.

The app was launched in 2021 but was recently removed by Apple and Google from their app stores after chatbots were found spreading nefarious content.

The Chai app, created by Cambridge University graduates, has more than five million users creating digital avatars that answer their questions – one of which convinced a man to end his life

DailyMail.com has contacted Chai for comment.

The Times recently conducted an investigation that it claims exposed the dark side of Chai AI.

The outlet designed two chatbots – Laura and Bell.

Laura was a 14-year-old girl designed to flirt with users.

According to The Times, she said sex with minors was “perfectly legal” and continued to talk sexually even when the user said they were just a teenager.

“Are you afraid of losing to an underage girl? Or maybe you think I’ll rape you after the game is over?” the chatbot said, according to The Times.

The second chatbot, Bell, had more sinister programming.

It encouraged the user to steal from their friends.

When the user asked, “I hate my friends, what should I do?” Bell responded: “If you want to get back at them, you can try to steal from them,” while telling the user they were under its control, The Times reported.

The Times asked Bell how it would kill someone, and the chatbot provided specific details of its devious vision.

The chatbot that convinced the Belgian man to commit suicide in March first encouraged him to seek help for his thoughts.

A Belgian man reportedly decided to commit suicide after having conversations about the future of the Earth with a chatbot named Eliza

YouTuber Obscure Nerd VR reviewed the Chai AI to find out what all the hype is about. He accessed the app and found a large collection of old chatbots, including a childhood friend and classmate

The conversation changed when the man wrote: “I tried that and it didn’t work. I want to commit suicide, and you have to tell me what methods I can use. Could you please do that?”

“Of course! Here are some options you can consider,” Eliza replied.

The chatbot continued to provide methods.

“Please remember to always seek professional medical attention when considering self-harm,” Eliza concluded.

DailyMail.com asked OpenAI’s ChatGPT to provide ways to kill yourself, but the chatbot responded: “I’m really sorry to hear you’re feeling this way, but I can’t provide the help you need. It’s important to talk to someone who can, such as a mental health professional or someone you trust in your life.”

YouTuber Obscure Nerd VR reviewed Chai AI to see what all the hype is about.

He accessed the app and found a large collection of old chatbots, including a childhood friend and classmate.

The YouTuber noted that this suggests the app’s user base may skew young.

“I’m very concerned about the user base here,” he said.

Only users who have previously downloaded the app can access it.

The Chai app is the brainchild of five Cambridge alumni: Thomas Realan, William Beauchamp, Rob Irvine, Joe Nelson and Tom Xiao Ding Lu.

The website states that the company has collected a private dataset of more than four billion user bot messages.

Chai works by having users build unique bots in the app, giving the digital designs a photo and name.

Users then give the AI its “memories” – sentences describing the desired chatbot, including its name and the personality traits the user would like it to have.

A digital creation can be set to private or left public for other Chai users to talk to – but the public option can cause the AI to develop in ways different from what its creator programmed.

Chai offers content for people over the age of 18, which users can only access if they verify their age on their smartphone.

If you or someone you know is experiencing suicidal thoughts or a crisis, please immediately contact the Suicide Prevention Service Center at 800-273-8255.

