AI-powered ‘Nudify’ apps that digitally undress fully-clothed teenage girls are soaring in popularity

Tens of millions of people use nudify apps powered by artificial intelligence, according to a new analysis that shows the dark side of the technology.

Nudify AI sites were visited by more than 24 million people in September. These sites digitally modify photos, especially of women, to make the subjects appear naked using deep learning algorithms.

These algorithms are trained on existing images of women, allowing them to synthesize realistic images of naked body parts even when the person in the original photo is fully clothed.

Spam advertisements across major platforms are also driving people to these sites and apps, with links advertising them increasing by more than 2,000 percent since the start of 2023.

The promotion of nudify apps is particularly prevalent on social media, including Reddit and Google's YouTube.

AI “nudify” apps are becoming increasingly popular, allowing people to “undress” women without their consent

A Telegram spokesperson told DailyMail.com: “Since its inception, Telegram has actively moderated harmful content on its platform, including non-consensual pornography.

“Telegram moderators actively monitor public parts of the platform and accept user reports in order to remove content that violates our Terms of Service.”

DailyMail.com has contacted X and YouTube for comment.

Female students at a New Jersey high school were targeted last month when AI-generated nude photos of them were circulated around the school, prompting a mother and her 14-year-old daughter to call for better protections against non-consensual intimate imagery (NCII).

A similar incident occurred at a high school in Seattle, Washington, earlier this year, where a boy allegedly used AI deepfake apps to create nude images of female students.

In September, more than 20 girls fell victim to fake nude photos created with the AI app Clothoff, which advertises that users can “undress girls for free.”

The report was produced by Graphika, a social network analysis company, which said it identified the key tactics, techniques and procedures used by AI NCII service providers in order to understand how AI-generated nudity sites and apps operate and monetize their activities.

“We assess that the increasing importance and accessibility of these services will likely lead to more instances of online harm, such as the creation and dissemination of non-consensual nude images, targeted harassment campaigns, sexual blackmail, and the generation of child sexual abuse material,” the researchers said in the report.

Graphika found that the apps operate on a so-called “freemium” model, offering a small number of features for free while locking enhanced features behind a paywall.

To access those additional features, users are often required to purchase “credits” or “tokens,” with prices ranging from $1.99 per credit to $299.

The report also revealed that many advertisements for NCII apps and websites are explicit about what they offer, stating that they provide “undressing” services or posting photos of people they claim to have “undressed” as proof.

Other ads are less direct, hiding behind claims of being an “AI powered technical service” or a “web3 photo gallery” while still including key NCII-related terms in their profiles and posts.

Teenage girls in southern Spain have been targeted by people using the Clothoff app

YouTube tutorial videos that tell viewers exactly how to use “nudify” apps, or that recommend which ones to use, still surface on the platform under a quick search for “AI nudity app.”

While many of the videos warn about the circulating apps, others carry titles such as “Make any photo nude with this AI” or “Generate AI images NSFW” (Not Safe for Work).

The Federal Trade Commission (FTC) filed a comment with the US Copyright Office on Thursday, addressing the need to monitor the impact of generative AI in order to protect consumers.

“The way companies are developing and releasing generative AI tools and other AI products . . . raises concerns about potential harm to consumers, workers, and small businesses,” the comment said.

“The FTC has been exploring the risks associated with AI use, including violations of consumers' privacy, automation of discrimination and bias, and turbocharging of deceptive practices, imposter schemes, and other types of fraud,” the agency added.

There are currently no federal laws prohibiting deepfake pornography. While President Joe Biden issued an executive order on AI that provides guidelines for monitoring and detecting AI-generated content, there is no national law banning such material outright.

Psychotherapist Lisa Sanfilippo, whose expertise includes sexual trauma, said that creating fake nude images is a “huge invasion” of people's privacy and can cause severe trauma to the victim.

“Seeing photos of yourself — or photos faked to look like you — in acts that you may find reprehensible or frightening, or that are meant only for your private life, can be majorly destabilizing, even traumatizing,” she said.

“There's no ability to give consent there.”

“It is abuse when someone takes something from another person that was not freely given to them,” she added.

DailyMail.com has reached out to the FTC and Reddit for comment.
