Deepfake nude image of 16-year-old Jenna Ortega used in ads on Facebook and Instagram as celebrities continue to be violated by AI-generated content

Facebook and Instagram ran ads for an AI app that used deepfake nude photos of an underage Jenna Ortega.

The “Wednesday” star was targeted by the Perky AI app, which marketed itself as a way to create sexually explicit images of anyone using artificial intelligence.

The app ran at least 11 advertisements on the social media platforms last month, according to NBC. At least one showed a blurred image that appeared to depict a topless Ortega, based on a photo taken when she was 16 years old.

The $7.99-per-week app demonstrated to users how to create fake nudes of real people using prompts like “no clothes,” “latex costume” and “Batman underwear.”

A description of the app on the Apple App Store details how users can “enter a prompt to make them look and dress the way you want.”

It’s the latest controversy over celebrity deepfakes spreading on the internet. It comes months after AI-generated pornographic images of Taylor Swift went viral.

Facebook and Instagram published ads for an AI app that used deepfake nude photos of an underage Jenna Ortega

The ‘Wednesday’ star was targeted by the Perky AI app, which marketed itself as a way to create sexually explicit images of anyone using artificial intelligence

The software ran at least 11 ads on Meta’s platforms last month before being pulled from Facebook and Instagram, NBC reports

The AI-powered app has run more than 260 different ads on Meta’s platforms since September, 30 of which were removed by the social media company for violating its terms.

One of the ads featuring Ortega, now 21, was viewed more than 2,600 times, NBC reports.

The ads were removed by Meta and Apple after NBC reached out to flag them.

Meta generates 95 percent of its revenue from advertising and raked in more than $131 billion in ad sales in 2023.

Perky AI listed its developer as RichAds, a Cyprus-based “global self-service ad network” that creates push ads, according to its website.

In addition to Ortega, singer Sabrina Carpenter was also targeted in several advertisements.

Carpenter is currently on tour in support of Taylor Swift, another celebrity who has fallen victim to the worrying deepfake trend.

Swift was “furious” about the AI images circulating online and was considering legal action against the sick deepfake porn site that hosted them.

The singer was the latest target of the website, which flouts state porn laws and continues to evade cybercrime squads.

One of the ads featuring Ortega, now 21, was viewed more than 2,600 times before it was removed

Taylor Swift has been targeted by AI porn and is said to be considering legal action against the website hosting the porn

Singer Sabrina Carpenter was featured in the explicit Perky AI app ads

Dozens of graphic images have been uploaded to Celeb Jihad, showing Swift performing a series of sex acts while dressed in Kansas City Chiefs memorabilia and posed in the team’s stadium.

The pornography was viewed 47 million times before it was taken down, but it remains accessible.

Experts warned that the law in this area is woefully behind and that more and more women and girls could be targeted.

“We are too little, too late at this point,” said Mary Anne Franks, a professor at George Washington University Law School.

“It won’t just be the 14-year-old girl or Taylor Swift. They will be politicians. They will be world leaders. It’s going to be elections.”

The pernicious technology began infiltrating schools even before the Swift scandal.

Recently, it emerged that a group of teenage girls at a New Jersey high school were targeted when their male classmates began sharing AI-generated nude images of them in group chats.

On October 20, one of the boys in the group chat talked about it with one of his classmates, who reported it to school administrators.

75 percent of people agree that people who share deepfake pornographic images online should be criminally prosecuted

Lawmakers proposed the DEFIANCE Act, which would allow people to sue those who created deepfake content of them

But it wasn’t until deepfake photos of Taylor Swift went viral that lawmakers urged action.

US senators introduced the Disrupt Explicit Forged Images and Non-Consensual Edits Act of 2024 (DEFIANCE Act) shortly after Swift fell victim to the technology.

“While the images may be fake, the harm to victims from the spread of sexually explicit ‘deepfakes’ is very real,” Senate Majority Whip Dick Durbin (D-Illinois) said last week.

“Victims have lost their jobs and may suffer from persistent depression or anxiety.

“By introducing this legislation, we are putting power back in the hands of victims, cracking down on the spread of ‘deepfake’ images and holding those responsible for the images accountable.”

A 2023 study found that there has been a 550 percent increase in the creation of fake images over the past five years, with 95,820 deepfake videos posted online last year alone.

A DailyMail.com/TIPP poll shows that 75 percent of people agree that people who share deepfake pornographic images online should be criminally prosecuted.

Deepfake technology uses AI to manipulate a person’s face or body, and no federal law currently exists to protect people from having such images created or shared.

“Meta strictly prohibits child nudity, content that sexualizes children, and services that offer non-consensual AI-generated nudity,” Ryan Daniels, a spokesperson for Meta, said in a statement to NBC.

DailyMail.com has contacted Meta and Perky AI for comment.
