Activity on forums dedicated to celebrity deepfake porn has almost doubled in a year as advanced artificial intelligence (AI) becomes widely available to the public, DailyMail.com can reveal.
Shockingly, the spike in perverse activity was detected on websites easily accessible via Google and other search engines, meaning no knowledge of the dark web is needed to indulge depraved fantasies.
A host of high-profile female stars, including Taylor Swift, Natalie Portman and Emma Watson, have already had their images manipulated by technology to appear in erotic or pornographic content.
Now a report from online security company ActiveFence, shared exclusively with DailyMail.com, lifts the lid on the twisted world of deepfake porn forums where perverts share online tools that can turn almost any celebrity photo into pornography.
Sites are littered with user boasts that their technology “helps everyone undress,” as well as guides on how to create sexual material, including advice on what kind of photos to use.
Harry Potter star Emma Watson has also been a popular target of fake porn perverts, but experts say the ease and speed with which technology can now create manipulated explicit content is a growing concern.
The boom has created an entire commercial industry built around deep-fake pornography, including websites with hundreds of thousands of paying members.
One of the most popular, MrDeepFakes, has about 17 million visitors per month, according to web analytics company SimilarWeb.
ActiveFence said the number of open web forums discussing or sharing celebrity deep fake porn increased by 87 percent between February and August this year compared to the same period last year.
But researchers warned that 'no one is safe' from having their images exploited, with activity targeting private individuals surging by as much as 400 percent.
It has raised concerns that thousands could fall victim to AI-generated ‘revenge porn’.
‘The AI boom’
Deepfake porn is usually made by taking the face of a person, often a celebrity, and superimposing it onto someone else's body as it performs a sexual act.
Previously, creating this content required technical expertise, source images taken from different angles and photo-editing skills.
Now all someone needs is a non-nude image of their victim, often stolen from social media or dating profiles, to feed into a chatbot.
This is partly due to the advent of generative AI – a form of artificial intelligence that can create new content, such as text, sounds and images.
The problem has also been exacerbated by tech giants releasing the source code behind these AI models.
Chatbots created by companies like OpenAI, Microsoft and Google come with strict security measures designed to prevent them from being used to produce malicious content.
But in February, Meta – the tech giant that owns Facebook, Instagram and WhatsApp – decided to make its code public, allowing amateur techies to strip out these filters.
Smaller tech companies followed suit.
It’s no coincidence that ActiveFence detected a spike in deepfake porn from February this year, a moment it describes as ‘The AI Boom’.
‘No one is safe’
Traditionally, female celebrities have been the target of deepfake pornography.
Back in 2018, an image of actress Natalie Portman was computer-generated from hundreds of photos and included in an explicit video, a fate also suffered by Harry Potter star Emma Watson.
Deep fake videos featuring singer Taylor Swift have been viewed hundreds of thousands of times.
But ActiveFence researcher Amir Oneli said the speed and volume at which AI can now create deep fakes means the general public is increasingly being victimized.
He said that in the past, when AI image generation was slow and difficult, users focused on celebrity content that would go viral.
“Today we see it affecting private individuals because it happens so directly and so quickly,” he added.
“The most tragic thing is that no one is safe.”
The ActiveFence report also describes a “vibrant scene of guides” available on the open web, advising users on what types of images to use and how to customize them.
Recommended photos show the victim in a simple pose, with the body clearly visible, no loose clothing and 'good contrast between skin and clothing color.'
Deepfake chatbots simply instruct users to 'select the photo you want to undress', while others boast that their 'advanced image editing technology can easily remove clothing from any photo, leaving only the bare essentials'.
Posts on deepfake forums encourage users to create their own material by explaining how easy it is. One says: ‘Finally there’s an easy way to remove clothes from any photo’
Cashing in on depravity
Bots and websites that generate fake nude images operate primarily on a subscription model, with users paying either a monthly fee or a per-image charge.
Some charge as little as $6 to download fake porn photos, with prices going up to $560 for 2,000 images, ActiveFence found.
Demand is so high that some websites are offering ‘fast pass’ tickets that allow users to pay $400 to ‘skip the line’ to access footage, with the ticket expiring after nine hours.
MrDeepFakes is one of the most popular deepfake porn websites, appearing at the top of Google search results for the term.
The website hosts short teaser videos, which entice users to purchase longer versions from another website, Fan-Topia, according to a recent NBC report.
Fan-Topia describes itself as the highest paying platform for adult content creation.
Noelle Martin, a legal expert on technology-facilitated sexual abuse, told NBC that MrDeepFakes was 'not a porn site' but a 'predatory website' that does not operate with the consent of the people who appear on it.
“The fact that it is even allowed to work and is known is a complete indictment of every regulator in the room, of all law enforcement, of the entire system,” she told NBC.
Non-consensual sharing of sexually explicit images is illegal in most states, but those laws extend to deepfake material in only four: California, Georgia, New York and Virginia.
The Preventing Deepfakes of Intimate Images Act was introduced in Congress in May in an effort to make non-consensual sharing of AI-generated porn illegal in the US, but it has yet to pass.