REVEALED: Deepfake porn creators and publishers got away with producing 415,000 images last year – and making the warped pictures remains LEGAL in 44 states

More than 415,000 pornographic deepfake images were uploaded to the internet last year – but only six states have banned the creation and distribution of such content.

The law's failure to keep pace with technology means the perpetrators who create false sexual images, and the websites that publish them, go largely unpunished.

Meanwhile, victims who are pictured without consent have their lives turned upside down and must fight to have the degrading content removed from the internet, which can be virtually impossible.

The issue of the rise of deepfake porn received renewed attention this week after AI-generated explicit images of Taylor Swift were published by a porn website and shared widely on X. The creation and distribution of the content was likened to a form of sexual assault.

But it’s not just celebrities who are depicted in these fake images, which can be created easily using online AI tools that charge a small fee. Mothers and schoolgirls have also fallen victim to this cruel trend.

So how can perverts with only limited technical skills create and share this content with impunity?

The answer largely lies in the lack of laws to prosecute those who create such content.

There is currently no federal law against this behavior, and only six states – New York, Minnesota, Texas, Hawaii, Virginia and Georgia – have passed legislation criminalizing it.

Texas passed a law in September 2023 that makes it a crime to create or share deepfake images without consent that “depict the person with the person’s intimate parts visible or engaged in sexual conduct.”

The offense is a Class A misdemeanor and the penalties include up to one year in jail and fines up to $4,000.

In Minnesota, this crime is punishable by up to three years in prison and fines of up to $5,000.

Several of these laws built on earlier legislation that banned the use of deepfakes to influence elections, for example by creating fake images or videos depicting a politician or government official.

A handful of other states, including California and Illinois, have not criminalized the practice but instead allow deepfake victims to sue the perpetrators. Critics say this does not go far enough, because in many cases the creator is unknown.

At the federal level, Joe Biden signed an executive order in October calling for a ban on the use of generative AI to create child abuse images or non-consensual “intimate images” of real people. But this was purely symbolic and did not create a means to punish creators.

The finding that 415,000 deepfake images were posted online last year was made by Genevieve Oh, a researcher who analyzed the top ten websites hosting such content.

Oh also found that 143,000 deepfake videos were uploaded in 2023 – more than in the previous six years combined. The videos, hosted across 40 different websites, were viewed more than 4.2 billion times.

Outside of the states that have criminalized the behavior, victims and prosecutors must rely on existing laws to charge offenders.

These include laws surrounding cyberbullying, extortion and harassment. Victims who are being blackmailed or repeatedly abused may try to use these laws against perpetrators who use deepfake images as a weapon.

But those laws don’t ban the basic act of creating a hyper-realistic, explicit image of a person and then sharing it with the world without their consent.

A 14-year-old New Jersey girl who was depicted in a pornographic deepfake image created by one of her male classmates is now leading a campaign to get a federal law passed.

Francesca Mani and her mother, Dorota Mani, recently met with members of Congress on Capitol Hill to push for laws against perpetrators.

Francesca, a student at Westfield High School, and several of her classmates were pictured in the images that surfaced in October.

The boy who created the images was suspended but returned to school just days later. A report was also made to the police, but no action is believed to have been taken.

Francesca told NewsNation: ‘What happened to me was not okay. On October 20, me and a few other classmates had AI nudes made of us by other classmates.

‘At first I felt helpless and then I got angry about the lack of legislation and the lack of AI school policies.

‘But now I’ve been able to talk to people in Congress today and I feel super strong because I know I’m helping to make a change.’

Her mother added that current legislation is “not enough” because “AI is not mentioned” in existing laws at the federal level.