Mom of girl, 14, who was victim of deepfake porn at school demands sharing it be made a federal crime: Joins battle in Congress to regulate AI after Taylor Swift ‘nudes’ spread online

A concerned mother whose 14-year-old daughter was, like Taylor Swift, the target of fake nude photos is working with lawmakers from across the political spectrum to demand Congress enact AI regulations.

“It’s clear that AI technology is advancing faster than the necessary guardrails,” said New Jersey Republican Rep. Tom Kean.

“Whether the victim is Taylor Swift or a young person in our country – we must take precautions to combat this alarming trend,” he said as he called on Congress to pass his AI regulation bill.

The singer is the latest target of the website Celeb Jihad, which flouts state porn laws and continues to evade cybercrime squads.

Celeb Jihad is believed to be the origin of dozens of recent explicit images depicting Swift at a Kansas City Chiefs game. The images, created using AI software, were shared by the website on January 15 under the headline: ‘Taylor Swift Chiefs Sex Scandal Caught On Camera’.

Similarly, male students at Westfield High School in New Jersey were caught sharing explicit AI-generated images of their female classmates in November.

According to 14-year-old Francesca Mani, one of the victims of the fake images, and her mother Dorota, the boy involved was back at school after a two-day suspension.

They filed a complaint with the police, but with no laws on the books to regulate AI-generated imagery, the complaint was quickly closed.

“There was very little to no accountability,” Dorota Mani told DailyMail.com. “There is no AI legislation, and the school has not updated its cyber-harassment policy to address AI. It was like a broken record: ‘advice is available.’ That doesn’t help anything.”

A slew of viral fake images claiming to be nude photos of Taylor Swift have sparked new calls from Congress to enact AI regulations

Francesca Mani, 14, (right) with her mother Dorota (left). Francesca, who attends Westfield High School, said the presence of the boy who created explicit fake images of her makes her “very uncomfortable and scared.” She discovered that fake nude photos of herself had been distributed to students in her class on Snapchat

“If that’s not negligence, I really don’t know what is,” she said. The mother-daughter duo have become a public face of AI regulation advocacy and will head to the White House in February to make the case.

“Everyone asks how we feel. No one asks how the boys feel now that they have faced so little accountability,” the mother continued.

“You know, how do the payment platforms that facilitate these transactions, like AMEX and PayPal and Chase and Visa, sleep at night knowing that they’re allowing them? No one wonders how Google, Amazon and Microsoft sleep at night knowing they host the content.”

Swift has been a regular at Chiefs games since she went public with her romance with star player Travis Kelce.

Non-consensual deepfake pornography is illegal in Texas, Minnesota, New York, Virginia, Hawaii and Georgia. In Illinois and California, victims can sue the creators of such pornography for defamation. But no federal law prohibits it.

“Intimate deepfake images like those targeting Taylor Swift are disturbing, and unfortunately they are becoming increasingly widespread on the internet. I am shocked that this type of sexual exploitation is not a federal crime,” Rep. Joe Morelle, D-N.Y., said in a statement to DailyMail.com.

More than 90 percent of such fake images, known as ‘deepfakes’, are pornographic, according to image detection company Sensity AI.

A bipartisan pair of lawmakers are pushing their bill to curb ‘deepfake’ AI-generated images after explicit fake images of Taylor Swift went viral

Earlier this week, a wave of hoax robocalls impersonating President Joe Biden ahead of the New Hampshire primary urged recipients not to vote, telling them it was “important that you save your vote for the November elections.”

Meanwhile, the Westfield High incident prompted area lawmakers to demand a ban on the creation of such images.

“Try to imagine the horror of receiving intimate images that look exactly like you — or your daughter, or your wife, or your sister — and being unable to prove they are fake,” Morelle said. “Deepfake pornography is sexual exploitation, it’s offensive, and I’m surprised it’s not already a federal crime.”

Kean and Morelle have introduced a bipartisan bill that would require AI generators to “display disclosure in a prominent manner to clearly identify content generated by AI” and would create working groups to establish best practices for identifying AI-generated content and making those findings public.

The Swift scandal is the latest of many involving Celeb Jihad, which was created by its anonymous founder in 2008.

Amazingly, the site claims that the content is ‘satire’ and that it is ‘not a pornographic website’.

Swift’s legal team first went after Celeb Jihad in 2011, after it published a doctored photo showing the singer topless. The photo appeared with the caption “Taylor Swift Topless Private Pic Leaked?”

At the time, her lawyers threatened to file a trademark infringement suit, accusing the site of spreading “fake pornographic images” and “false news.” The website appears to have published hundreds, possibly thousands, more doctored images of Swift since its inception.

Celeb Jihad was also involved in several large-scale leaks in 2017 of private photos hacked from celebrities’ cellphones and iCloud accounts.

The website was one of several that published illegally obtained photos of celebrities including Miley Cyrus and Tiger Woods and his ex-girlfriend Lindsey Vonn. Vonn said the leak was an “outrageous and despicable invasion of privacy.”
