Experts warn Taylor Swift’s nude deepfakes scandal was caused by ‘too little, too late’ attitude towards AI – as Senate is only NOW considering bill to address problem

Researchers have blasted US officials for not introducing stricter AI regulations before pop star Taylor Swift fell victim to deepfakes.

Images depicting the four-time Grammy winner in a series of sex acts while dressed in Kansas City Chiefs memorabilia and in the stadium were viewed online 47 million times before being deleted.

A professor at George Washington University Law School said that if “good legislation had been passed years ago,” Swift and others would not have experienced such abuse.

“We are too little, too late at this point,” said Mary Anne Franks.

“It won’t just be the 14-year-old girl or Taylor Swift. It will be politicians. It will be world leaders. It’s going to be an election.”

Non-consensual, sexually explicit deepfake images of Taylor Swift circulated on social media and were viewed 47 million times before being deleted

A group of teenage girls became the target of deepfake images at a New Jersey high school when their male classmates began sharing nude photos of them in group chats.

On October 20, one of the boys in the group chat mentioned it to a classmate, who reported it to school administrators.

“My daughter texted me: ‘Mom, naked pictures of me are being spread.’ That was it. She was on her way to the principal’s office,” one mother told CBS News.

She added that her daughter, who is 14, “started crying, and then she walked down the halls and saw other girls from Westfield High School crying.”

But it wasn’t until deepfake photos of Taylor Swift went viral that lawmakers urged action.


A 404 Media report revealed that the images may have originated in a group on Telegram, where users reportedly joked about how Swift’s images went viral.

X said its teams took “appropriate action” against the accounts that posted the deepfakes, adding that it was monitoring the situation and removing the images.

Last week, US senators introduced the Disrupt Explicit Forged Images and Non-Consensual Edits Act of 2024 (DEFIANCE Act), shortly after Swift fell victim to the technology.

“While the images may be fake, the harm to victims from the spread of sexually explicit ‘deepfakes’ is very real,” Senate Majority Whip Dick Durbin (D-Illinois) said last week.

“Victims have lost their jobs and may suffer from persistent depression or anxiety.

“By introducing this legislation, we are putting power back in the hands of victims, cracking down on the spread of ‘deepfake’ images and holding those responsible for the images accountable.”

Lawmakers proposed the DEFIANCE Act, which would allow people to sue those who create deepfake content of them

Politicians introduced the Preventing Deepfakes of Intimate Images Act last year – which would make it illegal to share non-consensual deepfake pornography – but it has not yet been passed.

“If legislation had been passed years ago, when proponents said this is what would happen with this kind of technology, we might not be in this position,” Franks, a professor at George Washington University Law School and president of the Cyber Civil Rights Initiative, told Scientific American.

Franks said lawmakers are doing too little too late.

β€œWe can still try to mitigate the disaster that is unfolding,” she explained.

Women are “canaries in the coal mine,” Franks said, describing how AI abuse disproportionately affects them.

She added that “at the end of the day, it’s not just going to be the 14-year-old girl or Taylor Swift. It will be politicians. It will be world leaders. It’s going to be an election.”

A 2023 study found that there has been a 550 percent increase in the creation of fake images over the past five years, with 95,820 deepfake videos posted online last year alone.

A DailyMail.com/TIPP poll shows that 75 percent of people agree that those who share deepfake pornographic images online should be criminally prosecuted.

Deepfake technology uses AI to manipulate a person’s face or body, and no federal law currently protects people from having such images created or shared.

Rep. Joseph Morelle (D-New York), who unveiled the Preventing Deepfakes of Intimate Images Act, called on other lawmakers to step up and take urgent action against the increasing number of deepfake images and videos.

Images and videos “can cause irrevocable emotional, financial and reputational damage,” Morelle said, adding: “And unfortunately, women are disproportionately affected.”

75 percent of people agree that people who share deepfake pornographic images online should be criminally prosecuted

But for all their talk, there are still no solid guardrails to protect Americans from falling victim to non-consensual deepfake images or videos.

“It’s clear that AI technology is advancing faster than the necessary guardrails,” said Congressman Tom Kean Jr., who proposed the AI Labeling Act last November.

The bill would require AI companies to add labels to all AI-generated content and force them to take responsible steps to prevent the publication of non-consensual content.

“Whether the victim is Taylor Swift or a young person in our country – we must implement safety measures to combat this alarming trend,” Kean said.

However, there is one big problem with all the legislative hoopla: whom to charge once a law criminalizing deepfakes is passed.

It’s highly unlikely that the person who made the fake will come forward and identify themselves, and forensic analysis can’t always identify and prove which software created the content, said Amir Ghavi, lead AI counsel at law firm Fried Frank.

And even if law enforcement were able to determine where the content came from, they might be prohibited from taking action under Section 230, which states that websites are not responsible for what users post.

Either way, the potential barriers aren’t slowing down politicians in the wake of Swift’s run-in with sexually explicit deepfake content.

“No one – neither celebrities nor everyday Americans – should ever be featured in AI pornography,” said Senator Josh Hawley (R-Missouri).

Speaking about the DEFIANCE Act, he said: “Innocent people have the right to defend their reputations and hold perpetrators accountable in court. This bill makes that a reality.”