Congress is pushing a bill that would let Americans sue if fake pornographic images of them are published, following Taylor Swift’s deepfake scandal

  • The DEFIANCE Act, introduced by a group of four senators, would allow victims of deepfake images and videos to take legal action against those responsible
  • The White House has called on Congress to take legislative action to combat the increased spread of explicit fake images
  • This is not the first time Taylor Swift has considered legal action against the website where the fake porn images originated

A group of lawmakers is stepping in to go after Taylor Swift’s deepfake perpetrators with a bill that would allow Americans to sue if fake porn images of them are published.

Pop star Taylor Swift became the latest target of non-consensual deepfakes after artificial intelligence generated sexually explicit images of her flooded the internet this week.

The dozens of graphic images showed Swift performing a series of sex acts while dressed in Kansas City Chiefs memorabilia; she has become a regular at the team’s games in support of her boyfriend Travis Kelce.

Swift is now considering legal action against the deepfake porn website that posted the images, amid calls from fans and even the White House for legislative action to combat the growing problem.

Lawmakers decided to intervene to combat the rise of non-consensual deepfakes with a new bill that would allow victims to take action against fake porn made in their likeness.

The DEFIANCE Act of 2024 was introduced by Senate Judiciary Committee Chairman Dick Durbin, D-Ill., Ranking Member Lindsey Graham, R-S.C., Sen. Josh Hawley, R-Mo., and Sen. Amy Klobuchar, D-Minn.

Taylor Swift’s legal team is considering legal action after sexually explicit fake images of the singer circulated online

The lewd images are themed around Swift’s fandom of the Kansas City Chiefs, which began after she started dating star player Travis Kelce

“Sexually explicit ‘deepfake’ content is often used to exploit and harass women, especially public figures, politicians and celebrities,” Durbin said in a statement introducing the DEFIANCE Act.

“While the images may be fake, the harm to victims from the spread of sexually explicit ‘deepfakes’ is very real.”

The bill builds on provisions of the Violence Against Women Reauthorization Act of 2022, which provides for action against fake explicit images.

It would allow victims of explicit fake images and videos generated through AI and other technology to take civil action against anyone who produces the images, possesses them with the intent to distribute them, or, in some cases, receives the deepfake images knowing that the victim did not consent.

It would apply to fake images and videos that depict the victim engaged in sexually explicit activities or nude.

Senate Judiciary Committee Member Lindsey Graham (left) and Chairman Dick Durbin (right) joined Senators Hawley and Klobuchar in introducing the DEFIANCE Act

The bill was introduced ahead of a hearing Wednesday on Big Tech’s failure to protect children from sexual exploitation online.

Celeb Jihad, the website believed to be the origin of the recent fake sexually explicit images of Swift, has been posting fake porn of celebrities for years.


This isn’t the first time Swift has considered legal action against the website. Her legal team previously alerted Celeb Jihad after it published a fake photo of her topless in 2011.

According to an analysis that independent researcher Genevieve Oh shared with The Associated Press in December, more than 143,000 new deepfake videos were posted online last year, more than every other year combined. A 2019 study found that 96% of deepfake videos were non-consensual porn.

The DEFIANCE Act is one of several responses from federal officials in the wake of Taylor Swift’s fake porn scandal.

Last week, Congressman Tom Kean Jr. called for safeguards to be put in place to combat an ‘alarming trend’ affecting not just celebrity victims, but young people across the country.

In response to a question last week, White House Press Secretary Karine Jean-Pierre said the administration was alarmed by the spread of explicit fake images, and that the Biden administration has called on Congress to take legislative action to address online harassment and abuse.

In May 2023, New York Congressman Joe Morelle introduced the Preventing Deepfakes of Intimate Images Act.

His legislation would criminalize the non-consensual production and sharing of AI-generated sexually explicit material.
