X suspends account that posted Taylor Swift AI porn – only for other accounts to share it – as the same graphic images now circulate on Facebook and Instagram
- The images are also circulating on Facebook and Instagram
- Reddit has taken action and banned a subreddit called ‘TaylorSwiftLewd’
- They show the singer posing provocatively in Kansas City Chiefs gear
X has suspended an account that posted AI porn of Taylor Swift, but several others have already surfaced with the same graphics.
The extremely graphic AI-generated images, which showed the singer posing provocatively in Kansas City Chiefs gear, sparked outrage among her fans on Thursday, with many demanding legal action be taken.
The backlash led to the suspension of one X account that shared the images, but not before they were shared by dozens of other accounts.
Moreover, the images are also circulating on Facebook and Instagram.
The new images show Swift in various sexualized poses and are reportedly from a website that posts AI-generated pornographic images of celebrities.
On Thursday morning, ‘Taylor Swift AI’ was trending on X, formerly known as Twitter.
Reddit now appears to have taken action against the fake images, removing posts containing them and banning a subreddit called ‘TaylorSwiftLewd’.
Reddit does not allow posting intimate or sexually explicit photos of someone without their consent, but DailyMail.com has discovered that the images are still circulating on the site and on 8chan.
DailyMail.com has reached out to Reddit, Meta and X for comment on this story.
The AI images do not appear to be circulating on TikTok, which does not allow any form of nudity.
X, by contrast, allows some nudity, which makes moderating this type of content more difficult.
DailyMail.com has seen the images in question but will not publish them.
They are the latest example of the dangerous rise of deepfake porn websites, where the likenesses of celebrities and others are featured in explicit videos and photos without their consent.
Meta has recently taken steps ostensibly intended to make its platforms safer for children, including banning teens under 18 from messaging strangers on Instagram and Facebook.
However, both Facebook and Instagram are currently flooded with fake Swift pornography.
Non-consensual deepfake pornography is illegal in Texas, Minnesota, New York, Virginia, Hawaii, and Georgia. In Illinois and California, victims can sue the creators of the pornography in court for defamation.
“I need the entire adult Swiftie community to log into Twitter, search the term ‘Taylor Swift AI’, click on the media tab, and report every AI-generated pornographic photo of Taylor they can see, because I’m fucking done with this BS. Get it together, Elon,” one irate Swift fan wrote.
“Man, this is so inappropriate,” another wrote, while a third said: ‘Whoever makes those Taylor Swift AI photos is going to hell.’
“Whoever is creating this trash should be arrested. What I saw is absolutely disgusting, and this kind of thing should be illegal… we MUST protect women from this kind of thing,” another person added.