Lewd Taylor Swift AI images likely originated in a Telegram chat group before being viewed on X 45 MILLION times in just 19 hours as lawmakers call for legislation

Sexually explicit images of Taylor Swift generated by artificial intelligence emerged from a Telegram group before being shared millions of times around the world, analysts believe.

Swift, 34, is said to be deeply disturbed by the images, and members of Congress have renewed their calls to criminalize the sharing of pornographic, non-consensual deepfakes.

The images were first noticed on Wednesday and spread quickly. They were viewed 45 million times and reposted 24,000 times on X before being deleted 19 hours later, The Verge reported.

On Thursday, the tech news site 404 Media discovered that the images came from a Telegram group dedicated to creating non-consensual AI-generated sexual images of women.

AI-generated explicit images of Taylor Swift were posted on the Celeb Jihad website, which was previously warned by the singer’s lawyers after it shared another doctored image in 2011. Pictured: Swift performing at the Eras Tour in Sao Paulo, Brazil, on November 24, 2023

The lewd images are themed around Swift’s fandom of the Kansas City Chiefs, which began after she started dating star player Travis Kelce

The images were generated by members of a Telegram chat group, who used Microsoft programs and shared workarounds to get past Microsoft’s rules

Members of the group were annoyed by the attention the Swift images drew to their work, 404 Media reported.

“I don’t know whether to feel flattered or angry that some of these photos stolen by Twitter users are of mine,” said one user in the Telegram group, according to the site.

Another complained: “Which one of you mfs here takes shit and throws it on Twitter?”

A third replied: “If there’s any way to shut down and rob this motherfucker, they’re such idiots.”

The images were not classic “deepfakes,” 404 Media reported, in which Swift’s face was superimposed over someone else’s body.

Instead, they were created entirely by AI, with members of the group recommending Microsoft’s AI image generator, Designer.

Microsoft does not allow users to generate an image of a person by entering a prompt containing “Taylor Swift.”

But users of the Telegram group shared workarounds, such as prompting Designer to create images of “Taylor ‘singer’ Swift.”

Swift pictured leaving Nobu restaurant after dining with Brittany Mahomes, wife of Kansas City Chiefs quarterback Patrick Mahomes, on Jan. 23

And instead of instructing the program to create sexual poses, users entered prompts describing objects, colors and compositions that produced the desired effect.

The Pennsylvania-born billionaire was one of the first people to fall victim to deepfake porn in 2017.

The news site reported that she was also one of the first targets of DeepNude, which generated nude images from a single photo: the app has since been removed.

Swift’s ordeal has renewed politicians’ push for stricter laws.

Joe Morelle, a Democratic member of the House of Representatives representing New York, introduced a bill in May 2023 that would criminalize non-consensual sexually explicit deepfakes at the federal level.

On Thursday, he wrote on X: “Another example of the destruction deepfakes cause.”

A website called ‘DeepNude’ claims that with the click of a button you can “see any girl without clothes (sic).” A ‘standard’ package on ‘DeepNude’ lets you generate 100 images per month for $29.95, while $99 gets you a ‘premium’ package of 420 images.

His fellow New Yorker, Rep. Yvette Clarke, also called for action.

“What happened to Taylor Swift is nothing new. Women have been targeted by deepfakes without their consent for years. And advances in AI have made creating deepfakes easier and cheaper,” she said.

“This is an issue on both sides of the aisle, and even Swifties should be able to come together to solve it.”

Swift has not commented on the incident, and neither has X owner Elon Musk.

As of April 2023, the platform’s manipulated media policy bans media “that depicts a real person [and] has been fabricated or simulated, especially through the use of artificial intelligence algorithms.”

However, footage has often slipped through the net, and the situation worsened when Musk took over in October 2022 and gutted the moderation team.
