Residents are left baffled after Kari Lake films video to promote new Arizona politics watchdog media site…but not all is as it seems

Arizona residents are baffled after Kari Lake appeared to film videos promoting a new Arizona watchdog news site and praising its journalists.

Lake, who is running for U.S. Senate as a Republican and is a former gubernatorial candidate, is a staunch supporter of former President Donald Trump. She has denied losing the 2022 gubernatorial race and has long vilified the media.

But last week she appeared in three videos on the new site, the Arizona Agenda, under the banner “Kari Lake does us a solid.”

Lake promotes the site and the work of its journalists. The Arizona Agenda was founded in 2021 by former New York Times journalist Hank Stephenson.

However, the videos are not what they seem as they are ultra-realistic deepfakes. The site created them not only to promote itself, but also to highlight the danger that AI-generated videos pose to the November elections.

See if you can tell the difference between the AI Lake and the real candidate:

The top video is the AI-generated Kari Lake, while the bottom is a real video of Lake criticizing Arizona officials after Democrat Katie Hobbs defeated her in the 2022 race to become the next governor.

In the real video, Lake’s camera settings make her appear airbrushed and her surroundings are blurred, which is mimicked in the fake videos.

Stephenson initially introduces the deepfake as a real video, claiming that “much to our surprise,” despite being “a frequent subject of our ridicule,” Lake “offered to film a testimonial about how much she loves the Arizona Agenda.”

The article quickly shows that not everything is as it seems.

“Did you realize this video is fake?” asks the AI Lake. “Over the next six months this technology will get a lot better. By the time the November elections roll around, you will barely be able to tell the difference between reality and artificial intelligence.”

Lake’s slightly unsynchronized lip movements are the telltale sign, but in a post-pandemic era where Zoom interviews often create this effect anyway, it’s not an obvious red flag.

Kari Lake, who has long ripped the media, is now appearing in new videos supporting an Arizona news site. But it’s not what it seems

The videos were a deepfake AI creation from the Arizona Agenda, a new political watchdog site, to show the dangers of AI ahead of the November elections. In the photo: the real Kari Lake on stage during various events

Lake’s slightly unsynchronized lip movements are the telltale sign of the AI fake, but the site used a real interview to create its ultra-realistic model

One of the videos even acknowledges that the mouth movement is not quite right and is slightly out of sync.

“Around the fine edges of my face you can almost see the glitches in the matrix,” says the AI Lake.

The article notes that the audio is quite close to her voice and it is difficult to tell the difference. The video of her face, however, is not quite on the mark, making the AI easy to spot with a keen eye.

‘What did you think?’ Stephenson’s article asks readers. ‘At what point did you catch on? Did you know before you even clicked, because the setup was so unlikely?

‘Did you at least notice it before she told you? Or, like most people we’ve shown this to, did it take a second for your brain to catch up, even after our deepfake Kari Lake told you she was a fake?’

The site also notes that it made the videos with “zero dollars,” asking a software engineer to spend just a few hours creating them.

Deepfakes are AI-generated media that mimic human voices, images and videos and can be mistaken for the real thing.

Stephenson further warns that as technology is developing rapidly, “this is just the beginning.”

“The 2024 election will be the first in history in which any idiot with a computer can create convincing videos depicting fake events of global importance and publish them to the world in minutes,” the article said.

This fake AI-generated image was spread on social media claiming that former President Donald Trump stopped his motorcade to take a photo with this group of men. The image is not real

The creator behind this fake image claimed he is not a ‘photojournalist’, but a ‘storyteller’

Just two weeks ago, former President Donald Trump accused Democrats in Congress of using AI in a video collection of gaffes and verbal mistakes

Stephenson told the Washington Post that the article serves as a warning about the “scary” potential of increasingly realistic fake videos, which leave voters baffled as to what to believe.

“When we started this I thought it would turn out so bad that no one would be fooled by it, but I was blown away,” Stephenson told the newspaper.

‘And we are unsophisticated. If we can do this, anyone with a real budget can do the job so well that it will fool you, it will fool me, and that’s scary.”

Stephenson said the Arizona Agenda is “here to help” cut through the convincing fakes that “keep election officials, cybersecurity experts and national security officials up at night.”

AI has often been used to muddy the political waters. Just two weeks ago, former President Donald Trump accused congressional Democrats of using AI in a video compilation of his blunders and verbal mistakes.

The video, played during a House of Representatives hearing featuring special counsel Robert Hur, showed clips in which Trump mixed up the names of the leaders of Hungary and Turkey, slurred his words and confused Nancy Pelosi with Nikki Haley.

There is no evidence to support Trump’s claim that Democratic staff used artificial intelligence technology or that the White House was in any way involved in the video.

Meanwhile, MAGA supporters have actually used AI to create images of Trump being embraced by black people, a demographic Republicans continue to struggle to court.

A shocking report from BBC’s Panorama revealed that at least one prominent Trump supporter, Florida-based radio host Mark Kaye, admitted to creating the fake image.

An attack ad released by Florida Governor Ron DeSantis’ since-abandoned presidential campaign also used AI-generated footage of former President Donald Trump hugging Dr. Anthony Fauci.

The images at the top left, bottom center and bottom right of a Ron DeSantis ad appear to be AI-generated deep fakes

Taylor Swift was the target of sexually explicit deepfake images that went viral on X last month

The fake footage showed Trump hugging and kissing Fauci, the director of the National Institute of Allergy and Infectious Diseases, who became synonymous with the US response to the COVID-19 pandemic.

More than 400 AI experts, celebrities, politicians and activists have sounded the alarm over deepfake technology in an open letter to lawmakers.

They argued that the growing number of AI-generated videos poses a threat to society through their use in sexual imagery, child pornography, fraud and political disinformation.

The letter states that deepfake technology deceives the public and makes it harder to discern what is real on the internet, which makes it more important than ever to implement formal laws “to protect our ability to recognize real people.”

The calls for stricter regulations come after sexually explicit deepfake images of Taylor Swift went viral on social media last month.
