For US adversaries, Election Day won’t mean an end to efforts to influence Americans

WASHINGTON — Soon the votes will be cast, the polls will close and a campaign marked by assassination attempts, hostility and fear will come to an end. But for America’s adversaries, the work to interfere with American democracy may be entering its most critical phase.

Despite all the attention paid to attempts to spread disinformation in the months before the Nov. 5 election, the hours and days immediately after voting ends could offer foreign adversaries such as Russia, Iran and China, or domestic extremist groups, their best chance to meddle with America's decision.

That’s when Americans go online to check the latest results or share their opinions as the votes are tabulated. And that’s when a blurry photo or AI-generated video of alleged vote manipulation could do the most damage, potentially translating online outrage into real-world action before authorities have time to investigate the facts.

It is a threat taken seriously by intelligence analysts, elected officials and technology executives, who say that while there has already been a steady build-up of disinformation and influence operations, the worst may be yet to come.

“It’s not like at the end of election night, especially considering how close this election will be, that this will be over,” said Sen. Mark Warner, a Democrat from Virginia who chairs the Senate Intelligence Committee. “One of my biggest concerns is the level of misinformation, disinformation that may come from our adversaries after the polls close. It could be as significant as anything that happens before the polls close.”

Analysts are more blunt, warning that a particularly effective piece of disinformation could be devastating to public confidence in the election if it spreads in the hours after the polls close, and if the group behind the campaign manages to target a particularly important swing state or voting bloc.

Possible scenarios include out-of-context images of election workers recirculated as supposed evidence of fraud, a deepfake video of a presidential candidate admitting to cheating, or a robocall aimed at non-English speakers warning them not to vote.

When a false or misleading claim circulates weeks before the election, there is time for local election officials, law enforcement or news organizations to gather the facts, correct any untruths and publicize them. But if someone distributes a misleading video or photo the day after the election that is intended to make a large portion of the electorate distrust the results, it can be difficult or even impossible to determine the truth.

It happened four years ago, when lies about the 2020 results led to the Jan. 6, 2021, attack on the U.S. Capitol. Many of those arrested on charges of trying to disrupt the transfer of power have cited debunked election fraud stories that circulated shortly after Election Day.

A particularly close election in a handful of swing states would only heighten that risk, making it more likely that rumors, such as claims of suitcases stuffed with illegal ballots in Georgia, to cite an example from 2020, could have an outsized impact on perceptions.

President Joe Biden’s victory over Donald Trump in 2020 wasn’t particularly close, and no irregularities large enough to affect the outcome were found. Yet false claims of election fraud were still widely accepted by many supporters of the Republican, who is running for president again.

The relatively long run-up to Inauguration Day on Jan. 20 gives those who want to cast doubt on the results ample time to do so, whether they are propaganda agencies in Moscow or extremist groups in the U.S. such as the Proud Boys.

Ryan LaSalle, CEO of the cybersecurity firm Nisos, said he won’t feel relief until a new president is sworn in without serious problems.

“The time to remain most focused is now through the peaceful transition of power,” LaSalle said. “That’s when real activity can happen, and that’s when they have the greatest opportunity to have an impact on that peaceful transfer.”

Another risk, according to officials and technology companies, is that Russia or another adversary would try to hack into a local or state election system – not necessarily to change votes, but as a way to get voters to question the security of the system.

“The most dangerous moment, I think, will come 48 hours after the election,” Microsoft President Brad Smith told lawmakers on the Senate Intelligence Committee last month. The hearing focused on U.S. technology companies’ efforts to protect the election from foreign disinformation and cyberattacks.

Election disinformation first emerged as a potent threat in 2016, when Russia hacked into Democrat Hillary Clinton’s campaign and created networks of fake social media accounts to pump out disinformation.

The threat has only increased as social media has become an important source of information and news for many voters. Content designed to divide Americans and make them distrust their own institutions is no longer tied only to election seasons. Intelligence officials say Russia, China and other countries will only expand their use of online disinformation and propaganda in the future, a long-term strategy that looks beyond any election or candidate.

Despite the challenges, election security officials are quick to reassure Americans that the nation's election system is secure against any attack that could change the outcome of the election. While influence operations may aim to sow distrust about the results, improvements to the system make it stronger than ever against efforts to change votes.

“Malicious actors, even if they tried, could not have such a large-scale impact that there would be a material effect on the outcome of the election,” Jen Easterly, director of the U.S. Cybersecurity and Infrastructure Security Agency, told The Associated Press.