Microsoft science chief warns more deepfake threats could be coming soon


When it comes to deepfakes, what we’ve seen so far is just the tip of the iceberg. For the foreseeable future, we won’t know for sure whether the person we’re talking to on a video call is real or an impostor, and scammers won’t have a hard time creating a full chronology of fake videos to back up their claims, whether the goal is to make people believe in the legitimacy of an offer or a campaign.

These harrowing predictions come from Eric Horvitz, Microsoft’s chief science officer, in a new research paper entitled “On the Horizon: Interactive and Compositional Deepfakes.”

Deepfakes are essentially “photoshopped” videos. Using artificial intelligence (AI) and machine learning (ML), a threat actor can create a video of a person saying things they never said. Now, according to Horvitz, crooks are ready to take it to the next level. Interactive deepfakes are exactly what you’d expect – real-time videos that users can interact with, which in reality are completely fake.

Synthetic history

In contrast, compositional deepfakes are described as “sets of deepfakes” designed to integrate over time with “observed, anticipated, and manipulated world events to create compelling synthetic histories.”

“Synthetic histories can be constructed manually, but may one day be guided by adversarial generative explanation (AGE) techniques,” Horvitz added.

He also says that in the near future it will be almost impossible to distinguish fake videos and content from authentic ones: “In the absence of mitigations, interactive and compositional deepfakes threaten to bring us closer to a post-epistemic world, where fact cannot be distinguished from fiction.”

This absence of mitigations stems from the fact that threat actors can pit generative AI directly against detection tools, refining deepfake content until it fools even the most sophisticated detection systems.

“With this process at the root of deepfakes, neither pattern recognition techniques nor humans will be able to reliably recognize deepfakes,” Horvitz noted.
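The process Horvitz is referring to is the adversarial training behind generative adversarial networks (GANs). Below is a minimal sketch, assuming PyTorch and a toy one-dimensional distribution standing in for real media; the model shapes and data here are hypothetical illustrations, not anything from the paper. The point it demonstrates: the generator’s training objective is literally “fool the detector,” so whatever signal the detector learns to spot is exactly what the generator learns to remove.

```python
# Illustrative GAN-style sketch (toy data, not from Horvitz's paper):
# a generator and a detector trained against each other.
import torch
import torch.nn as nn

torch.manual_seed(0)

# "Real" samples: a toy 1-D distribution standing in for real media.
def real_batch(n=64):
    return torch.randn(n, 1) * 0.5 + 2.0

generator = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 1))
detector = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1))

g_opt = torch.optim.Adam(generator.parameters(), lr=1e-3)
d_opt = torch.optim.Adam(detector.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()

for step in range(2000):
    # 1) Train the detector to label real samples 1 and fakes 0.
    noise = torch.randn(64, 8)
    fake = generator(noise).detach()
    real = real_batch()
    d_loss = (loss_fn(detector(real), torch.ones(64, 1))
              + loss_fn(detector(fake), torch.zeros(64, 1)))
    d_opt.zero_grad(); d_loss.backward(); d_opt.step()

    # 2) Train the generator to make that same detector output "real".
    noise = torch.randn(64, 8)
    g_loss = loss_fn(detector(generator(noise)), torch.ones(64, 1))
    g_opt.zero_grad(); g_loss.backward(); g_opt.step()

# Near equilibrium the detector's score on fakes drifts toward 0.5,
# i.e. coin-flip guessing: there is no residual pattern left to find.
print(torch.sigmoid(detector(generator(torch.randn(64, 8)))).mean())
```

In practice this arms race runs at the scale of video generation models rather than toy data, which is the basis for Horvitz’s claim that pattern-recognition-based detection alone cannot keep up.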

So the next time a family member calls from abroad asking for money to pay the rent, make sure it’s not a fraudster posing as your loved one.

Via: VentureBeat
