
Six Years Later, and the Deepfake Apocalypse Hasn’t Happened

DeepMake · 4 min read

Ever since the term "deepfake" was coined in 2017 to describe AI that can create fake but convincing images, audio, and video, the media has speculated about how deepfakes could change the world. Fake videos raised the prospect of widespread ethical problems and misuse. If you believed everything you read six years ago, you would have thought we were headed for a deepfake apocalypse: a world where seeing is no longer believing and video footage can no longer be trusted.

Some of the more dystopian projections about how deepfakes could open a Pandora's box of problems include:

  • The potential for deepfakes to contribute to widespread misinformation.

  • The prospect of them becoming tools for corporate espionage or blackmail.

  • The ability to use them for nefarious acts like identity theft or fraud.

And deepfakes have indeed been used for deception in some high-profile instances. Here are a few recent examples: 

  • In March 2022, a deepfake video of Ukrainian President Volodymyr Zelenskiy appeared to show him telling soldiers to lay down their weapons.

  • In April 2023, the Republican National Committee (RNC) released an attack ad featuring AI-generated images of President Joe Biden.

  • In June 2023, the Ron DeSantis campaign released a video containing deepfake images of former President Donald Trump hugging Dr. Anthony Fauci, the former White House chief medical advisor.

These videos, however, are isolated incidents, not daily occurrences. High-profile and widely circulated as they are, they don't point to a larger pattern or an escalating problem of deepfakes being used to spread misinformation about politicians.

After all, it's a political tradition to make opponents look bad. Creative editing, doctored photos, and taking comments out of context are already part of the playbook and have been for decades.

The political deepfake videos themselves have had little influence, except to reinforce that deepfakes exist --- bringing needed awareness to the fact that not everything you see is real. With the ad from the RNC, the big news wasn't the ad itself, but that it was a fake. More people saw news stories about the AI-generated images in the ad than saw the ad run.

The videos above were relatively low quality and fairly easy to identify as deepfakes. As videos get more realistic and harder to spot, there's still a concern that deepfakes will fool more people. That concern seems plausible, since studies show that people already overestimate their ability to spot a fake.

Yet over the past six years, even with seismic events like elections and pandemic-era misinformation spikes, fears of an approaching deepfake apocalypse haven't come true as predicted. We believe there are a few reasons for this.

  1. Deepfakes are primarily used within certain niche areas like film production and celebrity impersonations. Most of the people who make deepfakes simply aren't interested in bringing about the fall of society. 

  2. There's a societal awareness of the potential of doctored images and deepfake videos. Viewers as a whole are aware that there can be fakes out there, and their ability to spot them is increasing.

  3. There are enough technologically savvy people around (like those of us at DeepMake) who know what to look for and who point out the fakes when we see them.

While there may still be valid concerns about the misuse of AI to generate deceptive imagery, we haven't seen much beyond the political examples above and some instances of celebrities depicted in pornography. Our news feeds aren't crammed full of malicious AI manipulations every day, unless you count those annoying Snapchat filters, which I'd argue are the most flagrant (and pointless) misuse of artificial imaging technology. Beyond that, reality has turned out far less dramatic than initially projected. The fabled "deepfake apocalypse" remains a figment of some overactive imaginations.

Deepfake technology has exciting possibilities, and DeepMake is committed to creating intuitive tools for AI-powered video editing. Stay tuned for more information about our latest plugin, which makes deepfakes and VFX easier to create.