Fake Videos Are a Reality, Not Just a Threat

The threat of fake or doctored videos is officially no longer just a threat, as House Speaker Nancy Pelosi can attest. The doctored videos of her that surfaced last week weren’t the first time detractors manipulated video content to embarrass her.

While the doctored videos of Pelosi were spotted and outed quickly, it is fair to say that the technical ability to create deepfake videos is far ahead of the practical ability to spot them. Experts say virtually anyone with a laptop could have doctored the Pelosi videos.

Even when fake or doctored videos are outed, they still can circulate widely on social media, in some cases with a push from influencers – or a President of the United States. The fake video of Pelosi has been viewed millions of times on Facebook.

As we noted in an August 2017 Managing Issues blog, desktop technology exists to edit video and audio to make anyone say almost anything. In the Pelosi video, her natural speech pattern was distorted so she sounded drunk.

If fake videos were just the innocent stuff of parties or a good-natured roast, we could just sit back and laugh. Unfortunately, they aren’t just for fun. They are weapons to destroy a reputation or cut down a political opponent. In the partisan silos of today’s news media, fake videos can quickly become “fact.”

Circulation of political fake videos is calculated. Trump likes them because they share well with his aging political base. They also are red meat opportunities for Fox News personalities such as Sean Hannity, who frequently airs them. Some fake video creators defend their handiwork as “entertainment” that engages people who otherwise would shy away from politics.

High-profile individuals, corporations or politicians can’t ignore the need for 24/7 media monitoring. If there ever was a doubt, the specter of fake videos should squelch any hesitation. The task of media monitoring is no longer as simple as having someone read newspapers and clip relevant articles. Media monitoring now spans online news, social media, blogs, message boards, video channels, broadcast TV, radio and print – not just in the United States, but also internationally. There are ample commercial services that can provide some or all of this monitoring.
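To give a feel for what even the most rudimentary automated monitoring looks like, here is a minimal sketch in Python. It assumes the third-party feedparser library, and the feed URLs and keywords are placeholders; it only skims RSS headlines and is no substitute for the broadcast, social, video and international coverage a commercial service provides.

```python
# Minimal illustration of keyword monitoring across news feeds.
# Feed URLs and keywords are placeholders for illustration only.
import feedparser

# Hypothetical watch list: names and phrases you want flagged.
KEYWORDS = ["acme corp", "jane doe"]

# Hypothetical feed list: RSS/Atom feeds for outlets you track.
FEEDS = [
    "https://example.com/news/rss",
    "https://example.org/politics/feed",
]

def scan_feeds(feeds, keywords):
    """Return entries whose title or summary mentions a watched keyword."""
    hits = []
    for url in feeds:
        parsed = feedparser.parse(url)
        for entry in parsed.entries:
            text = (entry.get("title", "") + " " + entry.get("summary", "")).lower()
            if any(kw in text for kw in keywords):
                hits.append((entry.get("title", ""), entry.get("link", "")))
    return hits

if __name__ == "__main__":
    for title, link in scan_feeds(FEEDS, KEYWORDS):
        print(f"Possible mention: {title} -> {link}")
```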

Forensic tools exist to spot doctored photographs and videos. The Global Investigative Journalism Network has posted a tutorial on techniques and tools to ferret out fake visuals, manipulated data, twisted facts and out-of-context information.

Being aware of coverage that affects you isn’t enough when it comes to video content. You, or someone acting on your behalf, needs to view it forensically to ensure the video is authentic and that any editing is contextually accurate and fair. This can be a complicated task that goes far beyond detecting a jump cut in a TV interview. In anticipation of an altered-video scenario, you should add a new section to your crisis plan that identifies media monitoring options, go-to resources and potential responses.
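As a small illustration of what a first-pass forensic look can involve, the sketch below pulls container metadata from a video file using the widely available ffprobe tool; an unexpected encoder tag, duration or frame rate can hint that a clip was re-encoded or slowed down, as the Pelosi video was. This is only a surface check and assumes ffprobe is installed; "clip.mp4" is a placeholder, and deeper analysis belongs with the forensic tools and specialists referenced above.

```python
# First-pass check of a video file's container metadata using ffprobe.
# Assumes ffmpeg/ffprobe is installed and on the PATH; "clip.mp4" is a
# placeholder for the video under review.
import json
import subprocess

def probe_video(path):
    """Return ffprobe's format and stream metadata for a video file."""
    result = subprocess.run(
        [
            "ffprobe",
            "-v", "quiet",
            "-print_format", "json",
            "-show_format",
            "-show_streams",
            path,
        ],
        capture_output=True,
        text=True,
        check=True,
    )
    return json.loads(result.stdout)

if __name__ == "__main__":
    info = probe_video("clip.mp4")
    fmt = info.get("format", {})
    print("Container:", fmt.get("format_name"))
    print("Duration (s):", fmt.get("duration"))
    print("Encoder tag:", fmt.get("tags", {}).get("encoder"))
    for stream in info.get("streams", []):
        if stream.get("codec_type") == "video":
            # A frame rate that differs from the outlet's original footage
            # can flag re-encoding or speed changes.
            print("Codec:", stream.get("codec_name"),
                  "| Frame rate:", stream.get("r_frame_rate"))
```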

Upon detecting a fake or doctored video, you need a capability to address it and its fallout. Unfortunately, you can’t simply raise your hand and call foul. Depending on the seriousness of the fake video content, you may need to mount an aggressive response.

An aggressive response should include:

  • Third-party verification that a video is fake or doctored.

  • The original source video that was altered, for comparison.

  • Identification of the responsible party who doctored the video, if known.

  • Calling out websites or channels that are promoting the fake video.

Political figures have little protection from defamation, but they can ask surrogates and supporters to out the fakery and its malign motivation. Their communications staff can ask traditional and mainstream media outlets to write editorials or accept op-eds that condemn such political tactics. In Pelosi’s case, Facebook refused to remove the fake video of her. Twitter continued to allow it to be shared. YouTube said the fake video violated its standards and removed it. A spliced video of Joe Biden’s apology about inappropriate touching of women took just 19 hours to go from its originator’s keyboard to the Trump Twitter account.

Individuals and business leaders enjoy somewhat more legal protection from defamation and can pursue legal remedies to have the fake video content taken down at its source and to secure a public statement admitting it was doctored. An apology would be nice, too.

Be aware that political or business figures willing to commission and post visual forgeries like to play fast and loose with the rules of fair play, including shifting blame for who is responsible. Responding in kind is a fool’s errand. But exposing such dirty tricks and affixing blame is perfectly fair – and smart, if you have the facts down cold.

Pelosi chose to shrug off the video and Trump’s reference to it. This was a strategic, not casual, decision by Pelosi’s camp. She has accused Trump of self-impeachment and a coverup, and suggested he needs a staff intervention. Pelosi’s needling led Trump to call her “Crazy Nancy” based on “slurred words” in the fake video. Pelosi scored points with her political base and her fractious House Democratic caucus on both counts.

Whether a fake video response is frontal or subtle, a clear-eyed decision is required on how and when to respond. Doing nothing isn’t an option. It’s just like a trademark – you have to monitor and defend it against infringement or see it devalued.