Disinformation is not the same as misinformation, which can be a slip of the tongue or an inadvertent inaccuracy. Misinformation can be acknowledged and corrected. Disinformation involves intentional lying in service of a malign agenda. There is no off-the-shelf remedy for disinformation, which means its toxicity can spread unimpeded, like pollution in a rushing river.
Misinformation and disinformation can both be amplified through social media. However, watchful eyes can and do spot misinformation and challenge it with factual information. Disinformation is more elusive to counter because it consists of manufactured facts and benefits from creating a stunning first impression that makes it ripe for viral social media sharing.
False claims can be, and frequently are, intended to feed polarization, break down trust and deflect attention away from a proposed change. Disinformation forces advocates to address the falsehoods and their aftermath while simultaneously making the case for the change, which can be a tall order. A common reason for pushing aside a proposed change is that it is "too complex" or "too controversial", often the residue of an effective disinformation campaign.
Advocates can't fight back against disinformation unless they know it is circulating. That argues for robust monitoring to pick up signs of disinformation as early as possible. Disinformation may originate in the shadows to preserve the anonymity of its source, so monitoring must go beyond news clippings to include social media, alternative news outlets, books and documentaries. A number of monitoring tools are on the market for both advocates and information consumers.
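Monitoring of this kind can start simple, well before a commercial tool is in place. As a minimal sketch (the watchlist phrases, the `flag_posts` helper and the sample posts are all illustrative assumptions, not any particular product's API), a keyword scan over incoming posts can surface early signals worth a human analyst's attention:

```python
# Illustrative sketch of early-warning monitoring for a change campaign.
# The watchlist, function name and sample posts are hypothetical.
from collections import Counter

# Phrases associated with false narratives already seen circulating.
WATCHLIST = {"rigged", "cover-up", "secret funding"}

def flag_posts(posts, watchlist=WATCHLIST):
    """Return posts that mention any watchlist phrase, plus a tally per phrase."""
    hits = []
    tally = Counter()
    for post in posts:
        text = post.lower()
        matched = [phrase for phrase in watchlist if phrase in text]
        if matched:
            hits.append(post)       # route to a human for review
            tally.update(matched)   # track which narrative is spreading
    return hits, tally

sample_posts = [
    "City council reviews the transit plan tonight",
    "Insiders say the transit vote is RIGGED",
    "Secret funding behind the transit plan? Sounds like a cover-up",
]
hits, tally = flag_posts(sample_posts)
```

A real deployment would swap the hardcoded list for feeds from the platforms and outlets named above, but the design point stands: the goal of monitoring is not to adjudicate truth automatically, only to detect that a suspect narrative is circulating early enough to respond.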
Fighting dirty carries a lot of baggage as a tactic, but it may be warranted when unmasking those responsible for disinformation. The best comeback to disinformation is to expose those responsible for brewing up and spreading the false information.
Your grandfather's disinformation isn't the same as contemporary disinformation. Deepfakes, Photoshopped images, manipulated videos and deceptive ads are striking modern innovations that seek to create the appearance of reality. Cyber-hacking can be employed to spy and to plant disinformation in convenient view of intended audiences. Bots can invade and conquer your social media campaign.
These are flagrant fouls in the public arena, but someone has to blow the whistle or the foul goes undetected and unpunished. Battling disinformation saps the energy and resources of change advocates, which can undermine, delay or even doom their efforts, achieving the goal of the authors of the disinformation. Social media platforms such as Twitter and Facebook are taking steps to limit or block distribution of false claims, but they don't do the grunt work of exposing the authors and beneficiaries of those claims.
Principled advocates subscribe to codes of ethics, but those codes in many cases don’t contemplate the dirty tricks that technology has made possible. Even if these codes are updated, the designers of disinformation usually operate outside the rules and ethics of principled advocacy.
It is a sad state of affairs that advocates for major change must develop strategic plans that anticipate shadowy opposition in the form of disinformation. It has been standard practice for advocates to know the facts of the opposing case and be able to articulate it as well as or better than their opponents. Disinformation upsets that practice because there are no genuine facts to dispute or defeat. Advocacy strategic plans must contain contingency plans to respond to disinformation campaigns. This will require creativity and a bit of daring because of the ill-structured nature of disinformation.
A crucial element of gaming out a response to disinformation is identifying the most likely lanes it will travel. This is similar to conducting an issue audit in preparation for a crisis plan. You explore all the potential vulnerabilities of your issue and its sponsors, which is exactly what disinformation agents will do. Then you analyze how each vulnerability could be exploited. These potential scenarios merit attention, either to minimize the vulnerability or to set up sensors that detect the spread of disinformation.
Dismissing the potency of disinformation is foolhardy. The sudden rise of QAnon, a movement animated by a web of conspiracy theories, is proof that disinformation can be inexplicably infectious. Discounting the possibility that a disinformation campaign could derail your proposed change is equally foolhardy. The tools of the disinformation trade are becoming more accessible and, as a result, more tempting.