Information warfare abounds, and everyone online has been drafted whether they know it or not.
Disinformation is deliberately generated misleading content disseminated for selfish or malicious purposes. Unlike misinformation, which may be shared unwittingly or with good intentions, disinformation aims to foment distrust, destabilise institutions, discredit good intentions, defame opponents and delegitimise sources of knowledge such as science and journalism.
Many governments engage in disinformation campaigns. For instance, the Russian government has used images of celebrities to attract attention to anti-Ukraine propaganda. Meta, parent company of Facebook and Instagram, warned on November 30 that China has stepped up its disinformation operations.
Disinformation is nothing new, and information warfare has been practiced by many countries, including the US. But the internet gives disinformation campaigns unprecedented reach. Foreign governments, internet trolls, domestic and international extremists, opportunistic profiteers and even paid disinformation agencies exploit the internet to spread questionable content. Periods of civil unrest, natural disasters, health crises and wars trigger anxiety and the hunt for information, which disinformation agents take advantage of.
It’s certainly worth watching for the warning signs of misinformation and dangerous speech, but disinformation agents employ additional tactics.
It’s just a joke
Hahaganda is a tactic in which disinformation agents use memes, political comedy from state-run outlets, or speeches to make light of serious matters, attack others, minimise violence, dehumanise their targets and deflect blame.
This approach provides an easy defense: if challenged, disinformation agents can say, “Can’t you take a joke?”, often followed by an accusation that the critic is being too politically correct.
Shh...tell everyone
Rumor-milling is a tactic in which disinformation agents claim to have exclusive access to secrets they allege are being purposefully concealed. They indicate that you will “only hear this here” and imply that others are unwilling to share the alleged truth – for example, “The media won’t report this”, “The government doesn’t want you to know” or “I shouldn’t be telling you this … .”
But they do not insist that the information be kept secret, and will instead include encouragement to share it – for example, “Make this go viral” or “Most people won’t have the courage to share this.” It’s important to question how an author or speaker could have come by such “secret” information and what their motive is to prompt you to share it.
People are saying
Often disinformation has no real evidence, so instead disinformation agents will find or make up people to support their assertions. This impersonation can take multiple forms. Disinformation agents will use anecdotes as evidence, especially sympathetic stories from vulnerable groups such as women or children.
Similarly, they may disseminate “concerned citizens’” perspectives. These layperson experts present their social identity as giving them the authority to speak on a matter: “As a mother …,” “As a veteran …,” “As a police officer ….” Convert communicators, or people who allegedly change from the “wrong” position to the “right” one, can be especially persuasive, such as the woman who got an abortion but regretted it. These people often don’t actually exist, or may be coerced or paid.
If ordinary people don’t suffice, fake experts may be used. Some are fabricated, and you can watch out for “inauthentic user” behavior, for example, by checking X – formerly Twitter – accounts with Botometer. But fake experts come in several varieties:
- A faux expert is someone cited for their title who doesn’t have actual relevant expertise.
- A pseudoexpert is someone who claims relevant expertise but has no actual training.
- A junk expert is a sellout. They may have had expertise once but now say whatever is profitable. You can often find these people have supported other dubious claims – for example, that smoking doesn’t cause cancer – or work for institutes that regularly produce questionable “scholarship.”
- An echo expert is created when disinformation sources cite one another to lend credence to their claims. China and Russia routinely cite one another’s newspapers.
- A stolen expert is a real person who was never actually contacted and whose research is misinterpreted. Likewise, disinformation agents also steal credibility from known news sources, such as by typosquatting, the practice of setting up a domain name that closely resembles a legitimate organisation’s.
You can check whether accounts, anecdotal or scientific, have been verified by other reliable sources. Google the name. Check expertise status, source validity and interpretation of research. Remember, one story or interpretation is not necessarily representative.
It’s all a conspiracy
Conspiratorial narratives involve some malevolent force – for example, “the deep state” – engaged in covert actions aimed at harming society. The fact that certain conspiracies, such as MK-Ultra and Watergate, have been confirmed is often offered as evidence for the validity of new, unfounded ones.
Nonetheless, disinformation agents find that constructing a conspiracy is an effective means of reminding people of past reasons to distrust governments, scientists or other trustworthy sources.
But extraordinary claims require extraordinary evidence. Remember, the conspiracies that were ultimately unveiled had evidence – often from sources like investigative journalists, scientists and government investigations. Be particularly wary of conspiracies that try to delegitimise knowledge-producing institutions like universities, research labs, government agencies and news outlets by claiming that they are in on a cover-up.
Good vs evil
Disinformation often serves the dual purpose of making the originator look good and their opponents look bad. Disinformation agents take this further by painting issues as a battle between good and evil, using accusations of evil to legitimise violence. Russia is particularly fond of accusing others of being secret Nazis, pedophiles or Satanists, while depicting its own soldiers as helping children and the elderly.
Be especially wary of accusations of atrocities like genocide, particularly under an attention-grabbing “breaking news” headline. Accusations abound. Verify the facts and how the information was obtained.
Are you with us or against us?
A false dichotomy narrative sets up the reader to believe that they have one of two mutually exclusive options: a good or a bad one, a right or a wrong one, a red pill or a blue pill. You can accept their version of reality or be an idiot or “sheeple.”
There are always more options than those being presented, and issues are rarely so black and white. This is just one of the tactics in brigading, where disinformation agents seek to silence dissenting viewpoints by casting them as the wrong choice.
Turning the tables
Whataboutism is a classic Russian disinformation technique used to deflect attention from one’s own wrongdoings by alleging the wrongdoings of others. These allegations about the actions of others may be true or false but are nonetheless irrelevant to the matter at hand. The potential past wrongs of one group do not mean you should ignore the current wrongs of another.
Disinformation agents also often cast their group as the wronged party. They only engage in disinformation because their “enemy” engages in disinformation against them; they only attack to defend; and their reaction was appropriate, while that of others was an overreaction. This type of competitive victimhood is particularly pervasive when groups have been embedded in a long-lasting conflict.
In all of these cases, the disinformation agent is aware that they are deflecting, misleading, trolling or outright fabricating. If you don’t believe them, they at least want to make you question what, if anything, you can believe.
Before you hand over your money, you probably look into the things you buy rather than taking the advertising at face value. The same should go for the information you buy into.
H Colleen Sinclair is Associate Research Professor of Social Psychology, Louisiana State University.
This article was first published on The Conversation.