Did the United States fake the moon landings? Did UFOs land in Rendlesham Forest? Were the 9/11 attacks a CIA plot? You doubtless have a simple response to these questions. One that’s just three letters – or, more likely, two letters – long.
But how about this assertion: the world is ruled by a powerful and secretive cabal? Suddenly, those one-word responses don’t fit quite so easily. The world may not be run by a cabal of Illuminati. Or alien lizards. But there is an elite – the world’s top politicians and most powerful corporations – whose decisions affect the lives of all of us.
And yes, even in this age of transparency, many of their meetings and decisions are conducted behind closed doors.
Rational observation versus conspiracy theories
So where does rational observation end and conspiracy theorising begin? If only it were that simple. “We think there’s ‘conspiracy theory’ and then there’s acceptable political discourse,” said political scientist Professor David Runciman. “But the difficulty is that there’s no clear dividing line where you can say, ‘We question it to this point, and beyond that questions are crazy’.”
Runciman is a director of Conspiracy and Democracy, a five-year, Leverhulme-funded project based at the Centre for Research in the Arts, Social Sciences and Humanities that is drawing together researchers across disciplines to assemble a natural history of conspiracy theories. He offers a succinct definition. “In the pejorative sense, a conspiracy theory is a view about the world in which the surface story is never the real story,” he said. “It’s a view without any limits to its scepticism or doubt. It’s a mindset in which nothing that contradicts the theory is taken as anything other than evidence that the theory is true.”
Such sceptical mindsets are currently grabbing headlines, with narratives around “fake news” and “alternative facts”. “There’s a conspiracy-theory scare right now,” said Runciman. “But you can’t blame digital technology for this. For people who want to bypass conventional sources of information, it’s easier to do it. But the propensity of people to believe these things isn’t a modern phenomenon.”
Emergence of conspiracy theories
Historian Dr Andrew McKenzie-McHarg, one of the Conspiracy and Democracy project’s researchers, is examining when we first started to notice conspiracy theories. When these theories emerged is a contentious question: “Some say antiquity, others the Renaissance. Others say the French Revolution, when we see the first modern conspiracy theories, around the Illuminati.”
Whatever the answer might be, it was only in relatively modern times that conditions arose that made possible the observation of the phenomenon. As McKenzie-McHarg points out, “The concept ‘conspiracy theory’ itself is from the 19th century, when it’s used in a forensic context at a time of increasingly scientific approaches to crime.”
In the 20th century the concept itself, fittingly enough, became the subject of a conspiracy theory. “There are those who claim that the words ‘conspiracy theory’ were introduced by the CIA to discredit people who were promoting these sorts of ideas,” said McKenzie-McHarg. “All my work is about demonstrating that the conspiracy theory about the concept ‘conspiracy theory’… is not true.”
In fact, as he points out, the origins of the phenomenon are deep-rooted. Ironically, it seems that this behaviour, which today we associate with those who feel impotent in the face of forces they can’t control, may bestow a sense of agency. McKenzie-McHarg links conspiracy theorising to the age-old notion of the scapegoat. “Identifying the scapegoat gives you the opportunity to do something – to marginalise or persecute those who have been identified as the guilty party. It’s far better to be doing something than nothing.”
Brains and beliefs
Neuroscientist Professor Paul Fletcher traces conspiracy theorising behaviour back to an even more fundamental origin: the brain. His work on delusion and hallucination points to a conclusion both startling and persuasive: that conspiracy theories, like other forms of belief, arise from the natural function of the mind.
“The brain – our minds – have an impossible task in making sense of a world that is noisy and ambiguous,” he said. “We have to take short cuts. We have to add our own evidence rather than being a passive receptacle of the world. The majority of our beliefs and percepts are useful – but they are not a facsimile of what’s out there.”
In other words, conspiracy theories, like more sanctioned forms of belief, are our attempts to craft an explanation of the world around us. And what’s crucial in determining whether those explanations are acceptable or not is context. In many cultures, Fletcher said, it might be “perfectly reasonable” to believe that you were being communicated with by a godlike entity. Not so in Western society today.
Of course, diagnosable mental illness creates delusion and requires treatment. But as Fletcher points out, conspiracy theories arise from the processing of evidence in our brain, resulting in a conclusion that the majority of society does not share. And the opinions of that majority are formed in exactly the same way. “Beliefs are not the logical workings-out of evidence that we assume them to be – they are based on our own biases and assumptions.”
Bias, assumption and Patient Zero
And sometimes, those assumptions lead us astray. One striking example has been analysed by Dr Richard McKay, a Wellcome Trust Research Fellow, whose co-authored study of the Patient Zero narrative of the AIDS epidemic received global attention last year and will appear as a monograph later this year.
Among the study’s revelations was the fact that Patient Zero (an Air Canada flight attendant named Gaétan Dugas) was not the source of the outbreak, and that his designation as “zero” or “0” was a corruption of his abbreviated identifier in an early study as patient “O” – a study subject who resided “Outside of California”.
McKay echoes Fletcher’s assertion that bias and assumption are instrumental in creating beliefs – in this case, an entire hypothesis about the origin and spread of the AIDS epidemic. “When epidemics arise, there is a long history of trying to find out the first cases,” he said. “Overlapping with that, there have always been people trying to figure out, ‘How do we stop this?’ And then there’s another group focusing on ‘Why did that happen?’ and ‘Who can we blame?’ To an extent, which of those stories you subscribe to will depend on your world view, and which systems of belief and power you subscribe to.”
He believes the Patient Zero false narrative wasn’t a conspiracy theory, but rather a demonstration of how chosen narratives can win out over evidence, and an argument for the care with which we need to craft our explanations. This tendency partly explains why conspiracy theories are so very hard to disprove or dislodge: they are self-reinforcing, thanks to the information chosen for consideration in the first place.
Post-truth, post-factual and fake news
Understanding – and disseminating – how evidence can be used better is the task of one of Cambridge’s newest bodies, the Winton Centre for Risk and Evidence Communication, based in the Faculty of Mathematics. Executive Director Dr Alexandra Freeman said the centre is tasked with aiding key decision-makers and communicators, such as civil servants, doctors and journalists, to better grasp the evidence they use, and present it more effectively to their audience.
This could not be more relevant in 2017. It is a truism of the past year in politics that those responsible for clearly communicating evidence have failed to do so – or failed to cut through when they did. The 2016 EU referendum saw Michael Gove declare that “people in this country have had enough of experts”, while in the US presidential race, both the media and the public were fascinated by Donald Trump’s fluid use of fact. And in the wake of that acrimonious race, terms such as “post-truth”, “post-factual” and “fake news” have entered the mainstream. The need for a centre such as the Winton has never been clearer.
“I don’t think we can deal with conspiracy theorists directly,” Freeman explained. “There will always be those who don’t trust anyone, even the University of Cambridge’s Maths Department! Obviously we can try to establish our credentials for being trusted by being honest about everything we do, being open about the levels of uncertainty and disagreement in the evidence we are presenting, and providing reference sources for all our information.”
Conspiracy theories as critical commentaries
This is why Dr Nayanika Mathur’s fieldwork at CRASSH is so intriguing – she is studying the contrasting attempts to introduce biometric ID in Britain and India. In the UK, the proposals were scrapped in 2010; in India, the Unique Identification project, or Aadhaar, is proceeding despite opposition.
“The [Indian] government has branded much criticism of the UID as conspiratorial,” Mathur explains. “Yet at least some of the criticism is being levelled not by crackpots, but by respected legal minds and grounded activists. By taking seriously the so-called conspiracy theories around the UID, I am attempting to unsettle the common understanding of conspiracy theorists as deluded or crazy. That means I can look at conspiracy theories as critical commentaries on the world, from which we can learn a lot.”
We live in an age when even scientists investigating an epidemic can get the narrative wrong, and in which those who govern us often fail badly at communicating their evidence-based arguments. Meanwhile those labelled conspiracy theorists, said Fletcher, may reach conclusions that run counter to the mainstream, but, in shaping – and warping – the evidence to fit their strong beliefs, they are not doing much different from what we all do, all the time.
The danger arises when they refuse to accept facts that counter these beliefs, and stick instead to their own “alternative facts”. So, should we stop writing off conspiracy theorists, and instead view them – as Mathur does – as offering a critical commentary on our times? From Illuminati to ID cards, the latest crop of Cambridge research suggests that the history of weird ideas says as much about all of us as it does about the people who believe them.
This article first appeared on the University of Cambridge website.