Making Sense of Conspiracy Theories

Dec 4, 2020

Most conspiracy thinking does not create brand new theories, but instead assembles speculations out of existing narratives, images and fears. Like viruses themselves, conspiracy theories adapt to new environments, mutating and recombining strands of cultural DNA. People who already view the world through the lens of a conspiracy theory quickly interpret current events as part of that conspiracy. This is what is happening with the coronavirus pandemic. In these theories the identity and ultimate goal of the conspiracy are often hazy and shifting, but the basic story is that the pandemic is part of a much larger plot by a group of unaccountable and secretive elites who have been in control of events for decades, perhaps even centuries. This is one of the main narrative formulae of conspiracy theories, and we’ve seen versions of it before, from stories about the Illuminati to recurrent outbursts of antisemitism. As with most conspiracy theories, the coronavirus accusations often make a large, speculative leap from some known facts to a seemingly surprising conclusion, but a conclusion that is often known in advance. Unlike scientific theories that test a hypothesis against new evidence, conspiracy theories often go in search of factoids that will “confirm” an existing conclusion. Yes, for example, the Gates Foundation did provide funding to the Pirbright Institute in the UK to aid its work on a vaccine for a strain of coronavirus that affects poultry; and yes, Gates has long advocated and funded vaccination programmes for the world’s poor, and has taken a keen interest in promoting preparedness for potential pandemics, including a scenario-planning exercise coordinated by Johns Hopkins University in October 2019 that modelled the outbreak of a novel coronavirus. But neither of these verified facts warrants the conclusion that Gates is the mastermind of a fiendish conspiracy to impose mind control or genocide on the world’s population.

Alongside billionaire philanthropists such as Bill Gates and George Soros (with the latter often a dog-whistle call to antisemitic theories), the conspiracy theorists sometimes blame actual global institutions like the United Nations and the World Health Organisation, as well as imaginary ones such as the Illuminati (who were a real secret society of students promoting Enlightenment philosophy in Bavaria in the late eighteenth century, but who disappeared after they were banned by the authorities after little over a decade in existence). The sprawling community of QAnon believers – at first in the US, but now spreading rapidly in various European countries including the UK – soon wove the emergence of COVID-19 into their existing conspiracy fantasy: a lurid theory that a cabal of Satan-worshipping paedophiles – encompassing the so-called “Deep State,” leading Democrats, the mainstream media and the Hollywood elite – is secretly controlling the world. They are convinced that President Trump is fully aware of what is happening, and, when the prophesied moment is right, will unleash a second civil war (the “Coming Storm”), leading to the arrest and execution of all their enemies. The whole plot is supposedly being revealed in a series of cryptic messages posted on fringe online message boards inhabited by the alt-right by a figure called Q, an anonymous whistle-blower with top-level security clearance working within the intelligence agencies. The followers of Q obsessively speculate on the meaning of these obscure clues, which read as if they were a spoof of spy talk gleaned from trashy thrillers, but are now increasingly wrapped up in the language of evangelical prophecy. The pandemic has added fuel to the flames of the QAnon conspiracy theory, drawing more people into an online community that has since begun to spill over into real-world anti-lockdown demonstrations.

Hosted on libertarian, wilfully politically incorrect and deeply misogynistic platforms like 4Chan and 8kun, QAnon has since its emergence in 2017 allowed its followers to indulge in masculinist fantasies of armed resistance to those they view as an unpatriotic elite. Yet with the coronavirus pandemic this alt-right conspiracy community has begun to converge with the New Age wellness community. These new converts, many of whom are women, are concerned that the pandemic will lead to mandatory vaccination programmes. Their Instagram posts are not the hate-filled, sexist, racist and sarcastic memes that are the staple of alt-right message boards, but instead offer tear-jerking calls to #savethechildren from the evil conspiracy of paedophiles, all in pastel colours.

What are we to make of these conspiracy theories? As Hollywood films have long recognised, part of the appeal of conspiracy thinking is that dizzying, exhilarating moment of panic that even the hardened sceptic can succumb to: what if it’s all true? What if everything we are told is a lie? For most people, however, that initial rush of conspiracy thinking subsides. For those who are not true believers, it is then tempting to dismiss conspiracy theories as, at best, cynical shit-posting, or, at worst, the delusional ravings of a dangerous political fringe. But conspiracy theories are widespread, and they matter. Surveys show that most people now believe in at least one conspiracy theory. In the case of the coronavirus pandemic, for example, roughly a quarter of people in the UK and the US believe that the pandemic was deliberately planned. And belief is connected to behaviour. At the extreme end, there were over a hundred attacks on mobile phone masts in the UK and other European countries, fuelled by theories linking 5G signals to the virus. Perhaps more worryingly, a third of respondents in surveys say that they would refuse a COVID-19 vaccination, even if it had been approved as safe.

Why do people believe in conspiracy theories? How do conspiracy theories work? And what should we do about them?

There are three cardinal rules of conspiracy theory: nothing is as it seems; nothing happens by accident; and everything is connected. Conspiracy theories provide alternative explanations of significant happenings like wars, assassinations and plagues, and are usually presented in opposition to received wisdom. In some countries and regimes, however, they are the official version of events. Conspiracy theories usually start from visible effects in the present, and construct a story based on the conviction that someone deliberately planned to bring those events about. That well-known philosopher of history, Homer Simpson, concluded that “shit happens,” but the conspiracy theorist insists that there are no accidents or coincidences in history. Conspiracy theories ask “who benefits?,” and work their way backwards to identify the conspirators who must therefore have planned everything. If the coronavirus pandemic is likely to lead to some pharmaceutical companies making big profits by selling vaccines, the logic goes, they must have planned it in advance. Conspiracy theories are often (but not always) populist in outlook, seeing history as a struggle between the innocent people and the corrupt elites, by-passing the usual structures of party politics. In general, they divide the world into a battle between good and evil, insiders and outsiders, Us vs Them, finding convenient scapegoats to blame for complex problems. In some cases conspiracy theories serve to forge a sense of community: QAnon believers can resemble a cult at times, for example, and particular online conspiracy spaces can generate a powerful sense of being one of the enlightened few who are in-the-know. But often that sense of community and identity is constructed by blaming other groups for social ills. In more extreme versions, the conspirators are portrayed as evil, subhuman figures who will stop at nothing to achieve their devilish plans. Conspiracy theories are frequently apocalyptic in tone, insisting urgently that the future of the nation or the liberty of the people hangs by a thread.

Conspiracy theories are often accused of simplifying complex events. It’s true that they do tend to create simplistic overarching explanations. But at the level of detail, conspiracy theories often end up constructing phenomenally complicated accounts. One reason they do this is that they start from the assumption that everything is connected: even seemingly unconnected events and people are all part of a fiendishly convoluted plot. Unlike scientific theories, conspiracy theories are usually unfalsifiable. If you try to debunk them by pointing to the lack of credible supporting evidence, the conspiracy theorist will often claim that the lack of evidence is proof in itself: the conspirators are so all-powerful, the argument goes, that they have managed to cover up any trace of their existence. If people in the media, government or science seem to have evidence that undermines the theory, then they must be shills for the conspiracy. In this way, conspiracy theories become ever more elaborate, relentlessly incorporating any conflicting evidence into an ever larger plot, even if the fundamental story arc is depressingly simplistic and repetitive. For this reason, it can be incredibly frustrating to argue against conspiracy theorists, but you have to admire their ingenuity in providing an answer to any conceivable objection. Making the situation worse, conspiracy theorists often create a circular trail of reference: when you follow up their obsessive footnotes and links, you quite often find they refer to other conspiracy theorists, who in turn refer to others, and so on in a circle of citation that creates a veneer of credibility. What makes the situation more troubling now is that conspiracy theories often suggest that traditional sources and institutions of authoritative information – professional journalism, the law, the civil service, governing officials, science – are all part of the conspiracy. There is an increasing knee-jerk response to delegitimise all forms of expertise as corrupt and self-serving. In this situation, there is diminishing hope that appealing to facts and experts will cut any ice with a committed conspiracy theorist.

Arguing against conspiracy theorists is difficult not just because of the unfalsifiability of their views. It is also because in many cases their beliefs are an expression of a deeply held worldview. In the same way that people with a strong religious commitment often turn to theological arguments to help rationalise their emotional investment in their faith, so too do conspiracy theories serve as a way to justify strong feelings of resentment and injustice. (And it therefore makes sense that recognisably modern, all-encompassing conspiracy theories began to emerge in the late eighteenth century, at the moment when religious belief in providence, as an over-arching explanation of how everything has been plotted by God, began to wane.) Although for many people flirting with conspiracy theories is no more than idle speculation or cynical provocation, for some committed believers a conspiracist mindset is tied up with their life history and sense of identity. Many QAnon and alt-right conspiracy believers, for example, talk about “red pilling,” the moment when they came to feel that everything the mainstream media were telling them was a lie. Changing your mind about a conspiracy theory is therefore not simply a matter of revising your opinion about a set of disputed facts in the light of new evidence. It might mean unravelling your sense of who you are and how the world works.

Although there can be a kernel of truth in most conspiracy theories, they often make a speculative leap beyond what is warranted by the evidence. But even if they are not literally accurate, that doesn’t mean that they are completely unhinged from reality. In fact, in many cases conspiracy theories give voice to a distrust of the authorities and the powerful that is understandable. Conspiracy rumours about HIV/AIDS being created as a biowarfare agent to commit genocide on the African American population are unfounded, for example, but they speak to a long history of neglect on the part of the medical establishment and the US government (the most notorious example being the Tuskegee syphilis study, in which doctors continued to monitor the long-term effects of syphilis in a group of black men while withholding treatment, long after an effective antibiotic cure had become available). Likewise with some coronavirus conspiracy theories, it is not unreasonable to have concerns about vaccinations, or to have doubts about the government’s approach to balancing the demands of health and the economy in its response to the pandemic, or even to have misgivings about the financial incentives of multinational pharmaceutical companies. That doesn’t mean that the specific allegations are true, but conspiracy theories nevertheless often promise to explain What Is Really Going On.

The conspiracy theorist tends to adopt a stance of savvy, world-weary cynicism, always expecting the worst of officials and experts, all too ready to suspect anyone’s motives as corrupt. This default “hermeneutic of suspicion” has much in common with the politically progressive project of critique that also tries to delve beneath the confusion of surface detail to find the real sources of power that shape our societies. Indeed, many commentators have worried that precisely because critique has come to resemble conspiracy theory it has run out of steam. But at the same time the conspiracy theorist’s view of how history works is oddly naïve – gullible even. It can end up distracting us from a more convincing explanation of the world’s problems, and diverting political energies from actually doing something about them. Where those trained in social sciences see the complex interaction of social and economic forces, powerful institutions, ideological persuasion and conflicts of vested interests, the conspiracy theorist personifies those abstractions and focuses instead on a story of the intentional actions of a small, but hidden group of conspirators. For the social scientist, there is no need for a conspiracy theory to explain why, for example, the 1% succeed in shaping the world to their will. The elite as a social class with shared interests openly pursue their transparent goals of self-advancement, and it does not take a secret conspiracy of obscure plotters for them to be able to achieve this. In addition, experience suggests that what we’re witnessing with the pandemic is not the result of some four-dimensional chess (whether by Dominic Cummings or the Illuminati), but an omnishambles created by a government finding itself serially out of its depth, convinced of its superior wisdom and repeatedly resorting to cronyism.

Conspiracy theory, we might therefore say, functions as a form of pop sociology, with the crucial difference that (in Michael Butter’s terms) it engages either in deflection (it identifies the right issue, but blames the wrong people) or distortion (it latches onto the right group to blame, but for the wrong reasons). It is not surprising that the more that people feel powerless in the face of political, financial and technological vested interests, the more they turn to narratives involving powerful but shadowy agents behind the scenes pulling the strings. It might be scary and depressing to believe that there is a vast, evil conspiracy secretly controlling events, but that can be oddly comforting because it leaves open the possibility that the righteous might one day take hold of the levers of power themselves. There’s a New Yorker cartoon that sums up the position that a lot of us find ourselves in. We know that there is probably not a vast conspiracy that has made the world as fucked up as it is, but we can’t shake the nagging feeling that it sure looks as if someone planned it. The cartoon shows a lone guy protesting on the street with a placard that reads, “We are being CONTROLLED by the random outcomes of a complex system.”

For some people conspiracy theories undoubtedly fulfil psychological needs, especially in times of crisis, conflict or rapid social change. The stories of how a particular individual came to embrace full-blown conspiracism are often fascinating and moving. [Psychologists now tend to think](https://doi.org/10.1177%2F0963721417718261) that belief in conspiracy theories is not the product of abnormal psychology, but the result of cognitive biases that we all share to a greater or lesser extent, coupled with specific emotional and social needs. We are attracted to explanations that promise to make sense of the seeming randomness and complexity of current affairs; we like to feel that we are one of the clever few who have managed to see through the lies and manipulation; we are drawn to theories that make us feel not so powerless; and we reach out for compelling accounts of why our particular group or nation is being victimised. But these insights into the psychological mechanisms at work risk downplaying other social and political reasons why sizeable numbers of people are attracted to conspiracy thinking in particular historical moments. People believe in conspiracy theories not (or, not merely) because they are misinformed or stupid or crazy or their brains are hard-wired to see patterns, but because conspiracy theories fulfil the need to find someone to blame for genuine problems in society. However, we also now need to be alert to the possibility that there are malicious groups (both foreign disinformation units, and domestic political groups and alt-right trolls) engaged in campaigns of so-called coordinated inauthentic behaviour on social media to promote conspiracy theories and other forms of “problematic information.” Often the motivation is not to champion one particular alternative view but to sow the seeds of doubt about all evidence, science and expertise. The aim of polluting the online information environment is to increase distrust, stoke resentment and destabilise society, and this might well be the most damaging effect of online conspiracism.

Likewise, we need to think about the financial incentives of the “conspiracy entrepreneurs” who make a healthy living from promoting conspiracy theories, along with their side-line in snake-oil cures (e.g. “Miracle Mineral Solution” and “Colloidal Silver”). Professional charlatans such as Alex Jones and David Icke profit from peddling their speeches, books and other merch, and it was no surprise to find the latter veteran conspiracy-monger jumping on the bandwagon of the coronavirus pandemic with his ready-made conspiracy explanations that mix bizarre alien fantasies with all-too-familiar antisemitic myths. Finally, we need to be alert to the possibility that sometimes conspiracy theories are not the sincere expression of a deeply held belief, but a pragmatic, tactical stance that people adopt to help bolster other positions they do genuinely believe in. For example, research has shown that climate change conspiracy theories are often used strategically by those opposed to the political consequences of recognising climate change as real. If you are, as a matter of ideological faith, against government regulation of markets, then it’s politically convenient to claim that climate scientists are corrupt and it’s all a hoax.

The stereotypical picture of the conspiracy theorist is a socially awkward guy in his parents’ basement, a keyboard warrior wearing a tin foil hat. But research has shown that this clichéd portrait is not entirely accurate. In general, men are no more likely to believe in conspiracy theories than women, but it all depends on the particular example. Surveys show that most hard-core moon landing conspiracy theorists are men, for example, but anti-vaxxers are more likely to be women. In a similar fashion, there’s not that much difference in general between young and old, black and white, religious or not when it comes to conspiracy belief, but once again it depends on the particular case. The most significant difference comes with income and education: the richer and the better educated you are, the less likely you are to believe in conspiracy theories. In the case of coronavirus conspiracy theories, for example, a recent survey in the US found that 48% of those with only a high school level of education think it is probably or definitely true that powerful people intentionally planned the COVID-19 outbreak, whereas only 15% of those with a postgraduate degree think that is the case. The other significant predictor is that if you believe in one conspiracy theory, you tend to believe in many – which makes sense, if you start from the conviction that everything is connected.

But what about political belief: are those on the right wing more likely to believe in conspiracy theories than those on the left? Again, it all depends on context, not least where you live and what’s happening politically. Belief in conspiracy theories is often partisan, with people – unsurprisingly – more likely to believe in conspiracy theories about the authorities when the party they identify with is not in power. (The exception to this rule is Trump, of course, who promoted conspiracy theories about Obama and Hillary Clinton when he was on the campaign trail, and has continued to do so while in office.) Research in a number of countries indicates that in general conspiracy belief is higher at the extreme ends of the political spectrum. However, there are reasons to think that there is increasingly a connection between conspiracism and right-wing politics. If you think that, as Ronald Reagan famously said, government is the problem, not the solution, then it stands to reason that you might well view any encroachment of the “nanny state” into your personal life as part of a bigger conspiracy to deprive you of your freedoms.

Conspiracy theories have a long history, but have the internet and social media made them go viral? There are good reasons to think that the internet and conspiracy theory are made for one another. Not only is it simple for anyone to distribute professional-looking materials online with virtually no gate-keeping and at incredible speed, but it is now easy to find a like-minded audience in ways that were unthinkable in the past. Some commentators have suggested that conspiracy theorists often become trapped in digital “echo chambers” where they only engage with like-minded fellow believers. This is coupled with the power of search engine results to create a “filter bubble” effect, in which individuals only receive information that reinforces their blinkered worldview. While this is undoubtedly sometimes the case, the online world is far more diverse than the filter bubble and echo chamber theories suggest. Search engine results are rarely completely uniform, and online communities are seldom totally immune to outside influence. People’s media diets are in reality quite varied. When an echo chamber does emerge online, it is not necessarily caused by the inherent nature of the technology itself but by a process of social self-selection by participants that is also visible in the offline world. Likewise, there is a tendency to exaggerate the power of online communication, suggesting that viral memes – like actual viruses – can take over the mind and body of a vulnerable recipient, brainwashing them. Those who engage in online conspiracy communities are far from passive, and we therefore need to understand both their personal involvement and the group dynamics that particular platforms generate.

However, fuelled by the financial incentive of encouraging ever more divisive, emotive and engaging content, the recommendation algorithms of social media platforms can end up pushing some users down the rabbit hole of radicalisation. With their seductive rhetoric, conspiracy theories play a central role in this process. The social media companies have been slow to acknowledge the role that their platform design choices play in encouraging the spread of harmful misinformation and hateful extremism, hiding behind the defence that their algorithms are merely giving people more of what they like. But this ignores the tendency of the recommendation algorithms to promote content that is ever more extreme. In the case of Dylann Roof, who killed nine African Americans in a church in Charleston in 2015, detectives were able to reconstruct his browser history, showing his online journey into violent white supremacism. In the face of a public outcry about this and other mass shootings in which the gunman had clearly been heavily invested in online racist conspiracy-mongering, social media platforms such as YouTube began in 2019 to remove some conspiracist content and reduce its prominence by changing their algorithm. With the coronavirus pandemic, the platforms have taken a more proactive stance on content moderation, removing material that spreads harmful medical misinformation relating to COVID-19. In October 2020, for example, Facebook announced that it would ban ads that merely discourage people from getting vaccinated, tightening up its earlier ban on ads that actively promoted vaccine misinformation. But the volume, speed and viral spread of misinformation mean that often the platforms are trying to close the stable door long after the horse has bolted. Ultimately, their business model is based on stoking controversy to generate engagement and advertising revenue, and conspiracy theories fit the bill perfectly.

If, as we’ve been arguing, conspiracy theories are highly resistant to correction, no amount of fact checking, flagging mechanisms and promotion of accurate information on the part of the platforms is likely to make much difference. Those approaches are just as likely to make red-pilled conspiracy theorists dig in their heels, convinced that Silicon Valley is itself part of the conspiracy to suppress the truth. Conspiracy theories about the coronavirus are spreading not so much because people are unable to access vital information, but because they distrust official sources of information – even fact checkers. That doesn’t mean we should give up on putting out correct information about COVID-19 and linking to point-by-point debunking of conspiracy myths, but we need to be realistic: the facts won’t simply speak for themselves and win the argument.

So, what can we do about conspiracy theories in the time of corona? First, independent regulation of social media platforms is vital, although we have to recognise that it is not a panacea, and it needs to be nuanced. Outright deplatforming is sometimes necessary for content that clearly promotes hatred and violence, but making borderline problematic content harder to find, or demonetising it, might be enough to help stop some stories going viral. One of the investigations we are running on the Infodemic project is into the effectiveness of the various changes that internet companies have introduced during the pandemic. Social media platforms need to change their algorithms to ensure that they are not actively promoting harmful conspiracy materials, and they need to allow independent auditing of their black-box technologies. Second, we need to choose which battles to fight. Hard-core believers often make up only a small percentage of the total number of those who show an interest in a conspiracy theory, and they might well be a lost cause. It therefore makes more sense to engage with people who don’t fully believe in a theory, but don’t fully disbelieve it either. Teaching analytical thinking skills and digital media literacy are undoubtedly important tools in the fight against the pollution of the online information ecosystem, but they have their limitations. For one thing, conspiracy theorists often seem to have learned the lessons of information literacy all too well: they are the first to cast suspicion on a story in the press, pointing out the vested interests and the techniques of persuasion.

But this might give us our first way in. If you’re so sceptical, this line of engagement goes, then maybe you need to be a bit more sceptical about your own beliefs and sources of information, including taking a closer look at the financial motives of conspiracy entrepreneurs, and considering with a more sceptical eye what else would need to be true if there really were a secret cabal pulling the strings behind the scenes. Of course, there is no guarantee that this approach will have any effect, but it has the advantage of opening up a conversation, rather than instantly descending into a face-off of my facts against your facts. Establishing a sense of connection with a conspiracy believer is crucial. Tempting though it is to ridicule anyone willing even to entertain such ideas, we need to show a bit of empathy. We need to understand that conspiracy theories can be a way for people to give vent to a sense of grievance about the injustices of the world (or, at the very least, their own situation in life). Those grievances are often very real, even if the specific theories and scapegoats are wide of the mark. Conspiracy theorists are often motivated by a sense of justice or patriotism or anger that we all can identify with, even if we think that their explanations of what is happening are completely mistaken. We also need to recognise the pleasures and thrills of conspiracy theorising, to try to understand why these kinds of story are so appealing to so many people. It’s unlikely, however, that the popularity of conspiracy theories is going to diminish unless people have more reason to trust that we are all, genuinely, in this together.

This post is published under the terms of the Attribution-NonCommercial-NoDerivatives 4.0 International (CC BY-NC-ND 4.0) licence.