Tackling Conspiracy Theories on Social Media During the Pandemic

Apr 8, 2021

Should we be worried about the spread of conspiracy theories on social media?

There has undoubtedly been a significant increase in the visibility and reach of conspiracy theories and other kinds of “problematic information” in the online environment. But we need to make sure that we don’t give in to unfounded alarmism. Authoritative health institutions and traditional media have remained the most important sources of information, and trust in science and scientists has increased during the pandemic. As we saw in the last blog post, misinformation forms only a small part of the overall information ecosystem in the pandemic, and conspiracy theories are only a small part of that misinformation—according to one study, conspiracy theories make up only 29% of the vaccine hesitancy discourse in English-language online spaces (although the figure is 59% in French-language spaces).

In terms of volume, conspiracy theories are comparatively minor, but their visibility and influence are far greater. During the pandemic, people who have previously shown little interest in conspiracy theories have encountered them at a scale rarely seen in the past. Despite efforts by the social media platforms to promote authoritative health information in search results, their recommendation algorithms continue to push people towards more extreme content and groups. For example, our research has shown how a simple query on Amazon for books on coronavirus returns mainly works of conspiracy theory, such as books by David Icke. Conspiracy theories are a particularly seductive kind of misinformation, partly because on social media they involve active participation: the QAnon movement, for example, gained passionate followers because those followers were helping to construct the conspiracist interpretation of events.

It is hard to change people’s minds about conspiracy theories because they are not usually the result of a lack of information, or of faulty information. They are therefore not easily combated with debunking, though fact checking is still important in the overall struggle against the misinfodemic. Instead, conspiracy theories are appealing because they provide a narrative that claims to make sense of everything in uncertain times. Part of their appeal comes from belonging to a community of like-minded people, and the infrastructure of each social media platform creates its own distinctive communities.

Conspiracy theories start from the assumption that nothing is as it seems, that nothing happens by accident, and that everything is connected. They are often an expression of a deeply held worldview that is tied up with a person’s sense of identity, even if the desire to believe is cynically exploited by “conspiracy entrepreneurs” who (as we explained in a previous blog post) monetise scepticism, and by hate groups which are using the pandemic to recruit new followers. Challenging conspiracy theories is therefore difficult because in effect you are challenging someone’s identity, rather than simply correcting a false piece of information. In addition, the logic of conspiracy theories is often circular and irrefutable—there is nothing you can say against a conspiracy theory that a committed believer could not reinterpret as further evidence for it.

The nature of conspiracy theories has changed in the last two decades. Instead of trying to persuade you to believe something that’s not true, conspiracy theorists now try to persuade you not to believe something that is true. Increasingly, the function of conspiracy theories is to delegitimise and disorient, by undermining our faith in scientific expertise, an impartial media, and democratic governance. This is partly a result of the growing influence of social media, but also of the rise of foreign disinformation campaigns on the one hand and of home-grown populism on the other. Traditional sources of authoritative knowledge will be dismissed by the committed conspiracy theorist as themselves part of the conspiracy of elites. The real danger of conspiracy theories on social media during the pandemic is not a particular piece of misinformation here or there, but the pollution of the information ecosystem more generally.

Conspiracy theories are not caused by a peculiar psychological condition. They are fairly widespread in most societies, with surveys showing that over half the population believe in at least one conspiracy theory. For example, about 30% of people in the UK believe that the world is secretly ruled by a small group behind the scenes, while in the US that figure is as high as 40%. In the case of the pandemic, according to a survey conducted by Ipsos/KCL, 15% of people in the UK think the purpose of the vaccine is to track and control the population, with another 15% undecided. For those who get their news mainly from social media, the figure is between 30% and 40% (depending on the favoured platform), with the highest rates among young people. Belief that medical authorities are deliberately hiding information about the harms caused by vaccines runs to approximately 20% in the UK, but conspiracy-minded anti-vaxx sentiment is higher in other countries (e.g. in France it is closer to 40%). However, since the rollout of the vaccine in the UK, vaccine hesitancy (whether conspiracist or not) has fallen to 10-15%, although there are significant variations between different age groups and ethnic communities.

Instead of dismissing conspiracy believers as simply paranoid and delusional, we need to recognise that there can be legitimate concerns about, say, lockdown policy or vaccine safety. We also need to understand why particular conspiracy stories resonate—why, for example, some ethnic minorities might have good reason to be suspicious of government or medical authorities. More generally, conspiracy theories are often the result of a sense of resentment against the elites and grievance about a perceived loss of status, sentiments which we would be foolish to ignore. We cannot hope to effectively combat the growth of conspiracy theories unless we understand the underlying reasons for the erosion of trust in science, politics and the mainstream media.

What can we do about the spread of conspiracy theories on social media?

The pandemic has produced a potential tipping point in the regulation of online misinformation. Social media platforms have moved considerably towards taking action in the last two years, first because of reputational damage in the wake of high-profile mass shootings and disinformation in election campaigns, and now because of the urgency of the pandemic. Some of the platforms have made progress in removing content that could lead to immediate harm, both medical and political. They have promoted authoritative health information, added content warnings, and deplatformed individuals and groups in some extreme cases. Yet their measures are not as effective as they claim to be. Research has shown, for example, that much deplatformed content is still easily available, with posts on the mainstream platforms linking to purged content and groups that reappear elsewhere. While deplatforming can be effective in the most extreme cases, it does little to persuade conspiracy theorists to change their worldview; instead, it often confirms their sense of persecution. For this reason, demoting harmful content in recommendations and demonetising repeat spreaders of problematic content are better strategies than deplatforming.

Focusing on content moderation ignores the fact that social media platforms have fuelled the problem, because they have a financial incentive to stoke controversy. They are not an altruistic public sphere, but ad-delivery engines that rely on recommendation algorithms to maximise engagement. Rather than trying to automate the removal of individual posts, we need to focus on infrastructure design rather than on content alone.

The pandemic has shown that self-regulation by platforms cannot solve the problem. Instead, governments need to introduce regulatory measures to force platforms to uphold the kinds of standards we expect from traditional media and other public fora. In that regard, the UK’s Online Harms White Paper and the EU’s proposed Digital Services Act are promising. They would also require the platforms to be more transparent, by allowing researchers and regulators unrestricted access to data to independently verify the platforms’ claims about the effect of their interventions.

At the same time, we need to ensure greater transparency and honesty on the part of scientists, the media and the government. This involves admitting the limits of our knowledge and acknowledging when we get things wrong, not least because many conspiracy theories today are about scientific and political elites operating in cahoots with one another. It also involves actively building trust rather than merely asserting authority. More than anything, we need to understand why some people feel so disenfranchised and disillusioned that they turn to conspiracy theories.

This post is published under the terms of the Attribution-NonCommercial-NoDerivatives 4.0 International (CC BY-NC-ND 4.0) licence.