Beliefs in conspiracy theories can damage societies and individuals, but the only effective ways to counter them are impractical, according to a review
The vast majority of methods for quashing belief in conspiracy theories have little or no effect and the ones that do work are impractical. That is the conclusion of a review of 25 studies assessing various methods of tackling unfounded beliefs in secret plots.
Conspiracy theories, such as the untrue belief that coronavirus vaccines are a way to implant microchips, can affect people’s health or lead to antisocial behaviour, says Cian O’Mahony at University College Cork in Ireland. But while many studies have assessed ways of debunking false beliefs in general, few have looked specifically at conspiracy theories, he says.
They are particularly hard to debunk because anyone trying to contest them is seen as part of the conspiracy. “They say, ‘Of course, you will say that’,” says O’Mahony.
He and his colleagues decided to review the evidence so far to see what works and what doesn’t. They found just 25 studies meeting their criteria, which included defining conspiracy theories as involving a belief that something is being actively covered up for a nefarious purpose.
Methods such as presenting rational counterarguments, ridicule or labelling conspiracy theories as such aren’t effective at countering either specific conspiracy theories or people’s general tendency to believe them, the review concludes. In fact, one study found that the labelling method backfired by slightly increasing conspiracy beliefs.
Priming methods that aim to boost people’s critical thinking before they are exposed to conspiracy theories did work, but not very well – the effects were usually small.
What did work well was prebunking, or informational inoculation, in which people are told why a conspiracy theory isn’t true before being exposed to it. All studies testing inoculation found medium-sized or large effects.
But trying to “inoculate” people before they are exposed to conspiracy theories isn’t practical, says O’Mahony. It is also specific to each particular conspiracy theory. “It’s untenable to be able to constantly be updating people on the new conspiracies that are coming out,” he says.
The most effective method reported so far involved a three-month university course with weekly sessions in which students looked at the differences between sound science and pseudoscience. This course comes closest to what is needed: a kind of broad-spectrum vaccination against conspiracy theories based on teaching people how to think rather than what to think, says O’Mahony.
But few people are going to sign up for a three-month course and it could be that those who most need to attend such a course are the least likely to do so, he says.
This kind of research is still at an early stage and more needs to be done before, say, trying to introduce something like the university course in schools, says O’Mahony. One major issue is that no studies have done follow-ups in the weeks or years after interventions, so it is unclear if any of the effects persist.
Stephan Lewandowsky at the University of Bristol, UK, sees the results in a positive light. “I am not surprised that many of the effects are small given that conspiratorial attitudes present a particularly difficult nut to crack. Many believers are very committed to their theories,” he says. “I also think that even small effects may scale up: reducing sharing of a conspiracy theory early on by a few percentage points may be sufficient to disrupt a cascade.”
Lewandowsky also says that inoculation isn’t necessarily limited to specific conspiracy theories and can be rolled out at scale on social media. His team demonstrated this last year in a study involving about 22,000 people on YouTube, and Google recently ran a large-scale inoculation campaign in eastern Europe, he says.
Journal reference: PLoS One, DOI: 10.1371/journal.pone.0280902