CLAMPING down on conspiracy theories may not help tackle extremist views online; instead, it might cause them to proliferate.

Shruti Phadke at the University of Washington in Seattle and her colleagues analysed 6 million posts from 60,000 people on social news aggregation site Reddit, as well as their memberships of user-created communities called subreddits, in an attempt to identify the roots of online radicalisation. The profiles of all the people were broadly similar, but half of them were members of at least one subreddit focused on discussing political and scientific conspiracy theories.

Phadke’s team found that downvoting or banning users for voicing controversial or inaccurate views was sometimes a precursor to people joining a conspiracy group, where they then faced little pushback and were further radicalised. Almost 9000 of those who eventually joined conspiracy groups had faced some form of moderation, such as having posts removed, whereas only about 3000 of those who didn’t join such a group had.

Having content moderated made it 6 per cent more likely that someone would join a conspiracy group. Having posts downvoted by other users made it 19 per cent more likely (Proceedings of the ACM on Human-Computer Interaction, doi.org/frvj).

“It’s as if they’re being shunned by other communities, getting ostracised, and then they go into these conspiracy communities and find a home for their thoughts,” says Phadke.

She believes that the solution is to make moderation explainable and to use “gentle nudging”, such as steering anyone expressing fringe views to reputable sources.

New Scientist asked Reddit about the findings but didn’t receive a comment.

The difficulties of moderating extreme or inaccurate views online have long been apparent. Conspiracy theories such as QAnon have proliferated online, and former US president Donald Trump received a lifetime ban from Twitter this month after his tweets fell foul of its terms of service.

During the early stages of the pandemic, social media platforms such as Twitter and Instagram began adding links to authoritative sources alongside users’ posts about covid-19. Amid widespread criticism for allowing misinformation to spread, they have since also started banning content that they deem particularly harmful.

Jaron Lanier, a researcher at Microsoft Research and author of Ten Arguments for Deleting Your Social Media Accounts Right Now, says that banning “is the only thing that’s worked at all, as uncomfortable as it is”.

When Facebook banned the far-right group Britain First in April 2019, for example, the group was forced to rely on smaller social media sites such as Gab. On Facebook, the group had 1.8 million followers; on Gab, it still has only about 12,300.

“Over time you do reduce the threat to society,” says Lanier.