Conspiracy theories are about to go viral in new, murkier ways

Online conspiracy theories are surprisingly convincing – and present significant danger to the real world. At Jigsaw, a division of Google focused on countering digital extremism, cyberattacks and misinformation, we conducted more than 70 in-depth interviews in 2020 with people in the US and UK who believe the Earth is flat, that school shootings were government plots and that white populations are being intentionally replaced by non-whites.

Those beliefs lead to changes in behaviour: we found that people who believe disinformation about the origin of Covid-19 shun face masks and ignore social distancing. In 2021, anti-vaccination activists will use the internet to warn about what they see as nefarious motives behind any pandemic-vaccination programme.

Social-media platforms have now started to respond to Covid-related, racist and xenophobic conspiracy theories by removing associated content from their sites. This will protect unsuspecting people from stumbling across it. But the propaganda itself will not disappear. In 2021, these ideas will resurface on “alt-tech” networks such as Gab, Telegram and 8kun, platforms that market themselves as being “anti-censorship” (read: unmoderated) and for “free speech” (read: hate speech welcome). For the first time since the dawn of the social-media age, online content will go viral elsewhere.

Moving dangerous content into the “alt-tech” world will present two dangers. The first is that the inaccessibility of these subjects could encourage people to seek them out. Spreaders of conspiracy theories will have the opportunity to sensationalise a “censored” video (“what they don’t want you to see”) – what is known in online marketing as a “curiosity gap”.

The second is that, ironically, the displacement of those ideas to more fringe platforms could help them spread, shielding them from the scepticism they would otherwise face from mainstream audiences.

Our interviews with hardcore conspiracists revealed that believers feel a need to signal in-group status by voicing agreement, and even to engage in one-upmanship by expanding on the conspiracy.

In 2021, we may feel reassured that social-media companies are taking on a limited role as moderators. However, it is likely that the narratives with the greatest potential to cause harm will thrive away from the major platforms and, as QAnon did in the US, find alternative routes into mainstream consciousness. Just because we can’t see them doesn’t mean they will stop posing a threat.

Yasmin Green is director of research and development at Jigsaw, a division of Google
