Researchers Have a Method to Spot Reddit’s State-Backed Trolls

The playbook to subvert democracy and sow dissension often starts with social media. And it happens on sites like Reddit, through accounts such as the aforementioned Bootinbull and JerryRansom, both of which were identified by Stringhini and his colleagues as trying to drip-feed a controversial message while using a stream of more regular, mundane posts as cover. Like Bootinbull, JerryRansom posted the same cute animal photos and 4chan-baiting memes, then gradually slid into political discourse, with added posts in r/sexygirls. Notably, many of the accounts that Stringhini says behave like those definitively linked to Russia have previously posted on r/aww, which encourages users to share photographs that might prompt an “aww”-like response, often of cuddly animals.

The way the troll accounts behave can be discerned through what Stringhini calls “loose coordination patterns.” Less sophisticated bot accounts can be identified through timing and the type of content they post, because they often pump out the same message from a number of different Twitter accounts that have either been specially created for the purpose or co-opted from innocent patsies through cyberattacks that steal their login details. Troll accounts, by contrast, require deeper analysis.
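To make the contrast concrete, the “blunt” bot signal described above can be sketched in a few lines: several accounts pushing near-identical text within minutes of one another. The sketch below is a hypothetical illustration, not Stringhini’s actual method; the field names, toy data, and 10-minute window are assumptions.

```python
# Hypothetical sketch: flag "copy-paste" coordination, where multiple accounts
# post near-identical text within a short time window. Thresholds and data
# are illustrative assumptions, not the researchers' real pipeline.
from collections import defaultdict
from datetime import datetime, timedelta

posts = [
    # (account, text, timestamp) -- toy data
    ("acct_a", "Vote no on the referendum!", datetime(2016, 9, 1, 12, 0)),
    ("acct_b", "Vote no on the referendum!", datetime(2016, 9, 1, 12, 4)),
    ("acct_c", "Cute puppy tax",             datetime(2016, 9, 1, 13, 0)),
]

WINDOW = timedelta(minutes=10)  # assumed "tight timing" threshold

def find_copy_paste_clusters(posts):
    """Group posts by normalized text; keep groups spanning 2+ accounts
    whose posts all land within WINDOW of each other."""
    by_text = defaultdict(list)
    for account, text, ts in posts:
        by_text[text.strip().lower()].append((account, ts))

    suspicious = []
    for text, entries in by_text.items():
        accounts = {a for a, _ in entries}
        times = sorted(ts for _, ts in entries)
        if len(accounts) >= 2 and times[-1] - times[0] <= WINDOW:
            suspicious.append((text, sorted(accounts)))
    return suspicious

print(find_copy_paste_clusters(posts))
# [('vote no on the referendum!', ['acct_a', 'acct_b'])]
```

Troll accounts defeat this kind of check precisely because they post varied, human-written content on their own schedules, which is why the deeper behavioral analysis below is needed.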

The troll method, which involves real human beings behind the accounts rather than preprogrammed bots, has become more popular as the old, blunt automated tools lose their power. “There’s less obvious use of bot networks now, because I think they have been so documented, and people expose them all the time,” says Eliot Higgins, founder of Bellingcat, which uses open-source intelligence to document and uncover such campaigns, often with a focus on Russia. “Trolls tend to be more impactful, because then they’re taking advantage of the kind of natural development of these communities online, rather than trying to build something from scratch, which is a lot more difficult to do.”

Instead, Russian trolls build up fake personas online, trying to ingratiate themselves into preexisting Reddit communities, and then move the conversation on to their true aims. Like Bootinbull and JerryRansom, they start with innocuous posts about dogs and other animals before pivoting to geopolitics. The goal is to make the person behind the accounts seem more realistic, and more human, which makes it easier to seed the more contentious content. Their focus, according to Stringhini and colleagues’ research, is on fractious social issues: Troll accounts leveraged the divide over Black Lives Matter and over US presidential elections, arguably becoming a factor in propelling Donald Trump to victory over Hillary Clinton in 2016. Posts from Facebook troll farms were seen by 140 million Americans ahead of the 2020 election, according to an internal document compiled by the social media platform. Yet they also focus on topics that help them ingratiate themselves with the Reddit community. “They are strictly pro-cryptocurrencies, and they advocate for it on social media,” says Stringhini, “while at the same time the same account may push some political discourse as well. They try to blend in.”

It’s all part of the handbook state-sponsored Russian trolls are given to operate under, and one that is increasingly commonplace across a number of different countries. Stringhini points to Russia, China, Venezuela, and Iran as nations trying to shape conversation through organized social media troll campaigns. Yet despite the trolls’ attempts to give off a veneer of normality, the academics have found some tells that could suggest inauthenticity. Troll accounts tend to post less than 10 percent as many comments as a “real” Reddit account, based on a random sample, suggesting either that the pretense of reality is difficult to keep up for long or that they give up when they think their work is done. Conversely, they’re more willing to broadcast than to take part in conversation: They make an average of 42 submissions during their lifetime, compared to a non-troll account’s 32. Most tellingly, much as deep-cover spies are often exposed because they end up meeting known spies at a dead drop, state-sponsored social media trolls are often found out because they can’t help but post on each other’s threads.
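For illustration only, the tells described above (sparse commenting, submission-heavy posting, and showing up in other known trolls’ threads) could be checked with a rough heuristic like the one below. The baseline comment count, thresholds, and toy data are assumptions for the sketch, not the researchers’ actual criteria.

```python
# Hypothetical heuristic for the behavioral tells described in the article.
# BASELINE_COMMENTS, the toy accounts, and the scoring are assumptions.

BASELINE_COMMENTS = 100   # assumed typical comment count for a "real" account
KNOWN_TROLLS = {"Bootinbull", "JerryRansom"}

accounts = {
    # name: (comments, submissions, set of accounts whose threads it commented in)
    "Bootinbull":   (6,   45, {"JerryRansom"}),
    "JerryRansom":  (8,   40, {"Bootinbull"}),
    "regular_user": (250, 30, {"some_stranger"}),
}

def troll_signals(name, comments, submissions, commented_on):
    """Return the list of heuristics an account trips."""
    signals = []
    if comments < 0.10 * BASELINE_COMMENTS:       # under 10% of typical comment volume
        signals.append("low_comment_volume")
    if submissions > comments:                    # broadcasts more than it converses
        signals.append("submission_heavy")
    if commented_on & (KNOWN_TROLLS - {name}):    # posts in known trolls' threads
        signals.append("interacts_with_known_trolls")
    return signals

for name, (c, s, links) in accounts.items():
    print(name, troll_signals(name, c, s, links))
# Bootinbull ['low_comment_volume', 'submission_heavy', 'interacts_with_known_trolls']
# JerryRansom ['low_comment_volume', 'submission_heavy', 'interacts_with_known_trolls']
# regular_user []
```

No single signal is damning on its own; the point, as with the dead-drop analogy, is that the combination, and especially the cross-posting between suspect accounts, is what gives the network away.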