We researched Russian trolls and figured out exactly how they neutralise certain news

  
Credit: BeeBright/Shutterstock

Russian "troll factories" have been making headlines for some time. First, as the Kremlin's digital guardians in the Russian blogosphere. Then, as subversive cyber-squads meddling with US elections.

While there has been much sensationalist talk about troll brigades, there have also been thorough investigations of first-party sources and genuine leaks. Indeed, some (mostly former) Russian trolls have been willing to talk.

We now know that at least some of those who have come out from the shadows were not taking the political agenda they were tasked with promoting all that seriously. We also know, in some detail, the internal organisation and work schedule of the so-called "troll farm" Internet Research Agency – where most whistleblowers used to work. As well as quantity-oriented commenters and bloggers, the agency employed skilled researchers who spoke foreign languages and undertook high-quality investigative work.

A few statistical analyses of large samples of trolling posts also show that institutionalised political trolling and the use of bots have become a consolidated practice that significantly affects the online public sphere.

What has been shrouded in mystery so far, however, is how institutionalised, industrialised political trolling works on a daily basis. We have also lacked a proper understanding of how it affects the state's relations with society generally, and security processes in particular.

Neutralising trolls

For our recently published research, we wanted to understand what pro-Kremlin trolling does and how it works in the Russian blogosphere. We analysed how investigative journalism about trolling itself gets trolled, worked our way through the trolling trails generated after the assassination of Boris Nemtsov – Russia's unofficial opposition leader – and interviewed a former employee of the Internet Research Agency in a series of online chats.

During this research we found a distinct phenomenon, which we called "neutrollization". This authoritarian practice co-opts trolling – in principle an anti-establishment (if inflammatory) activity – and turns it into a method of regime consolidation.

Neutrollization prevents civil society's attempts to expose the regime as a security threat by creating conditions where political mobilisation becomes absurd, so any risk to the regime is neutralised. Meaningful political engagement only "feeds the troll" – that is, it gets sucked into the trolling spiral of ironising the public sphere.

Trolls in action

Unlike conventional propaganda operations, neutrollization does not advocate a distinct political agenda. Pro-Kremlin trolls generate stupefying noise through internet activism that appears to originate from ordinary citizens. They spread various conspiracy theories and create a quasi-political, yet completely hollow, public space with a multitude of diverse but prefabricated opinions that jam the web.

This is precisely how some sections of the Russian blogosphere were neutralised after the assassination of Boris Nemtsov. In March 2015, the newspapers Moy Rayon and Novaya Gazeta leaked a list of more than 500 troll accounts, together with the instructions the trolls had been given on how to approach the event. The papers also published lists of corresponding keywords that the trolls were told to use in order to facilitate searchability.

The instructions included proliferating the view that the murder of Nemtsov was a provocation and that it was of no benefit to the official authorities. Trolls were also told to broadcast the PR benefit the opposition allegedly gained from the death of their comrade, and to allege the involvement of Ukrainians in the assassination. In addition, they were told to criticise Western interference in Russian internal affairs, and to suggest that the murder was being used as an excuse to put pressure on the Russian Federation.

The objective, in other words, was not to put the blame on any concrete political opponent. The interest was not in finding an actual assassin. The logic was to saturate the discussions with so much contradiction and filth that any bona fide user felt disillusioned and despondent. This flooding effect deters the audience from taking anything seriously.

Vitally, neutrollization plays on citizens' own critical faculties by first drawing them in and then confusing them. It is not about merely pulling the wool over their eyes, and it has little to do with coercion or silencing. Instead, it exploits and twists the idea of self-expression and citizen action in a way that leads to withdrawal from politics.

Unlike the more common forms of propaganda – in which mass media encourage support for the political system – neutrollization encourages cynicism. All the while, trolls preserve the semblance of sincerity and authenticity by following their instructions. They cannot be "convinced", as their task is to implode any meaningful conversation.

This position makes it nearly impossible to blow the whistle on a troll. But exposing trolls as professionals of nihilism is insufficient anyway. They are but precarious labour in a powerful political strategy.

Neutrollization isn't confined within Russia's borders. It is increasing internationally, too. The deployment of bots to disrupt political dialogue is just one example of the spillover. And while this does not have the same power as an operation backed by the trolled nation's own government, the strategy can still wreak havoc.
