Mistrust Makers: The Structure of the Kremlin's Disinformation and Astroturfing Campaigns

Re: Russia
In recent years, Russia has become the world's largest producer of astroturf: content based on lies is created and then disseminated by armies of trolls and bots, creating the illusion of mass demand for the Kremlin's alternative populist agendas. A new investigation by The Insider reveals the mechanism used to flood TikTok with provocative content via a network of corrupt bloggers. The bigger question, however, is how effective these campaigns are. According to research, their direct impact is minimal, because this content circulates mainly among people who already hold such views. Their secondary effect, however, is no less significant: rather than changing the minds of those who hold opposing views, these campaigns sow mistrust and a sense of vulnerability, undermining confidence that normative, liberal social structures can be sustained.

The Kremlin has used large-scale social media disinformation campaigns as a weapon in its confrontation with the West, as well as a tool of political control within the country, successfully supplementing television, which has been losing its grip on the information space for years. The number of posts from EU countries justifying Russia's actions increased manyfold in the early days of the Russian invasion of Ukraine. A study conducted by Hungarian experts, previously covered by Re: Russia (‘Kremlin trolls never sleep’), examined the agendas and narratives pushed on Facebook by the Kremlin during the invasion, broken down by country.

The Insider's recent investigation, ‘TikTok in the Service of the FSB’, reveals how the authorities have fostered a generation of paid bloggers on the social network popular among teenagers, flooding it with hashtags like ‘RussiaForward’ and other Z slogans. The bloggers praised the Russian president's wisdom and persuaded viewers that the sanctions were ineffective. ‘The service's own statistics show that in the first few months, TikToks with the hashtags #zанаших (for our boys) and #мненестыднo (I am not ashamed) garnered over 2 billion views. TikToks promoting the Wagner PMC have received a billion more views on top of this,’ the investigation's authors write. The investigation also revealed that the average cost of a video on a given topic is around 100 euros. TikTokers are willing to accept ‘state orders’ for a series of videos, working with their customers to approve scripts.

The scale of such Kremlin campaigns has long caused concern among Western political establishments, prompting a wide-reaching investigation into Russian meddling in the 2016 US presidential election. At least 32 million Twitter users in the United States were potentially exposed to content from Russia-sponsored accounts in the eight months leading up to the election. According to Facebook, 126 million users may have viewed Kremlin-sponsored content. Because roughly 3.5 times as many Americans used Facebook as used Twitter in 2016, the share of users exposed to Kremlin-sponsored content on the two platforms was roughly the same. Researchers and politicians agree that the Kremlin's extensive campaign aimed, on the one hand, to influence the behaviour of American voters by swaying social media users in Trump's favour and, on the other, to undermine American democracy as a whole by encouraging polarisation in American society.
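The claim that the exposed share was ‘roughly the same’ on both platforms follows from simple arithmetic. A minimal sketch, using only the figures given above (the 3.5× audience ratio and the two exposure counts; absolute platform user counts are not given, so the calculation works in relative terms):

```python
# Illustrative check of the exposure arithmetic reported above.
twitter_exposed = 32e6     # US Twitter users potentially exposed
facebook_exposed = 126e6   # US Facebook users potentially exposed
fb_to_tw_user_ratio = 3.5  # Facebook's US audience relative to Twitter's, 2016

# Normalise Facebook's exposure figure to a Twitter-sized audience
facebook_exposed_normalised = facebook_exposed / fb_to_tw_user_ratio

# A ratio close to 1 means a roughly equal share of each platform's
# audience saw Kremlin-sponsored content
print(facebook_exposed_normalised / twitter_exposed)  # 1.125
```

In other words, once Facebook's larger audience is taken into account, the exposed share of its users comes out only about 12% higher than Twitter's, consistent with the ‘roughly the same’ characterisation.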

Despite the size of the Kremlin's social media campaigns, a question still remains over their effectiveness: their actual impact on the behaviour of American voters. A large-scale study recently published in the journal Nature addresses these concerns. Social media users were exposed to an order of magnitude more content from other sources: in the final month of the election campaign, respondents saw an average of four posts per day from Russian sources, compared with 106 from US media and 35 from US politicians. Examining the number of posts from Russian accounts seen per user per week, the researchers found that the median exposure to Russian accounts was zero in the final month of the election. This suggests that the reach of the Russian social media campaign was concentrated among a small number of users.

This assumption is supported by an examination of the cumulative distribution of exposure to messages from Russian sources among respondents: 1% of social media users accounted for 70% of all exposures to content from Russian accounts, while 10% of users accounted for 98%. The findings also show that the stronger a user's identification with the Republican Party, the more posts of Russian origin they saw in their feed. According to the researchers, this targeting appears to be entirely ineffective: to increase votes in Trump's favour, the campaign should have primarily targeted the undecided and wavering, rather than those who were almost certain to vote for him already. The researchers also examined the relationship between seeing Russian-origin posts and changes in respondents' positions on eight key US policy issues (the Affordable Care Act, raising customs tariffs on Chinese products, building a border wall with Mexico, calls to bar Muslims from entering the US, and so on), as well as their perceptions of societal polarisation. They found no statistically significant evidence that content from Kremlin-backed accounts influenced or changed user attitudes.

Interestingly, Russian researchers studying the Kremlin's astroturfing strategies in the Russian social media space came to very similar conclusions. According to a paper published on Re: Russia, most of the activity of bots and trolls was observed in networks and spaces where users already expressed extremely pro-Kremlin views. As a result, while the large-scale influx of this fabricated content does not increase the number of Kremlin supporters, it does serve the Kremlin indirectly by demoralising opponents, who are confronted with fabricated signals of ‘mass demand’ and a distorted picture of the distribution of opinion among the population.

According to the study published in Nature, there is no statistically significant relationship between the intensity of a campaign and changes in the distribution of user preferences. However, the authors note that such social media disinformation campaigns can achieve some success through their second-order effects. The Kremlin's actions elicited a negative public reaction, fuelling a debate over Trump's legitimacy and growing distrust in the electoral system as a whole. As a result, many Americans were later willing to believe that there had been instances of election fraud in 2020.

In other words, the Russian social media campaign in 2016 succeeded in convincing Americans that its scope and effectiveness were greater than they actually were, creating a sense of vulnerability, opacity, and mistrust.

However, the researchers caution that foreign actors may in future adapt their social media behaviour to achieve more meaningful effects, and that the study's findings should be applied with care when assessing the effectiveness of other disinformation campaigns. Furthermore, the ‘reflected’ effect of such campaigns should not be underestimated: while they fail to ‘change’ the views of those who hold normative positions, they undermine those people's trust in society and their belief that normative, liberal social orders can be sustained in the long run.