Bots on Russian Social Media: How Network Propaganda Works

Alesya Sokolova, Novaya Gazeta Europe / Re: Russia

Bots and trolls are entities we encounter every day, usually without noticing or realising it. While their numbers relative to the total user population may not be very high, their activity, consolidated stance, and aggressiveness allow them to capture and moderate discussions, triggering an effect similar to the ‘spiral of silence’ phenomenon. Bots and trolls are tools of horizontal network propaganda, designed to distort real users' perceptions of the nation's ‘imaginary majority’. How are propaganda bots and trolls organised? How many of them are there? How do they behave, and what do they write about? Who do they impersonate? The workings of propaganda trolling and bot propaganda in the online environment, particularly in Russia's most popular social network, VKontakte, are the subject of a joint study by Re:Russia and Novaya Gazeta Europe, different parts of which are published today on both platforms.

Read the overview of the study, ‘Neural Networks for Putin’, on the Novaya Gazeta Europe website.

In the early 2010s, social networks solidified their place in public life, becoming spaces of freedom and communicative openness. Since the Arab Spring and the Russian protests of 2011-2012, they have played a significant role in mobilising opponents of autocracies, making it extremely easy to find and rally like-minded people. A little later, from the mid-2010s, they also began to serve as a distributed channel for delivering alternative and uncensored information, transforming into social media that organise how users navigate content. As a result of these properties, they became a major problem for authoritarian regimes.

However, since the mid-2010s, social networks have also become an arena for autocracies' counter-attack, as regimes mastered new propaganda techniques such as the propaganda of complicity and astroturfing (campaigns simulating ‘popular opinion’). Essentially, this marked the onset of a new era of network propaganda. Social networks were populated by bots and trolls, distorting the picture of the real social environment, its moods, agendas, and preferences. Propaganda mimicked the opinion of the ‘average’ person. The propagandistic message was no longer just broadcast from a television screen but came to the user in the form of a reaction to a message from an imaginary ‘neighbour’ in a network discussion, thus provoking complicity. In other words, the vertical model of propaganda (top-down messaging) was supplemented by horizontal propaganda of complicity, organised by bots and trolls. As a result, when entering a social network, the user encounters a picture of ‘public opinion’ significantly distorted by this injection of propaganda, which forms a skewed image of an ‘imaginary majority’.

Moreover, bots are not the only tool for distorting the network space. As Grigory Asmolov has noted in his article on Russian online propaganda, repressive authoritarian regimes use a combination of three tools that work in tandem, complementing one another (→ Grigory Asmolov: Propaganda in the Network Environment). The first is the repression of real users for disseminating undesirable content: according to the Network Freedoms Project, in 2022 alone, 779 people were prosecuted for social media posts. The second is censorship, which operates by blocking unwanted content (see Citizen Lab's analysis of VKontakte's content blocking strategy). These two tools are designed to suppress the ‘wrong’ opinion and minimise the ‘audibility’ of the corresponding position. The third tool, bots and trolls, is designed to moderate the discussion, reacting to ‘wrong’ messages and ‘saturating’ the network space with ‘correct’ content that mimics ‘popular’ opinion.

Population and troops: ‘evil’ Kremlebots

According to the Levada Centre, VKontakte is visited by 66% of Russians who use social networks at least occasionally. The Mediascope research centre has provided an even higher estimate: 71% of Russians visit this social network at least once a month. According to this data, people aged 24 to 44 spend the most time on VKontakte.

Furthermore, VKontakte, which is fully controlled by the Russian authorities, is the platform most integrated into the Kremlin's political management machine. In 2021, Vladimir Kiriyenko, the son of the First Deputy Chief of Staff of the Presidential Administration, became head of the company. The Kremlin is placing all its bets on VKontakte, seeking to turn it into a multifunctional mechanism for controlling public sentiment and social loyalty. A number of recent investigations and studies have uncovered a pattern of manipulation of online social practices, showing how VKontakte is being used to distribute pro-war content to school and kindergarten groups and to publish pro-war posts on behalf of state employees, as well as how the Kremlin is making efforts to lure audiences away from YouTube, a platform with a more oppositional profile.

We examined more than 7 million comments left by bots on VKontakte in 2023, as identified by the Botnadzor project: according to its data, 120,000 bots wrote 7,312,000 comments on VKontakte that year. We classify as bots not only purely automated accounts that post comments without the participation of live people, but also the employees of ‘bot farms’, who are sometimes also called trolls. Botnadzor identifies bots/trolls by similar activity patterns, after which most of the bot accounts are validated manually.
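Botnadzor's exact detection criteria are not public, so the following is only a simplified sketch of one heuristic consistent with the description above: flagging accounts that repeatedly post comment texts also posted word-for-word by other accounts. The function name, input format, and threshold are illustrative assumptions, not part of the project's methodology.

```python
# Hypothetical sketch: flag accounts whose comments frequently coincide
# word-for-word with comments posted by other accounts. Botnadzor's real
# pipeline also relies on timing patterns and manual validation.
from collections import defaultdict

def flag_suspect_accounts(comments, min_shared_texts=5):
    """comments: iterable of (account_id, text) pairs."""
    accounts_by_text = defaultdict(set)
    for account_id, text in comments:
        accounts_by_text[text.strip().lower()].add(account_id)

    shared_counts = defaultdict(int)
    for accounts in accounts_by_text.values():
        if len(accounts) > 1:  # the same text was posted by several accounts
            for account_id in accounts:
                shared_counts[account_id] += 1

    return {a for a, n in shared_counts.items() if n >= min_shared_texts}
```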

Botnadzor regularly monitors VKontakte posts for bots. The team checks about 2,500 different public pages every day, collecting the comments left under every post published there. This list includes news and political pages (e.g. RIA Novosti, TASS, RBC, Lentach, PostNews), regional groups and media, as well as other groups popular with bots (e.g. the ‘timeline’ of the page ‘Podslushano’). In addition to this regular monitoring, Botnadzor searches for posts on news and political topics outside the fixed list of pages: topics related to Ukraine, Israel, military drones, as well as news about various political figures, and so on.
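To give a sense of what such collection involves, below is a minimal sketch of gathering comments from a fixed list of communities via the public VK API. It is our illustration, not Botnadzor's code: the access token and group ID are placeholders, and real monitoring would also need paging, rate limiting, and storage.

```python
# Minimal sketch: pull recent posts from monitored communities and collect
# their comments via the open VK API (token and group IDs are placeholders).
import requests

VK_API = "https://api.vk.com/method"
TOKEN = "..."            # placeholder: a valid VK access token is required
GROUP_IDS = [-123456]    # placeholder community IDs (negative for groups)

def vk(method, **params):
    params.update(access_token=TOKEN, v="5.131")
    return requests.get(f"{VK_API}/{method}", params=params).json().get("response", {})

for group_id in GROUP_IDS:
    posts = vk("wall.get", owner_id=group_id, count=20).get("items", [])
    for post in posts:
        comments = vk("wall.getComments", owner_id=group_id,
                      post_id=post["id"], count=100).get("items", [])
        # ...store (group_id, post id, commenter id, comment text) for later analysis
```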

All identified bots were classified by Botnadzor into the following categories:

- pro-government (Kremlebots);

- oppositional (‘elves’ - most of them seem to be associated with the Free Russia Foundation’s ‘Legion of Elves’);

- regional (creating a positive image of local authorities in various regions);

- advertising (posing as real people writing product reviews, mostly advertising medicines in groups for mums and pregnant women);

- other (smaller) categories.

A more detailed analysis allows us to distinguish two subgroups within the group of pro-government bots: ‘good’ and ‘evil’. ‘Evil’ Kremlebots are the most numerous category, accounting for the absolute majority (73%) of all bot comments. In terms of content, they are usually militaristic anti-Westernists and emotionally charged ‘patriots’, and their comments usually have a pronounced negative tone. In contrast, the ‘good’ Kremlebots are characterised by an almost complete absence of aggression and militarism. They write positive comments about the situation in Russia and also claim that Russia wants peace (which Zelensky is allegedly preventing). Most likely, most of the ‘good’ Kremlebots are automated (their comments are written by a neural network), while there are real people behind the ‘evil’ bots, whose task is to react to ‘wrong’ content and attack the messenger. Thus, the ‘evil’ Kremlebots are the main propaganda agents within the VKontakte network. Their function is not limited to broadcasting the propaganda message but consists of moderating the discussion, within which they pose as a group of fierce, voluntary supporters of the official line.

Types of bots on the Russian Internet, % of comments

Bot density

All categories of bots, except the ‘good’ (automated) Kremlebots, are more likely to write comments on weekdays than on weekends. This again indicates that their activity differs from the natural behaviour of social media users.

Bot activity, 2023-2024, number of comments

If we consider all the posts (including those in small and/or non-political communities) under which bots wrote comments, then on a randomly selected day in February 2023, 9% of comments were written by bots, while bot accounts made up 2.3% of the accounts appearing in the comment sections. Thus, the average bot account produces several times more comments per day than the average real user; Kremlebots and ‘elves’ are the most active, producing four to five times more comments than real people. The share of bots in the total volume of comments within a discussion is therefore about 10%, but due to the ideological cohesion of their comments and, often, their aggressive tone, they significantly influence the course of the discussion, demotivating real users from continuing the conversation and thereby limiting their presence (this is partly reminiscent of the ‘spiral of silence’ effect identified by sociologists).
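The ‘several times more’ figure follows from simple arithmetic on the two shares above; the short calculation below is only an illustration of that back-of-the-envelope step.

```python
# Back-of-the-envelope check: 9% of comments come from 2.3% of accounts.
bot_comment_share = 0.09
bot_account_share = 0.023

comments_per_bot_account = bot_comment_share / bot_account_share               # ~3.9
comments_per_real_account = (1 - bot_comment_share) / (1 - bot_account_share)  # ~0.93

# The average bot account leaves roughly 4x more comments than the average real user.
print(round(comments_per_bot_account / comments_per_real_account, 1))
```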

Bot activity, 2023, average number of comments per day

Bots predominantly comment in news communities, often affiliated with state or loyalist media outlets such as RIA Novosti, TASS, and RBC. The 45 news communities monitored regularly by Botnadzor in 2023 accounted for 72% of all bot comments. Within these communities, bots contributed 22% of all comments while representing 6.3% of the accounts that had ever posted comments there. Notably, the proportion of bot-generated comments in these communities grew over 2023, which can be attributed to a decline in real user activity rather than an increase in bot engagement: while bots accounted for one-fifth of all comments at the start of the year, by its end they made up a quarter. This likely signals a waning interest or willingness among genuine users to take part in such discussions over the course of the year.

User and bot activity within news communities, 2023, weekly message count

'Evil' Kremlin trolls: message themes and communication strategies

The largest category of bots, responsible for 74% of all bot-generated comments in 2023, is the so-called 'evil' Kremlin trolls. The volume of their messages peaked early in the year and then declined. The reasons for these dynamics remain unclear, as other types of bots and real users do not exhibit such peaks and troughs; perhaps there was some administrative restructuring or a break in funding. From mid-March onwards, the number of messages they posted began to rise again, reaching a new peak in September, only to decline again by the end of the year. A similar peak in mid-September can be observed among other pro-government bots, suggesting a possible link to the regional elections held on September 10th.

What are the 'evil' Kremlin trolls writing about? Their primary theme, undoubtedly, is the war in Ukraine and the 'geopolitical confrontation', i.e., the West's role in the war, its influence on Ukraine, and the course of the conflict. These two topics accounted for about 50% of all statements made by 'evil' Kremlin trolls. Data on frequently used words further corroborate the findings of thematic modelling: compared to other categories of bots that address political topics, 'evil' Kremlin trolls much more frequently use words such as 'Ukraine', 'Russia', 'AFU' (Armed Forces of Ukraine), 'USA', 'West', 'military', 'Zelensky', and much less frequently use words such as 'war' (a term still not used by Russian official sources to describe the events in Ukraine), 'power', 'Putin', 'child', 'price', 'official'.

Analysis of the number of messages mentioning words such as 'Ukraine', 'Zelensky', 'West', 'AFU', and war-related terms shows that military and anti-Western themes account for an average of 36% of messages from 'evil' Kremlin trolls. Mentions of Putin, by contrast, were rare throughout most of the year, occurring in only 1-3% of cases, although in December the share of such mentions increased to 6%, possibly in connection with the start of the election campaign. The near-absence of Putin's image in bot discourse is nonetheless a striking feature.

Themes of 'evil' Kremlin trolls, 2023, Message count

To determine the main themes within their messages, we used the BERTopic model. Because the full set of bot comments was too large, and because 'evil' Kremlin trolls write remarkably similar comments in disproportionately large quantities, we first compiled a training dataset from the raw data: 20,000 bot comments from each category and 50,000 comments from 'evil' Kremlin trolls. After training the model on this dataset, we applied it to all the data to predict the theme of each comment. To obtain a more complete picture, including comments that could not be automatically classified, we manually classified a random sample of 400 messages from the 'other' category. In most cases, these messages fell into one of the automatically identified themes but had not been assigned to them because, in live dialogue, the model lacked the conversational context. Thus, the portion of messages not automatically identified does not represent a separate thematic class but is primarily distributed among the identified thematic groups. In the graph below, the proportions of the themes are adjusted to account for the 'other' category, based on the manual classification of those 400 messages.
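As an illustration of this two-stage setup, here is a minimal sketch using the BERTopic library: fit on a smaller training sample, then assign topics to the full corpus. The variable names and model settings are our assumptions and do not reproduce the study's exact configuration.

```python
# Minimal BERTopic sketch: fit on a balanced training sample, then label the
# full corpus. `train_sample` and `all_comments` are assumed to be lists of
# comment strings prepared beforehand (as described in the text above).
from bertopic import BERTopic

topic_model = BERTopic(language="multilingual", min_topic_size=100)
train_topics, _ = topic_model.fit_transform(train_sample)  # ~130,000 sampled comments
all_topics, _ = topic_model.transform(all_comments)        # assign themes to everything

print(topic_model.get_topic_info().head(10))  # discovered themes and their keywords
```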

Themes of 'evil' Kremlin Trolls, 2023, % of message count

When discussing the war, bots strive to create an impression that the situation on the front lines is favourable for the Russian side, conveying a sense of emotional intensity and conviction among their authors. As evident from the examples below, one of the central narratives revolves around combating Nazism. However, by the end of the year, the proportion of messages which mentioned 'Nazism' and 'Nazis' decreased threefold, falling from 6% to 2% of the total number of Kremlin troll comments.

'Excellent! This means they have even fewer supplies there, soon they'll run westward begging for alms again'. 'The unfinished Ukrainian Nazis must be sent to Valhalla as soon as possible for their war crimes!' 'Lately, Russian UAVs and missiles have been notably active. Those Ukrainian Armed Forces are really taking a beating'. 'Russia will never forget the atrocities committed by the Nazis! And the Nazis in Ukraine today are no different and will be defeated by our troops!' 'Our tank brigades have shown the enemy the might of the Russian Federation. They were hitting the enemy fortifications hard!'

The second most popular theme in the comments is Western politics and the West's instrumental use of Ukraine. The proportion of messages mentioning the West remained almost unchanged throughout the year, hovering around 7%; the proportion of messages mentioning Ukraine and Zelensky was similarly stable, at around 17%.

'If the USA hadn't stuck their rotten hand into this conflict, nothing like this would have happened.' 'Ukrainians are fleeing from Nezalezhna [a derogatory Russian term for Ukraine] in such numbers that Zelensky even had to close the borders for women.' 'The West itself is to blame for starting the aggression against Russia first and now they will suffer the consequences.' 'It seems true that the West is ready to fight to the last Ukrainian.' 'By dragging out the special military operation, Zelensky is plunging the country into a huge economic crisis. After the special operation, Ukrainians will be working for a hundred years just to settle their country's debts.' 'No matter what the Kyiv regime does, it won't help them. Whatever they do, whatever they receive, and whatever they undertake, their failures will continue in their plans.'

Yet another notable theme among bots, albeit significantly less popular than the previous two, is various comments on values, ideology, and Russia as a whole.

'Well, everything's right, Russia will continue to expand cooperation with all friendly countries'. 'In Russia, every nationality and tradition is respected, it's our united Russian people'. 'There's absolutely no doubt about it, Russia is the best country in the world'. 'What can you even talk about if Borrell claims that by banning Russian media they are preserving freedom of speech?’

Moderating discussions, i.e. engaging in polemics with those who post ideologically incorrect content and attacking its authors, is essentially the primary function of bots. They are meant to create a sense of discomfort among ideological opponents and, conversely, to provide support to those expressing the 'correct' position. Responses by bots to other commentators, whether insults or support for fellow polemicists, were therefore placed in a separate category. Overall, such comments account for over 20% of bot statements (the categories 'expression of negative emotions' and 'neutral friendly communication'). Moreover, the tone of 'evil' Kremlin troll comments is more than three times as likely to be negative as positive (to determine the 'mood' of bots, we used a sentiment analysis model based on RuBERT), hence the 'angry/evil' label.
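As an illustration of this sentiment step, a minimal sketch with the Hugging Face transformers pipeline is shown below. The specific RuBERT-based checkpoint named here is our assumption for demonstration purposes; the study's exact model and labelling scheme may differ.

```python
# Minimal sketch: score the tone of comments with a RuBERT-based sentiment model.
# The checkpoint name is an assumed publicly available example, not the study's model.
from transformers import pipeline

sentiment = pipeline(
    "sentiment-analysis",
    model="blanchefort/rubert-base-cased-sentiment",  # assumed checkpoint
)

comments = [
    "Наша армия показала врагу мощь Российской Федерации!",
    "Свинья, хватит визжать без причины.",
]
for comment, result in zip(comments, sentiment(comments)):
    print(result["label"], round(result["score"], 2), comment)
```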

'Pig, stop squealing for no reason. Air defence systems shoot down the vast majority of UAVs'. 'I'm laughing out loud, the Ukrainian Armed Forces are cruel killers, they go against the peaceful people, it's impossible to list all their war crimes. Maybe you're confused, but our army is greeted with flags, cheers, and flowers! And these creatures should be driven away!' 'Well yeah, what did I expect from a Ukrainian? You're constantly resorting to personal attacks because you're incapable of anything more. Loser'. 'Are you sure you're not an idiot? Who are you to divide people by nationality? Simply outrageous!' 'That's your subjective opinion, you should understand that'. 'It's really cool that you can observe the implementation of a large-scale air operation, thanks for such an opportunity'. 'You're right, that's why society and our people should unite and work for the good of our country, especially when the authorities provide all the opportunities for it!'

The tone of 'evil' Kremlin trolls, 2023, % of message count

Data from sociological surveys indicate that in the second half of 2023, and especially towards the end of the year, the number of genuine war supporters in Russian public opinion was diminishing, while the 'demand for peace' was increasing (→ Re:Russia: War Deadlock). The behaviour of bots to some extent reflected this trend. In the autumn of 2023, the share of bot messages on war-related topics on VKontakte began to decline, reaching 22% in November-December, compared to around 32% in the summer months. Conversely, bots stepped up geopolitical discussion in the autumn ('Ukraine and the West'), and the proportion of their statements on social topics increased from 15% to 20%. Thus, the behaviour of bots reflected a certain de-emphasis of the war theme in the period immediately after the Prigozhin rebellion; however, their overall activity also decreased.

Dynamics of the main thematic groups of bot statements, 2023, % of automatically classified messages

Habitat: Kremlin trolls engage with the public

As we have already mentioned, Kremlin trolls leave the most comments on the VKontakte pages of state news media. The top locations by number of comments include RIA Novosti, RT News in Russian, RBC, Gazeta.Ru, Vesti, IZ.RU, and Readovka. Accordingly, the proportion of bots in the overall comment flow is higher in these spaces. For instance, in the more neutral anonymous news aggregator 'No. — News of the Day' and in the oppositional public 'Lentach', the share of comments from pro-government bots is significantly lower than on the pages of TASS and Vedomosti. This confirms researchers' earlier observation that bots prefer to operate in a comfortable environment, where the proportion of those loyal to the official viewpoint is inherently higher (→ Re:Russia: Putin Fans or Kremlin Bots). Their task is not to persuade convinced dissenters but to mobilise the 'swamp', reinforcing the sense of an 'imaginary majority' with which the 'swamp', as a result, tends to align itself.

Share of pro-government bots in the total number of comments, 2023

The majority of comments from Kremlin trolls are left in around 45 popular news communities. Over time, however, the weight of these communities has diminished as Kremlin trolls expand their influence beyond their traditional habitats: at the beginning of the year, these communities accounted for around 95% of bot comments, but over the course of the year this share gradually decreased to around 73%.

The expansion of the Kremlin trolls' 'zone of influence' is evidenced by the growing number of groups where they post intensively: over the year, the number of groups where bots wrote more than 100 comments per week increased from 40 to 95. Given that the total number of bot comments did not increase, this should be interpreted as an expansion of activity rather than an increase in its intensity. The expansion primarily occurs through bots infiltrating regional communities such as 'Overheard in Barnaul', 'Region-75 | Chita', and 'Emergency Situations in Novosibirsk', which aligns with the strategy of the main Kremlin contractor in the field of network propaganda, the Autonomous Non-profit Organisation 'Dialogue', which is oriented towards regional agendas and communities (→ Grigory Asmolov: Propaganda in the Network Environment).
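For clarity, the 'more than 100 comments per week' metric can be computed roughly as sketched below; the file name and column names are assumptions about how such comment data might be stored, not the study's actual code.

```python
# Sketch: count, for each week, how many groups received >100 bot comments.
# Assumes a table with one row per bot comment and columns: group_id, date.
import pandas as pd

df = pd.read_csv("bot_comments_2023.csv", parse_dates=["date"])  # assumed file/columns

weekly = (
    df.groupby([pd.Grouper(key="date", freq="W"), "group_id"])
      .size()
      .rename("comments")
      .reset_index()
)

active_groups_per_week = (
    weekly[weekly["comments"] > 100]
    .groupby("date")["group_id"]
    .nunique()
)
print(active_groups_per_week)
```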

By the end of 2023, the total number of subscribers in groups where bots wrote more than 100 comments per week was 62 million, almost double the potential audience at the beginning of January 2023, when this number was 33 million. However, it is important to note that not all users read comments thoroughly, the audiences of different communities may heavily overlap, and subscriber counts may be significantly inflated.

Number of groups where Kremlin trolls leave more than 100 comments per week, 2023

Bot demographics: Young women as militarists

As our statistical observations show, bot farms deliberately choose the age and gender of bot accounts. The median age of 'evil' Kremlin trolls is 28 and the mean age is 30, noticeably lower than the average age of VKontakte users: for real users, these figures stand at 39 and 42, respectively. There are almost no accounts among the Kremlin trolls with a specified age under 18. The most commonly specified age range is 24 to 30, with a significant portion of accounts claiming to be between 31 and 38; accounts listing ages over 38 are much rarer.

Age distribution of regular users and 'evil' Kremlin trolls, %

In terms of gender, the majority of 'evil' Kremlin trolls identify themselves as female, constituting 66% of such accounts. Consequently, a quarter of all bot accounts present themselves as women aged 24 to 30.

The paradox here is that younger demographics (those under 30) are much less supportive of the war than those over 40, especially those over 45, and women are much less likely to support the war than men. In the offline world, therefore, a real young Russian woman with militaristic views is a comparatively rare encounter: this demographic is the most anti-war. Online, however, bots strive to 'correct' this gap, thus obscuring real user preferences.

As this overview has shown, bots are deeply embedded in the fabric of social networks and constitute a distinct and noticeable part of their population, indistinguishable from real users to the ordinary observer but significantly altering the network's 'climate of opinions' (another term coined by Elisabeth Noelle-Neumann, the author of the 'spiral of silence' concept). The effectiveness of bots is enhanced by their cohesion and aggression, which force real users to retreat, yielding even more of the online space to bots. According to observations by Grigory Asmolov, sharply increased bot activity may have played an important role in moments of 'crisis propaganda', when unforeseen dramatic events (such as the Prigozhin revolt or the terrorist attack at Crocus City Hall) created risks of 'rupturing' the propaganda narrative. Bots also work to prevent the consolidation of discontent in regional communities in response to various negative incidents. The strategies they employ during such events warrant a separate, detailed analysis.