In Russia, opinion polls are a political weapon
Russia’s state-controlled pollsters manipulate surveys to provide the results the regime wants
Does the Russian public support Vladimir Putin’s decision to start a full-scale military invasion of Ukraine?
The situation has been unfolding fast, and so far only a few pollsters and agencies have been able to collect and present some fragmentary data. The results reflect how problematic, both methodologically and politically, the use of surveys in authoritarian states – let alone countries at war – can be.
The numbers thus vary significantly. According to recent results from the state-controlled pollster WCIOM, 71% of respondents supported Russia’s “special military operation” in a 3 March poll. Results from another state-controlled pollster, FOM, showed that 65% of respondents supported the “launch of Russia’s special military operation” in a 25-27 February survey. A private survey agency, Russian Field, reported that 58.8% of respondents supported “Russian military action in Ukraine” in polls conducted from 26 to 28 February. A poll from mid-February, before the invasion, commissioned by CNN and conducted by a British agency, Savanta ComRes, reported that 50% of respondents would support Russia’s use of force to prevent Ukraine from joining NATO, and 36% would support Russia’s use of force to “reunite” Russia and Ukraine.
While these results may seem contradictory, it is important to note that Russian opinion polls are immediately instrumentalised by the Kremlin, repeated by the Russian media, and used to claim that the invasion is supported by the Russian public and conducted in its name.
Public opinion in Russia
There are a few important points that need to be taken into account when discussing public opinion in an authoritarian country that is at war with another state.
First, public opinion itself, unlike how the media presents it, is never a monolithic and solid entity. In public opinion surveys, people are contacted randomly to ensure that a small sample represents a nation’s opinion on an issue.
As a result, those who respond to survey questions are very different people. While a small minority of respondents have well-thought-out opinions, there are also many others: those who have an opinion that does not fit within a rigid survey structure; those who have a vague feeling, but would not be able to articulate it without being asked; those who do not know what is going on at all, but feel the need to come up with some response just because they are being asked. All these responses are then translated into numbers which are claimed to represent the opinion of the nation.
Second, public opinion in wartime is problematic. People and public discourse are shaped by strong emotions, and are polarised and divided. Many people respond to questions in ways they would not under normal circumstances. Those who have no firm opinions are forced to take positions and categorically approve or disapprove of very drastic measures.
Finally, public opinion surveys in authoritarian countries are even more problematic. In autocracies, people might want to hide their opinions and give socially desirable answers that conform to the official government position for fear of facing repression or deviating from the consensus view. They can also refuse to answer pollsters’ questions because they are afraid or perceive surveys as a tool of the government. As a result, the number of those who approve government policies in pollsters’ samples can be higher than in reality. At the same time, autocrats like inflated results. They will gladly distribute results that show broad support of the regime’s actions in order to discipline elites, demoralise opponents and further sway opinions in their favour.
Poll results should thus not be treated as precise absolute numbers. Opinions are not physical entities that can be measured exactly. At best, the figures serve as approximate reference points suggesting that significant groups of people hold certain positions. They should be read as relative indicators, rather than as proof that exactly X% of the population approves of the Russian government’s actions.
Propagandistic cliches and an inattentive public
The way that WCIOM, FOM and Russian Field conduct polls is similar – they all use random nationwide stratified samples. The difference between the results, somewhere around 7-13 percentage points (58% vs 65% vs 71%), can therefore be attributed to how the pollsters formulated their questions.
When asking respondents what they thought of the war, WCIOM and FOM (state-controlled polling agencies) referred to the war as a “special military operation” or “military operation”, a propagandistic cliché and euphemism widely repeated across the regime-controlled media. This term is used to downplay the drastic nature of the invasion. Russian Field used the phrase “military action”.
There is solid evidence in public opinion research that suggests that people who are the least politically engaged and have no crystallised opinions are the most susceptible to changes in wording of survey questions. For these people, the phrase “special military operation” harks back to propagandistic clichés used on regime-controlled media and therefore guides their answers, while in fact they might have no strong opinion or no opinion at all on the issue.
My own research demonstrates the mechanism behind this process. When Russian television viewers rely on propagandistic clichés to make sense of the Russia-Ukraine conflict, they tend to support the government’s interpretation. This reliance on propagandistic clichés is reinforced when the same clichés are repeated across media outlets. When TV viewers rely on their personal experience, they express much more critical reactions towards the Russian government’s actions, including its interference in Ukrainian politics.
This difference is a result of propaganda. But it also shows that a significant part of the Russian public neither approves, nor disapproves of the war: these people simply have no articulated opinion on this issue. While surveys register this constituency as holding a position, that position is unlikely to hold up in other contexts. For instance, it is likely that people with no articulated opinions will adopt the positions of others – whether regime critics or regime supporters – in conversations or when making political decisions. Crucially, Russian state-controlled pollsters often offer only categorical responses to polling questions, such as “yes”, “no” or “I don’t know”. These options hide a significant portion of Russian citizens who are hesitant over the invasion of Ukraine and do not have strong opinions on the issue.
In contrast to WCIOM and FOM, Russian Field used five survey options. This survey reported core supporters and critics who “definitely approve” or “definitely disapprove” of the Russian invasion (37.6% and 23%). But it also reported a big group of less-certain respondents who “rather approve” or “rather disapprove” of the invasion (21.2% and 11%). These people are the most likely to be swayed back and forth by Russian television news and the choice of wording in surveys.
In autocracies, citizens are often afraid of answering pollsters’ questions in general, let alone questions about politics. This generates a distortion known as social desirability bias – citizens lie about their real preferences, which inflates survey results.
This inflation is well demonstrated by the WCIOM poll about Putin’s decision to recognise the independence of the so-called ‘People’s Republics’ in eastern Ukraine two days before the invasion. WCIOM reported that on 22 February, 73% of respondents supported recognition. Here, what is important is how WCIOM pollsters asked the question.
WCIOM’s question was formulated in the following way (original Russian grammar): “Tell us, please, the decision of the president to recognise independence of Donetsk and Luhansk People’s Republics do you support or do you not support?”
The most important part of the question, whether the respondent supports or not, is hidden behind a long preamble about Putin’s decision. The question is formulated in order to remind those people who disagree with this decision that they are against the Russian president, a risky position in an authoritarian country.
Russia is currently experiencing a much higher and more visible rate of repression, and this makes social desirability bias much more likely. The preamble in the WCIOM and FOM polling questions is likely to bias the results in favour of the regime.
‘Yes’ and ‘No’
In addition to social desirability bias, state-controlled pollsters manipulate question wording so that support for one thing is read as support for another.
Thus, support for ‘war’ is not the same as support for a “military operation”, and support for the “military operation” is not the same as support for Putin. By lumping together Putin and “military operation” into a single question, FOM and WCIOM skilfully replace the topic of war with the topic of Putin.
While support for the war and support for Putin are likely to be closely connected, it is difficult to imagine that everyone who supports Putin also supports the war. Such a question lumps these categories of people together, thus inflating the result in the government’s favour.
One of the key issues that undermines the validity of opinion surveys is which people decide to participate in them. Research shows that politically active, informed and opinionated citizens are more likely to participate in surveys. In democratic contexts, this self-selection bias over-represents politically active and polarised people, but not supporters of a specific party.
This problem is much more acute in authoritarian states. In Russia, the government gives preferential treatment to its supporters, while its critics have reasons to be afraid to express their views. When citizens are afraid to express their political views or perceive interviewers as agents of the authorities, they may opt to refuse to participate in a survey at all.
Thus, it is a minority of citizens who are ready to participate in opinion surveys. Common sense and the public opinion research literature suggest that this minority is likely to be more informed and opinionated, but in authoritarian contexts it is also likely to have stronger pro-regime attitudes. According to a 2020 Levada Center survey, regime supporters in Russia are almost twice as likely (58%) to trust opinion surveys as regime critics (32%), which also suggests that regime critics are less likely to participate in surveys.
This self-selection bias can strongly affect a survey’s final results. If regime supporters are over-represented in a sample, their opinions make it look as if more people support the government’s actions than actually do.
These four problems are likely to have a compound effect. Imagine the following scenario: because people critical of the Russian regime are less likely to participate in an opinion survey, there are fewer regime critics in the polling sample. Among regime critics who choose to participate in a survey, there are some who respond that they do, in fact, support the government’s actions, and hide their real preferences. Finally, some of those respondents without a clear opinion on the war respond that they support the government’s actions because either the wording of questions or the overall topic of a survey replicates propaganda clichés on Russian state media.
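The compound effect of this scenario can be sketched with a toy simulation. All the parameters below – the true split of opinion, the rates of non-participation, preference falsification and wording effects – are illustrative assumptions invented for this sketch, not measured values; the point is only to show how distortions that are individually modest can together turn a divided population into apparent overwhelming support.

```python
import random

random.seed(0)

# Hypothetical population: 50% approve, 30% disapprove, 20% have no
# firm opinion. These shares are assumptions for illustration only.
population = (["approve"] * 50 + ["disapprove"] * 30 + ["none"] * 20) * 1000

def observed_approval(people):
    """Simulate the three distortions described in the scenario above."""
    answers = []
    for view in people:
        # 1. Self-selection: critics are only half as likely to take part.
        if view == "disapprove" and random.random() < 0.5:
            continue
        # 2. Preference falsification: some participating critics
        #    report approval out of caution.
        if view == "disapprove" and random.random() < 0.3:
            answers.append("approve")
            continue
        # 3. Wording effects: respondents with no firm opinion lean
        #    towards the propaganda framing of the question.
        if view == "none":
            answers.append("approve" if random.random() < 0.7 else "disapprove")
            continue
        answers.append(view)
    return answers.count("approve") / len(answers)

print(f"true approval:     50%")
print(f"measured approval: {observed_approval(population):.0%}")
```

Under these assumed parameters, a population in which only half approve yields a measured approval of roughly 80% – comparable to the gap between plausible underlying opinion and the headline figures published by state-controlled pollsters.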
The party of war
These four problems inflate survey results in the Russian regime’s favour, and produce the dramatic numbers we have seen above. Yet the fact that they are inflated does not mean that no one supports the war.
According to the Russian Field survey, 37.6% of respondents definitely approve of Russia’s invasion. Who are these people in the ‘party of war’? Researcher Mikhail Sokolov provides a useful analysis of WCIOM data using odds ratios. Unsurprisingly, age and consumption of television news are the two main factors that separate people who support the war and those who do not.
Among people older than 60, only 10.6% of respondents in the Russian Field survey disapprove of the war, compared to 83.3% who approve of it. Among those who are younger than 30, 50.7% of respondents disapprove of the war, compared to 37.5% who approve of it. Similarly, among television viewers overall, only 13.4% disapprove of the war, compared to 79.5% who approve. Among those who do not watch television, 52.3% do not approve of the war, while 36.4% approve.
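An odds ratio of the kind Sokolov uses can be computed directly from the age figures quoted above. This is a rough sketch: it treats the published approve/disapprove percentages as the full breakdown and ignores undecided respondents.

```python
def odds_ratio(approve_a, disapprove_a, approve_b, disapprove_b):
    """Ratio of the odds of approval in group A to the odds in group B."""
    return (approve_a / disapprove_a) / (approve_b / disapprove_b)

# Russian Field figures quoted above (percentages within each age group).
over_60 = (83.3, 10.6)   # approve, disapprove
under_30 = (37.5, 50.7)

# The odds of approving are roughly ten times higher among the over-60s
# than among the under-30s.
print(round(odds_ratio(*over_60, *under_30), 1))
```

The size of this ratio is what makes age (alongside television consumption) the clearest dividing line between supporters and opponents of the war.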
As Sokolov puts it: “If you are younger than 30, live in a big city, have a higher education and do not watch television, the probability that you will not support the actions of the Russian army exceeds 80%.” We also know a lot about the political preferences of regime supporters from prior research. They tend to share Soviet nostalgia and see Putin as someone who successfully led the country out of the political and economic chaos of the 1990s, and have an emotional attachment to the regime.
Why do autocrats like surveys so much?
This spiral of under-representation and silence, which is strengthened by state propaganda and manipulative opinion surveys, is a blessing for autocrats. Inflated support figures can be used to signal the regime’s broad popularity to the public and to warn elites against defection.
More importantly, surveys that inflate government support can be used to discipline citizens. Social psychologists demonstrate that people often use the responses of others to an issue as cues to form their own opinions. Previous research has demonstrated that Russian citizens’ attitudes towards the regime and its actions are, to a significant degree, driven by what they see as the prevailing consensus in society. Publicising the results of the surveys with inflated support for the government’s actions can further deepen the effect of social desirability bias by making regime critics feel that they are in the minority and by giving a pro-regime cue to those who are hesitant.
Autocrats understand this, and this is why Russian state media have broadcast the results of the recent surveys by the state-controlled WCIOM and FOM so widely. These results give an incorrect picture of the real distribution of political preferences among Russian citizens, encourage people who have no articulated opinions to take the government’s position, and encourage regime critics to hide their preferences.
As sociologist Alexei Titkov argues: “Alexander III famously said that Russia had only two allies: the army and the navy. [Today] Vladimir Putin has only two allies: Russia’s missile forces on the one hand, and WCIOM and FOM on the other.”
Readers would be wise to bear in mind the following: in autocracies, opinion polls are a political weapon – and their results are far from representative.