In November 2011, the Illinois Statewide Terrorism and Intelligence Center investigated the failure of a water pump in a local water supply system. Apparently someone had entered and manipulated the control system. After further investigation, the Center’s analysts blamed an unknown group of Russian hackers for sabotaging the system. After the Center’s report appeared in the media, a fierce debate erupted over the implications of the supposedly first successful cyber sabotage of US infrastructure. However, a week after this debate took off, the Department of Homeland Security (DHS) declared the analysis fundamentally wrong and highlighted the lack of any supporting evidence.
How did the Center arrive at such erroneous conclusions? Apparently it was all based on a single piece of dubious evidence. Five months before the ‘attack’, someone with a Russian IP address had indeed accessed the facility’s SCADA system. This someone was Jim Mimlitz, whose company had installed and serviced the SCADA system, and who had been on vacation in Russia in June 2011. From there he logged into the system to check some data. In fact, Mimlitz’s name could have been found in the log data right next to the Russian IP address he was using.
This, however, went unnoticed by the analysts in Illinois, who simply did not consider the possibility that the account owner had actually travelled to Russia. As a consequence, nobody asked Mimlitz about the events. “I could have straightened it up with just one phone call, and this would all have been defused,” he said. Such a phone call never happened. The analysts, one is led to conclude, were already in a state of premature cognitive closure.
Social cognition and international relations
Cognitive theories of social behavior build on our knowledge of the strict limits of human information-processing capacity. To cope with this scarcity, human cognition substitutes much simpler heuristics for the complex and comprehensive problem-solving strategies assumed by rational-choice theories. It also relies on mental structures that are based on past experiences and reside in the long-term memory of the human brain. Human cognition is, therefore, much more concept-driven than data-driven. Such processes are very efficient inasmuch as they minimize the use of cognitive capacity and the time it takes to comprehend social situations.
Preexisting cognitive schemes may, however, fundamentally fail to capture essential aspects of external reality. Due to the concept-dependency of human cognition, individuals tend to see what they expect to see. Information that does not fit preexisting schemes is ignored, devalued or misinterpreted. It is for these reasons that a number of different types of misperception are said to happen on a regular basis. As a result, unwanted conflict escalation is a real possibility, particularly when there are already systematic incentives for mistrust and worst-case thinking. Such circumstances can often be found in international politics, an argument that has been made by Robert Jervis and others.
Is cyberspace prone to misperceptions?
This does not, of course, prove that interactions in cyberspace suffer from the same tendency. Some might in fact assume the opposite: because of the technical nature of the issue area, decision-makers in many cases rely on experts to define problems, find solutions and implement decisions. The consultation of experts arguably reduces the probability of misperceptions. Experts may also take part in the activities of transnational networks, and they may build epistemic communities that help diffuse knowledge to decision-makers in different countries. Doing so fosters the creation of shared knowledge and decreases the probability of misperceptions, which, in turn, diminishes the explanatory potential of cognitive approaches.
That being said, experts provide no guarantee against misperceptions, for several reasons. For one, the assumption that expert judgment is less prone to cognitive biases is challenged by some cognitive scientists. Moreover, the literature on epistemic communities identifies a range of boundary conditions beyond which experts, whether organized at the national or the transnational level, are unlikely to have much influence on political decision-making. For example, epistemic communities will not wield any lasting influence if they fail to gain institutionalized access to decision-making bodies and bureaucracies. A further challenge concerns the coherence of individual epistemic communities and the relationships between different epistemic communities in the same policy field. A decision-maker who is confronted with many contradictory expert judgments ultimately has to rely on his or her own knowledge to choose which path to go down.
Expert knowledge is therefore no guarantee against misperceptions. What is more, a number of factors seem to increase the risk of misperceptions and inadvertent escalation in cyber conflicts. First, actions in cyberspace leave ample room for speculation about underlying intentions and purposes. Social activities such as war, espionage and crime, which used to have distinct observable implications in the physical realm, are almost impossible to disentangle analytically in cyberspace. Computers serve very different purposes, but in a very similar manner. As a consequence, hacking into a computer may be part of an intelligence operation, but it could just as well be the work of organized crime or a foreign military service. Thus intelligence operations, if discovered, can be misunderstood as industrial sabotage or even war preparations.
Second, there is the problem of attribution and the leveling of power asymmetries. A wide range of actors might be responsible for most types of hacking attack, and with this proliferation of possible perpetrators the risk of misattribution increases accordingly. The many technical errors and malfunctions in complex IT systems further contribute to this tendency. A case in point is a series of power outages in several Brazilian cities in 2005 and again in 2007, which media reports erroneously explained as the consequence of successful computer network attacks. Third parties may hope to exploit such tendencies by mounting ‘false flag operations’ in cyberspace with the intention of provoking a cyber conflict that suits their interests.
A third reason relates simply to the ‘newness’ of the issue area. Our knowledge about cause-effect relationships and the dynamics of cyber conflict is still very limited. This is not to say that uncertainty always increases the risk of misperception. It may even work in the opposite direction, by encouraging people to calculate the pros and cons of diverse options very cautiously, or to postpone a decision. But when the stakes are particularly high, such as in a severe diplomatic crisis, inaction might result in as much or even more damage than any unfounded decision. Under such circumstances cognitive factors may have a decisive influence on the judgments of decision-makers. If nothing else, thinking along the lines of preexisting cognitive schemes provides emotional comfort by offering a way out of the anxiety and confusion that define a crisis atmosphere. In extreme cases this leads to “premature cognitive closure”, a mental mode in which it becomes impossible to absorb new insights and to reevaluate one’s perspective in light of disconfirming evidence.
What can be done about it?
Unfortunately, classical arms control cannot efficiently cope with the escalatory dynamics of cyber conflict. For one, every attempt to categorize technical capacities as cyber weapons is fruitless: cyber attacks rely not on firepower but on knowledge about computer code and organizational routines, and such knowledge can be neither prohibited nor controlled. Additionally, the attribution problem in cyberspace works against verification requirements. Against this backdrop, arms control initiatives should primarily aim at international norm-building as well as confidence-building measures (CBMs).
In fact there has recently been progress, particularly in the latter area. State representatives both at the UN and the OSCE agreed last year that miscalculation and inadvertent escalation need to be avoided through greater transparency about policies, organizations and doctrines, and through improved crisis communication. On a bilateral level, Russia and the United States established contact points between their respective Computer Emergency Response Teams (CERTs). They also exchanged White Papers and created a secure line between the White House and the Kremlin for use in case of major cyber attacks. With China, a bilateral working group has been established. There is also ongoing track-II diplomacy (the “Sino-U.S. Cybersecurity Dialogue”) organized by the Center for Strategic and International Studies (CSIS) and the China Institutes of Contemporary International Relations (CICIR). To date, eight formal and several informal meetings have taken place, including three simulations of major cyber crises.
These and similar efforts need to be intensified. Besides transparency measures and communication channels, one pertinent task will be to foster a common understanding of basic terms such as cyber security, cyber attack and critical infrastructure. States should also try to reach agreement on what constitutes an armed attack as defined by international law. Such an agreement on ‘red lines’ would help to reduce the risk of miscalculation and inadvertent war.
 Robert Jervis (1976), Perception and Misperception in International Politics, Princeton, NJ: Princeton University Press, p. 187.
 Susan T. Fiske and Shelley E. Taylor (2010), Social Cognition: From Brains to Culture, Boston et al.: McGraw Hill, pp. 164-195.
 Philip E. Tetlock and Charles McGuire (1986), Cognitive Perspectives on Foreign Policy, in Samuel Long, ed., Political Behavior Annual, Volume I, London: Westview Press, p. 150.
 ibid., p. 159.
 Fiske and Taylor, p. 178.
 ibid., pp. 216-219; Janice Gross Stein (2002), Psychological Explanations of International Conflict, in: Walter Carlsnaes, Thomas Risse and Beth A. Simmons, eds., Handbook of International Relations, London et al.: Sage Publications, p. 193.
 Fiske and Taylor, pp. 154-163.
 Jervis, Perception and Misperception; Gross Stein, Psychological Explanations of International Conflict.
 Peter M. Haas (1992), Introduction: Epistemic Communities and International Policy Coordination, in: International Organization 46 (1), pp. 1-35.
 Janice Gross Stein and David A. Welch (1997), Rational and Psychological Approaches to the Study of International Conflict, in: Nehemia Geva and Alex Mintz, eds., Decision-making on War and Peace: The Cognitive-Rational Debate, Boulder: Lynne Rienner, p. 56.
 Haas, p. 11.
 William A. Owens, Kenneth W. Dam and Herbert S. Lin, eds. (2009), Technology, Policy, Law, and Ethics Regarding U.S. Acquisition and Use of Cyberattack Capabilities, Washington, DC: The National Academies Press, pp. 315-317.
 Jervis, p. 187.
 James Andrew Lewis (2011), Confidence-Building and International Agreement in Cybersecurity, in: Disarmament Forum, 4/2011, p. 58.
 United Nations General Assembly (2013), Report of the Group of Governmental Experts on Developments in the Field of Information and Telecommunications in the Context of International Security, A/68/98, 24 June 2013; OSCE (2013), Initial Set of OSCE Confidence-Building Measures, PC.DEC/1106, 3 December 2013.