Heavy police presence at Place de la Republique, November 17, 2015. Demotix/Michael Debets. All rights reserved.

The Friday 13 November events in Paris were horrendous – bloody and heartless attacks, without warning, on innocent civilians enjoying a warm November evening in restaurants and bars, at a concert and a soccer game. A few days before, I had been walking through neighbouring districts, taking in the sights and sounds of a great city.
Anyone who knows about security and surveillance could guess what would happen. Security authorities would want more powers, and governments would gauge how far they could go. Of course, it is understandable that any government in that position will wish to reassure the population with visible signs of security. But governments do tend to rush headlong into new counter-terrorism measures after a major attack like that in Paris. Sometimes they are desperate to show that they are doing something.
Events like these are also viewed as ideal opportunities to make policy changes. As Naomi Klein shows in The Shock Doctrine, some governments have for many years exploited crises to push through controversial law or policy. A recent example is the shootings on Parliament Hill in Canada’s capital city, Ottawa, in October 2014. Although the motivations were mixed, they were unequivocally labeled as terrorist by the Harper government and were followed swiftly by a new bill, C-51, which became law in June 2015. The law expanded information-sharing and the mandate of the Canadian Security Intelligence Service.
Learning from the past
While tightened security may seem like a good plan, changing the rules and demanding greater powers for security and intelligence services in the wake of attacks may not be wise. Sober judgment, not knee-jerk responses, is called for. After the Charlie Hebdo attacks earlier in 2015, the French introduced new laws permitting warrantless searches, requiring ISPs to collect communications metadata, and trimming oversight of the agencies. This time – in November – some borders were temporarily closed and a three-month state of emergency was declared.
We saw it, classically, in the US after 9/11, but also in the UK after 7/7, where extended detention without charge and a new offence of glorifying terrorist acts appeared in the 2006 Terrorism Act. Similar events have occurred in several other countries. Unfortunately, French responses thus far follow a familiar pattern. Given France’s location, these reactions are also European ones. The porous internal borders under the Schengen agreement mean that the external European frontier is seen as vulnerable. So border checks were increased, with ramped-up databases and reinforced surveillance.
Boosting intelligence-sharing was another promise, related to the threat of ISIS and to activities such as terrorist financing. At the same time, security was enhanced in other world cities such as New York, in the wake of attacks. Some of this makes sense, but how much? Surely, as a recent CEPS report notes, a more measured approach would be to ask what has worked in the past, not just what has been done before?
Does mass surveillance work?
Does allowing intelligence agencies to collect more data – what Snowden and others call “mass surveillance” – or increasing powers of arrest and detention, and removing checks and balances, really improve things? The evidence that these produce better ways of tracking terrorists is hard to find. More is not necessarily better when it comes to data. In 2013, the White House claimed that more than 50 terror plots had been uncovered by the NSA’s methods. But in 2014 two independent reviews showed that in 255 terrorism cases investigated by the NSA, only four were the result of using phone metadata and none of the four prevented attacks.
Edward Snowden is clear about this. Reflecting on the Charlie Hebdo attacks in Paris, he pointed out that these occurred despite the mass surveillance programs introduced there in 2013. Observing that French law was now among the most intrusive and expansive in Europe, he commented that it still didn’t stop the attack. They’re simply “burying people under too much data,” he said.
After the November attacks in Paris, American agencies were quick to blame Snowden – along with internet encryption and Silicon Valley privacy policies. CIA director John Brennan lashed out, saying that the surveillance leaks had undermined security – this despite the great care Snowden took not to release any documents that might carry such risks. The New York Times argued that Paris was in fact a case of failure to act on information the authorities already had. As for encryption, the Paris attackers used none.
Police work and targeted surveillance
If one reads, now over many years, the reports of terror cells being busted or of potential terrorists being apprehended, there is almost no mention of big data or intelligence-sharing. Many criminologists insist that, by and large, conventional police work, based on targeted surveillance of suspects, is what produces results. This is why carefully considered responses to terrorist tragedies are so badly needed.
As criminologist Peter Fussey reminds us, in some notorious cases members of the public stepped in to avert tragedies – such as the shoe bomber on American Airlines flight 63, the attempted Times Square bombing and the Thalys train attack in Pas-de-Calais. As he says, community tip-offs, informants, and police doggedly following known suspects are much more likely to produce results than expanded police powers and reduced oversight. In case after case, we have seen that intelligence agencies knew about those who committed atrocities but failed to – as they say – connect the dots.
Actually studying terrorist attacks that have been thwarted, comparing them with those that succeeded – as Erik Dahl has done – leads to the discovery that what works is “specific, tactical-level intelligence – of the sort typically gathered through on-the-ground domestic intelligence collection and surveillance.” Not only this. Dahl argues that “connecting the dots” may not be the ideal solution. His careful examination of thwarted attacks convinced him that “intelligence analysis alone – especially the big-picture, long-range analysis so often prized – will not prevent surprise attacks and head off intelligence failure.”
This means that less data collection may actually produce better results – because then the authorities will be able to focus on analyzing what data they already possess. High tech solutions to terrorism may sound plausible but in reality they often compound the problem. Edward Snowden, a surveillance operative with a high level security clearance, saw this for himself at the NSA. “The reason these [Charlie Hebdo] attacks happened,” he said, was that “we had too much surveillance. We didn’t prioritize because we’d wasted too many resources watching people who didn’t present a threat.”
At the same time, indiscriminate surveillance creates new risks: innocent “suspects”, the chilling effects of everyone being tracked and checked, and the denial of democracy – which, ironically, is a victory for terrorists. Terrorism arises, it seems, from groups that despise diversity and seek national, political or religious homogeneity. The parallel suicide bomb attacks in Beirut were primarily directed at Shiite Hezbollah, which ISIS does not regard as true Muslims. It makes no sense if liberal democracies, which pride themselves on seeking fairness within diversity and on guaranteeing the rights of minorities and their most disadvantaged members, end up marginalizing dissent.
Security-driven surveillance today is very enamoured of big data ‘solutions’ – seen especially in the application of new analytics to seeking out suspects. While there may well be appropriate ways of using the so-called ‘data deluge’ created by internet and particularly social media use, the current trend is towards prediction and preemption. It’s when big data surveillance uses new modes of analytics not just to predict who might have violent proclivities, but also, as Ian Kerr and Jessica Earle warn, to take preemptive action, thus limiting their range of options – say, by placing them on a no-fly list – that privacy, due process and the presumption of innocence loom large as potential problems.
And it goes without saying that the same process that sucks up ‘suspects’ may well ingest not only those whose profile seems to suggest a wish to attack physically, but also those raising questions about government activities – including non-violent peace protesters or environmental groups. Because terrorists accept risk, precautionary measures are also deployed, making ever more people eligible for scrutiny. It is therefore not only practically important but also symbolically significant that methods of surveillance be found that do not carry a high risk of impugning the innocent.
Fears and futures
After 9/11, I argued that one of the worst outcomes of the various responses to terrorism is the fomenting of fear. Without for a moment discounting the appalling suffering and loss associated with the Paris attacks – or any others – it must be said that some responses to such atrocities are also highly dangerous. At the far end of fear-mongering is the proposal from US presidential contender Donald Trump to establish a database of American Muslims. If he were not so popular this could be discounted as fascist fanaticism.
But the trouble with many surveillance responses is that they excel at what marks surveillance today – social sorting, a process that classifies populations in order to treat different groups differently. Thus what is done requires the utmost care. Categories have consequences. Security agencies and forces have to discern between different levels of risk, but there are ways of doing this that do not automatically prejudice some populations.
When security agencies make their case for more data, more sophisticated analytics, they often make it sound as if these were neutral technologies. They hate their “bulk collection” being labelled “mass surveillance.” Neither term is adequate, but what these terms tell us is that the technologies cannot somehow be ripped out of their social-political milieu. Their names express particular positions, so the challenge is to acknowledge the differing views and subject them to assessment in terms of the goals of surveillance.
If fears are generated by terrorism then, in a reflex action, fears are further fomented by shortsighted and insensitive anti-terrorism. If we don’t want a future fractured by fear then avoiding such inappropriate actions would be a good start. And reading dire dystopian treatments of surveillance futures may not help either. While Snowden often quotes Orwell, his activities promote a different approach from that of the author of Nineteen Eighty-Four.
Making a difference
Snowden insists – and proves it by his own example – that any and all can help to make a difference. These are not problems that can be solved overnight by some hastily concocted laws or a furious rush to foreclose freedoms. Indeed, these exacerbate our situation. Surveillance today touches us all and we all need to take action, however small, to change things for the better. As Jacob Appelbaum says, we need more democracy, not less.
If we’re to make a difference, three things stand out for me. First, let’s remember what we really want – the common good, human flourishing. We can complain about what we don’t want or even worry about dystopian scenarios, but it’s the positive vision that unites. Integrated and harmonious communities have consistently been shown to foster good security. Second, let’s support coordinated political action – locally, nationally and internationally, as a new European report proposes – on what are appropriate surveillance measures. Third, let’s expand digital citizenship, making positive and creative use of the very resources that some surveillance agencies (mis)use to fuel fear. As Engin Isin and Evelyn Ruppert point out, this means both claiming digital rights for all and acting in ways that promote digital democracy.
There is an acute and growing tension between the concern for safety and the protection of our freedoms. How do we handle this? Read more from the World Forum for Democracy partnership.