openDemocracyUK

We need a strong privacy theory: a response to Stephen Taylor

Michael Birnhack responds to Stephen Taylor's notes towards a theory of privacy. He agrees that we desperately need a theory of privacy, but argues that it should extend beyond the protection of ourselves against unwanted access.
Michael Birnhack
13 February 2011

Privacy allows us to be hypocrites, and this is a good thing too. That is the intriguing argument laid down by Stephen Taylor in his notes towards a theory of privacy. He emphasises the importance of the home as the quintessential space for freedom of thought—"the privacy of our own skulls", as he puts it. He further argues that corporations and the state require some minimum level of privacy to function. I take issue with several of his claims.

It is certain that we desperately need a theory of privacy. The accelerated rush of new technologies into our lives, increasingly complicated business models, and new governmental schemes together enable the collection of more data, more accurately, covering all aspects of our lives – not only information about what we are doing, but about our very identity, our whereabouts, our bodies and our emotions. In some cases we are assured that it is all for the best; in others we are told nothing at all. Surveillance is omnipresent, and the indications are that the new technologies of the future will do much more. These are the first findings of PRACTIS, a European research group.

Explanatory theories of privacy are available, for instance American privacy scholar Daniel Solove's privacy taxonomy, or the philosopher Helen Nissenbaum's sociological account of Privacy in Context. But making sense of the reality is only the first step. What we increasingly need is a stronger privacy theory that also has a normative element. Today, the threat to privacy is not posed only by the government or private corporations. The real danger lies in the collaboration of these two, which my colleague Niva Elkin-Koren and I have called "The Invisible Handshake". The press poses another threat, and we are also a danger to ourselves, along with our network of online friends, as we tell and tag, share and publish, link and "like" our personal data.

Despite the requiems, privacy is not (yet) dead. We can take action, by designing privacy into new technologies, better regulating the processing of data and joining campaigns against new policies, such as NO2ID, or consumers' protests against particular business models (for example Gmail users protesting against Google Buzz or the many Facebook groups that challenged the social networking site when it changed its privacy policy). We can also raise awareness and improve education – but for all of these actions, we need a robust theory.

Philosophers, sociologists and lawyers have offered a raft of justifications for privacy. I like to think of them as concentric circles: the autonomous person is at the centre, with personal and professional relationships, the community and society at large in the outer circles. Framing the theories as concentric circles enables us to see that they are supportive of each other rather than mutually exclusive. Privacy supports individuals and it supports the community (despite some communitarian arguments that the former conflicts with the latter). Taylor's theory of privacy as hypocrisy focuses mostly on the community's circle and on the person's circle. As for the former, he believes that privacy as hypocrisy enables civility. For the latter, he thinks that it enables the protection of "the private space that ideology seeks to colonise", as he beautifully put it.

But there is more. We need privacy not only as a protection against unwanted access to ourselves. We need it to assure our own control of ourselves. We should be able to decide when, what, and how to provide personal information to others. Once our ability to control ourselves in this way is reduced, our human dignity as autonomous agents is violated. For this reason, privacy should not be limited to the inside of the skull. It should also cover our movements, bodies and activities. RFID tags, CCTV, mobile-phone tracking, number-plate recognition, GPS and much more already compromise the privacy of our location. DNA profiling, iris scans, facial recognition and airport body scanners compromise the privacy of our bodies. Every website that uses a cookie compromises our privacy as to what we are interested in, what we read, what we buy, and eventually, who we are.

A strong theory of privacy should base itself on the combination of the concentric circles. It should set out not only what protection we should have, but also who should enjoy it. Unlike Taylor, I do not think governments and corporations should enjoy privacy. To be sure, their activities should be protected by the law: it should be illegal to break into their offices, hack their communications or infringe their trade secrets, for example. But they do not need a general layer of privacy. Privacy is a human right, based on justifications that centre on the individual moral agent.

A strong privacy theory does not mean that all other interests are dismissed. It is often the case that there is a rival interest, be it national security, law enforcement, or simply efficiency and convenience, that justifiably overcomes that of privacy. But a privacy theory can help us establish the minimum necessary violation. And if we do choose to accept this violation, it can help us acknowledge the value of what we are sacrificing.

Michael Birnhack is Visiting Fellow at the Institute of Advanced Legal Studies, University of London, and Professor of Law at Tel-Aviv University.

