digitaLiberties: Opinion

Five reasons to worry about ‘spy tech’ firm Palantir handling your NHS data

Since the British government won't tell us, we can only guess what they’re up to…

Matthew Linares and Adam Bychawski
4 March 2021, 11.33am
The UK government’s murky deal with tech giant Palantir is a terrible misstep | RayArt Graphics / Alamy Stock Photo

Never has public trust in health authorities been so important yet so fragile. In the UK, the NHS is carrying out the biggest vaccination drive in its history to inoculate millions from COVID-19. However, decades of marginalisation have left Black, Asian and migrant communities more reluctant to be vaccinated.  

We’ve seen how quickly a lack of transparency can lead to people turning their backs on public health initiatives in the past. When, in 2014, the NHS proposed expanding the data it collects on patients for research purposes, the project quickly soured because of opaque communication and failures to gain consent. 

In this context, the UK government’s murky deal with tech giant Palantir is a terrible misstep. With faith in Silicon Valley leaders at an all-time low, and the public warier than ever about data privacy, ministers have chosen to extend a data-sharing deal with a beast of online surveillance under the least scrutiny possible.

openDemocracy and Foxglove are suing the British government over its failure to consult or carry out a proper risk assessment on the Palantir deal. 


Right now, we don’t actually know what data sources are being fed into the Palantir project – the government has refused to tell us, redacting all the details. But Palantir’s track record gives us plenty of things to worry about. Here are just some of them.

Patient privacy breaches

Staff surveillance. | Håkan Dahlström, CC BY 2.0. Some rights reserved.

At the bank JP Morgan, an internal investigations team worked with Palantir to root out staff malpractice. Palantir’s tools encouraged the team to collect as much data on staff as possible. One staffer who used those tools lamented: “The world changed when it became clear everyone could be targeted using Palantir... everyone’s a suspect, so we monitored everything. It was a pretty terrible feeling.”

While we have been assured that all the data being fed into the datastore is anonymised, even supposedly anonymised data can often be re-identified once it is linked with wider data pools. That linking capability is part of what makes Palantir’s products so powerful.
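
To see why anonymisation alone is a weak guarantee, consider this toy sketch of a ‘linkage attack’. It is purely illustrative: every record, name and field below is invented, and nothing here reflects how Palantir’s actual software works.

```python
# Toy illustration of a 'linkage attack'. Neither dataset alone names a
# patient, but joining them on shared quasi-identifiers re-identifies her.
# Every record and field below is invented for illustration.

anonymised_health_records = [
    {"postcode": "SW1A", "birth_year": 1975, "sex": "F", "diagnosis": "diabetes"},
    {"postcode": "M1", "birth_year": 1990, "sex": "M", "diagnosis": "asthma"},
]

# A second, seemingly harmless dataset, e.g. a public or leaked mailing list.
named_records = [
    {"name": "Jane Doe", "postcode": "SW1A", "birth_year": 1975, "sex": "F"},
]

QUASI_IDENTIFIERS = ("postcode", "birth_year", "sex")

for health in anonymised_health_records:
    for person in named_records:
        if all(health[k] == person[k] for k in QUASI_IDENTIFIERS):
            print(f"Re-identified: {person['name']} has {health['diagnosis']}")
```

No single dataset here contains a name next to a diagnosis, yet the join recovers one – which is exactly what ‘linked amongst wider data pools’ means in practice.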

Entrenched racial bias

Feedback loops in policing. | Chris Yarzab, CC BY 2.0. Some rights reserved.

Palantir’s software has been used by multiple US police departments. It works by collecting and linking millions of digital records to create a searchable database. That data is then run through an algorithm that can flag possible criminal suspects or identify crime hotspots. 

However, researchers have raised concerns about potential flaws in the technology that could lead to people being falsely suspected of crimes. In Los Angeles, each time an individual was stopped by the police they were added to Palantir’s system and given a ‘point’ on their digital record. 

Such a system risks creating a feedback loop: people in over-policed neighbourhoods are “more likely to be stopped, thus increasing their point value, justifying their increased surveillance, and making it more likely that they will be stopped again in the future”, said Sarah Brayne, the author of ‘Predict and Surveil: Data, Discretion, and the Future of Policing’, in The Intercept.
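
The mechanism is easy to demonstrate. The following toy simulation is invented here for illustration – it is not Palantir’s actual scoring model – but it captures the dynamic Brayne describes: stops beget points, and points beget stops.

```python
import random

# Toy simulation of the feedback loop described above: every police stop
# adds a 'point', and a higher point total makes the next stop more likely.
# This is an invented illustration, not Palantir's actual scoring system.

random.seed(42)
points = {"resident of over-policed area": 1, "resident elsewhere": 0}

for _ in range(100):  # 100 rounds of patrol decisions
    for person in points:
        # The chance of being stopped grows with the existing point total.
        stop_probability = min(1.0, 0.05 * (1 + points[person]))
        if random.random() < stop_probability:
            points[person] += 1  # each stop adds another point

print(points)  # the initial one-point gap compounds into a large one
```

Run it and the person who started just one point ahead ends up with far more points, because the system treats the records it creates as fresh evidence.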

Feedback loops like those discovered by researchers in Palantir’s US policing software could reinforce pre-existing inequalities in health systems. For example, Black people in the UK are more than four times as likely to be detained under the Mental Health Act and more than ten times as likely to be subject to a community treatment order.

Might an algorithm further increase the likelihood of inappropriate health interventions?

Targeting of migrant adults – and children

Palantir’s software was described as “mission critical” to US border enforcement. | Alex Milan Tracy/SIPA USA/PA Images

Palantir has attracted controversy for its involvement with the US Immigration and Customs Enforcement (ICE) agency, which is responsible for the removal of undocumented immigrants. The Trump administration tasked ICE with increasing deportations, leading to a spike in the number of people being held in detention centres.

The agency has been repeatedly accused of racial profiling and violations of human rights. Palantir has provided technology for ICE’s systems that was used to “execute harmful policies targeting migrants and asylum-seekers”, according to Amnesty International.

The human rights group found that the tech company’s software was used to plan the arrest of hundreds of people and the separation of children from their parents. 

Amnesty concluded that Palantir failed “to put an adequate due diligence process in place” and “to meet its responsibility to respect human rights”.

This damning verdict on the tech company’s ethical standards raises questions about its suitability to handle sensitive NHS health data. 

The British government’s ‘hostile environment’ approach to immigration has already resulted in a number of people, including refugees and asylum seekers, being denied urgent care, often in breach of eligibility rules.

Mission creep

In more than one case, Palantir projects have been shown to reach beyond the scope of their contracts or to escape public oversight.

As noted above, when JP Morgan employed Palantir, the relationship ended badly when the bank’s own bosses discovered that the firm’s surveillance had extended to the bosses themselves.

The firm was again embroiled in overreach when the New Orleans Police Department used Palantir software for invasive police work without the knowledge of the city councillors responsible for the jurisdiction.

No public debate

Even if the UK government publishes the NHS contracts without critical redactions, and carries out full risk assessments, we should not assume that this will reveal the full extent of Palantir’s activity.

As citizens, we shouldn’t have to guess how our data might be used by the government and its partners. We deserve not only to know – but to be consulted before big deals like this get struck, not afterwards. 


We’re bringing an urgent legal challenge demanding public consultation on this massive deal. To do this, we need to raise funds to cover our cost risk. Can you back us?

