We mustn’t let Silicon Valley thinking infect our NHS
Secretive COVID contracts show how big data firms are taking over our healthcare. What are they – and the British government – hoping to get out of it?
I’ve been asking the government questions about its COVID-19 data deals for weeks. But there was one particular question I thought they might be happy to answer: what are the success stories? Have we got anything from these contracts with tech giants like artificial intelligence firms Palantir and Faculty? After all, their skill in “delivering at pace in this time of crisis” was what had justified Dominic Cummings in summoning them to Downing Street in mid-March.
In response, I was pointed to the Faculty ‘dashboard’ – just at the point when the public health world was crying foul because this dashboard didn’t contain any data on community infections, so was missing outbreaks like the one in Leicester.
If the dashboard is the only thing we’re getting from the biggest transfer of our NHS data to the private sector in history, it’s pretty unimpressive. And it left me wondering, who else is getting what?
The NHS already has systems for storing information about us. This includes important but highly sensitive details about our health, race and genes. As you’d hope, they are carefully regulated.
And yet a parallel system is being built with the help of Faculty and Palantir, overseen by a new body: NHSX. Its head, Matthew Gould, formerly of the Foreign Office, has said that the project isn’t just about managing COVID-19, but about reshaping the NHS afterwards. For weeks, I’ve been asking the government what this means. With every non-answer, I’ve become more concerned.
I’m worried that what’s under way is a move to give private companies more power to decide who gets treatment and who doesn’t, and use algorithms as science-speak to bamboozle us into blaming ill-health on people without power, rather than those with it.
The government claims AI will help with the holy grail of healthcare: prevention. But we already know the factors which govern how likely you are to contract COVID-19: the behaviours of your landlord, your boss or your care home owner. Are Faculty and Palantir really being employed to reveal this basic truth? Or to obscure it?
This NHS AI work is a series of black boxes within black boxes. We don’t know how the companies were chosen. We don’t know the purpose of their work. We don’t know what algorithms they are using. We don’t know what data these are processing.
But here’s what we do know.
Last December the government agreed a contract to develop an NHS ‘AI Lab’ with Faculty, which had been involved in running the Vote Leave campaign. The work was never publicly advertised. The government refuses to tell us how many companies it invited to bid for it. And the Department of Health won’t give us any details about what the original contract was for, though a government blog suggested it would “transform” the NHS and “facilitate cross-government, industry and academic collaborations”. When COVID-19 arrived, this project was extended to cover it.
Faculty had no track record in the NHS beyond one project managing staff time in a breast-screening clinic in the East Midlands. It remains unclear why it was selected to “transform” the NHS through AI.
To see the contract, and a similar one with the CIA-linked surveillance firm Palantir, openDemocracy had to team up with legal campaigners Foxglove and threaten legal action against the government. I’ve been poring over the documents the government reluctantly released. Both are terrifyingly vague about what the firms are up to.
Meanwhile, Palantir’s role has been extended for four months, and its use by local managers has gone beyond an original focus on ventilators and oxygen to also look at routine care during the pandemic, the New Statesman reported last month. Concerns were also raised last month in the US about the parallel centralised reporting system to which the White House now insists COVID data be uploaded – and Palantir’s role in building it – and how this new system is bypassing local and specialist state decision-makers.
When we first looked at what data was being pumped into the companies’ systems, there was a list on the government website. This indicated they were acquiring highly sensitive and wide-ranging data: on race, imprisonment, diagnosis and treatment of non-COVID-19 patients. But since we started asking questions, this list has been edited to omit all these details. The government admits it is “not exhaustive”.
In Los Angeles, activists have accused a Palantir-built policing app of producing “racist feedback loops”. Whether the company is using race data in the NHS, and for what, is not a question we can afford to stop asking: allocation of healthcare resources determines who lives and who dies.
The British government’s earlier data list also seemed to contradict the Data Protection Impact Assessment (DPIA) about what was actually being uploaded, with the DPIA assuring us race data won’t be fed in. This legally required document also breezes through the risks of the project with next to no detail of how it will be audited and how breaches are to be prevented and responded to. Last month the government was forced to admit that in relation to a separate part of its COVID-19 response, the track and trace app, it had actually broken the law by not undertaking a DPIA at all.
Garbage in, garbage out
Whilst we don’t have much knowledge of what information is being fed into the data analytics industry, Michael Gove recently suggested the insights spewed out the other end should drive policy-making. Gove decried the work of civil servants trained in social sciences who “play safe”, arguing what was needed was more mathematicians given “room to progress, and if necessary, fail”.
Algorithms know us better than we know ourselves, Silicon Valley likes to tell us. Disoriented by a COVID-shocked world, we risk starting to believe such nonsense.
The truth is that what comes out of an algorithm depends on what a human chooses to put in. And we’re not getting to debate that.
Silicon Valley and its UK wannabes are interested in individual and aggregate behaviours so they can learn how to sell us things. They’re not in it to figure out how we can regulate the powerful forces making us sick. And research into these real matters of public health seems to have dropped off the government’s radar, along with any political will to tackle the inequalities that increase vulnerability to almost all serious illness, including respiratory viruses.
Efficiency vs ethics
Instead, the government has promised what is, in reality, another NHS reorganisation later this year. But it’s staying tight-lipped over what it will involve. We do know Number 10 will be represented on the taskforce overseeing the plans by Boris Johnson’s lead health advisor, William Warr, who – much like health secretary Matt Hancock – has made bold claims about the NHS being saved by the digital health and genomics industries. Number 10 is spinning that this reorganisation will just formalise what’s happening on the ground and reduce commercialisation.
But having looked closely at early drafts of these plans, and talked to experts who’ve been digging into what’s already happening, I’m worried that what’s really planned is exactly the opposite.
There are areas where more data would be useful. In the baffling labyrinth of our increasingly privatised health and social care system, someone does need to keep on top of what’s going on, from private hospitals to PPE providers to medical research. The new system does appear to be collecting some of this operational data, much of it from the private sector.
But many of these details have been deleted from the list of data being fed in. And the contracts don’t appear to guarantee the NHS enduring control over intellectual property and insights derived from this private-sector data, let alone guarantee this crucial evidence will ever make it into the public domain.
Despite the vagueness of much of the wording surrounding this deal, the contract does indicate that Palantir is allowed to negotiate its own arrangements on intellectual property with third parties going forward. Which third parties are not specified.
Two firms that are named in the documents are the global consultancy giants McKinsey and Deloitte. Experts I’ve spoken to worry that these deals won’t free hospital managers from the armies of consultants who have been shown to undermine NHS efficiency. Instead, by privatising oversight of our healthcare systems, the partnership with Palantir and Faculty could well bind the NHS ever closer to their data-driven reports and presentations. But even Big Data lacks crucial context. Data is not knowledge, nor wisdom.
COVID-19 has taught England that you can’t run a health service on lean working and a drive for efficiency at all costs. But that’s what data analytics, consultancy and Silicon Valley culture are all about. Beneath the sexy but as yet largely unfulfilled promises of robots diagnosing cancer, the majority of health AI – and certainly Faculty’s focus, to date – deals with more mundane issues, like ever-tighter workforce management and scheduling.
Last year I went to San Francisco to investigate supposedly ‘cutting edge’ technological healthcare. Demoralised nurses described acute wards run by computers that summon nurses and doctors on zero-hours contracts when enough patients’ vitals worsen. I dread to think how these tight-stretched systems coped when their wards flooded with COVID patients.
An over-focus on tech and data as the way of “saving the NHS” distracts us from the discussions we really need to have. COVID has forced us as individuals to wrestle with deep ethical issues, like what we’re prepared to sacrifice to look after more vulnerable people, how we reward those who do, whether we’re OK with the poor being so much harder hit by this virus than the rich, and whether people should be allowed to profit from pandemics.
These are political questions that need political responses, not answers churned out by the latest oracle with no interest in justice.
Over the last decade, data giants have replaced oil companies as the world’s biggest firms, with health data a particularly valuable fraction. Of course data firms, and hybrid health/data firms, are desperate to access the vast well of information at the heart of the NHS. That doesn’t mean we should give it to them. And with the government preparing to auction off the country’s assets in a post-Brexit US trade deal, we need to be vigilant.