digitaLiberties: Opinion

Suspicious algorithms: time to tame crime-predicting police technology

Garbage in, miscarriage of justice out – especially if you’re poor or Black

David Renton
18 April 2022, 7.00am

Turning bias into data | Jochen Tack/Alamy Stock Photo. All rights reserved

Data is never pure. Errors creep in. It’s annoying when bad data dents your credit rating; when it tells the police you’re likely to be a criminal, it could ruin your life.

Criminal convictions have been obtained because courts were told that data was infallible when in fact it was full of mistakes. Indeed, the most important single miscarriage of justice in the UK’s criminal justice system in the past quarter of a century began when the Horizon accounting system falsely reported discrepancies and non-existent shortfalls in Post Office branch accounts.

Because Horizon was believed over branch managers, 736 sub-postmasters and sub-postmistresses were prosecuted for fraud between 2000 and 2014. In December 2019, the Post Office settled with 555 claimants for a total of £58m, with the court finding that the Horizon system suffered from around 30 main “bugs, errors and defects”. In spring 2021, the Court of Appeal overturned 72 of the convictions. For some, it was too late: by early 2022, 33 of those wrongly prosecuted were dead, four of them reportedly at their own hands.

Those responsible for the disaster fell upwards: the Post Office CEO was made a CBE for her “services to the Post Office and to charity” and moved on to a role as a non-executive board member of the Cabinet Office. At least, though, you might think that people working in law enforcement would have drawn a lesson from such a remarkable failure of both a computing system and the justice system. Yet criminal law-enforcement bodies are buying more and more electronic applications.

The broad context is familiar. The four richest companies in the US by share value are Microsoft, Apple, Amazon and Google. They engage in what sociologist Shoshana Zuboff calls “surveillance capitalism”: they collect data about their users and develop techniques for extracting that information and selling it to advertisers. Surveillance capitalism offers those advertisers more effective tools for knowing how we eat, live and shop, and for making us more active consumers of their products.

Less familiar is how those models of technology then migrate from the private sector to the public one.

Under conditions of neoliberalism, data processing has a certain recurring style. Even for public-sector work, it tends to be done by private companies that guard the content of their algorithms, insisting they must be kept secret from rivals.


Further, the algorithms generate feedback loops, so that when an error leads to a penalty (for example, if an application wrongly records a person in a database as a criminal suspect), the mistake is copied into other datasets until it becomes impossible to identify where the first error was made.
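
To make that mechanism concrete, here is a minimal sketch in Python of how a single wrongly entered ‘suspect’ flag might be copied from one dataset to another until its origin is lost. Everything in it (the dataset names, the records, the sync routine) is hypothetical and stands in for no real police system.

```python
# A minimal, hypothetical sketch of how one data-entry error propagates.
# None of the names or records below refer to any real system.

intelligence_db = [
    {"person_id": 101, "suspect": False},
    {"person_id": 102, "suspect": True},   # the erroneous flag enters here
]
custody_db = [{"person_id": 102, "suspect": False}]
risk_tool_db = [{"person_id": 102, "suspect": False}]


def sync(source, target):
    """Copy suspect flags from one dataset to another, losing provenance."""
    flags = {rec["person_id"]: rec["suspect"] for rec in source}
    for rec in target:
        if rec["person_id"] in flags:
            rec["suspect"] = rec["suspect"] or flags[rec["person_id"]]


sync(intelligence_db, custody_db)   # the error is copied once
sync(custody_db, risk_tool_db)      # and again, now two steps from its source

# Person 102 is now recorded as a suspect everywhere, with no trace of why.
print(risk_tool_db)
```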

In recent years, police and probation services have used automated services more and more to help identify suspects or predict reoffending rates. In a report published last month, the Justice and Home Affairs Committee of the British parliament’s House of Lords gives examples: Avon and Somerset Police has bought a tool called Qlik Sense to predict criminal trends; the Home Office uses an algorithm to review applications for marriage licences in order to predict which marriages are ‘sham’ and so to justify refusing visas; and Durham Constabulary has been using a Harm Assessment Risk Tool, which uses machine learning to predict who is likely to commit crimes and should therefore be more closely monitored.

The committee speaks of the “benefits” of these schemes in terms of increased efficiency. It warns, however, that technologies of punishment and control are developing faster than the legal safeguards needed to prevent injustice. For example, the committee says it is unusual for a criminal defendant to be told whether the investigation against them was based on a recommendation from a computer application. It also speaks of a “vicious circle”, in which an application pre-emptively identifies groups of people living in particular areas as being at risk of becoming criminals, leading the police to monitor their lives closely, with that monitoring in turn generating data that justifies further investigation.

The authors note, but do not properly explain, that several of the algorithms recently purchased by police forces mimic others that have been controversial. One much-publicised application, widely used in the US (and in Kent between 2013 and 2018), is PredPol, which uses historical crime data to predict, on an hour-by-hour basis, where crime might be committed. Police forces are given maps with boxes marked in red to show where crime should be expected. Sites of past burglaries can be patrolled, and possible crimes pre-empted, until the incidence of that offence – and crime in general – has declined.
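
The general idea behind such tools can be sketched in a few lines: bucket recorded offences by map grid cell and time window, then flag the busiest cells for patrol. The Python sketch below is only an illustration of that grid-based hotspot counting, on hypothetical records; it is not a description of PredPol’s own method.

```python
from collections import Counter

# Hypothetical offence records: a map grid cell (x, y) and an hour of the day.
past_offences = [
    {"x": 3, "y": 7, "hour": 22},
    {"x": 3, "y": 7, "hour": 23},
    {"x": 1, "y": 2, "hour": 9},
]


def hotspots(offences, hour, top_n=1):
    """Return the top_n grid cells with the most recorded offences at this hour."""
    counts = Counter((o["x"], o["y"]) for o in offences if o["hour"] == hour)
    return [cell for cell, _ in counts.most_common(top_n)]


# Cells returned here would be the red boxes drawn on a patrol map.
print(hotspots(past_offences, hour=22))   # [(3, 7)]
```

Note that the input is recorded offences rather than offences actually committed, so the flagged cells reflect where police have already looked as much as where crime actually occurs; this is the feedback loop the Lords committee describes.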

We can see the legacy of PredPol in some of the newly acquired algorithms. Qlik Sense, used in Avon and Somerset, mimics its focus on the local geography of crime and its changing local trends.

There are good reasons, however, why PredPol has become controversial. First, its supporters have insisted that predictive policing will make ordinary citizens safer. That claim has been based, in turn, on the theory of ‘zero tolerance’: that if only you could stamp out petty crime and anti-social behaviour, serious offending would also decline, and that both could be achieved without social cost. “Misdemeanor crimes,” according to PredPol’s training materials, are “seen as the gateway to more serious crimes.”

In his international bestseller ‘Humankind’, Rutger Bregman has shown how poorly thought-through were the studies upon which zero-tolerance policing was based. It began with a series of experiments in which psychologists deliberately created conditions intended to maximise the incidence of crime, and then recorded (unsurprisingly) that it had risen. Meta-analyses of zero-tolerance policing show that there is in fact no correlation between its aggressive approach to petty crime and reduced violence. Increasingly arbitrary and authoritarian policing – inevitable with a zero-tolerance approach – produces not less offending but more discontented people.

The problem is that the most minor offences, such as petty theft or drug possession, are the ones whose enforcement is most skewed by demography. A middle-class teenager is likelier to have their own bedroom, maybe at some distance from their parents’ room, where they can smoke cannabis or drink alcohol underage. A working-class teenager will more often share with siblings and so is more likely to smoke or drink on the street. The middle-class criminal is invisible to app-driven policing; the working-class criminal is seen and punished. Being Black and working class puts you in the police’s gaze still more.

When advocates of zero-tolerance policing tell officers to divert resources away from the policing of serious offences towards misdemeanours, they are concentrating police resources on the parts of the job where there is the greatest disparity between actual offending and its enforcement.

When the House of Lords committee speaks of a vicious circle of junk data producing bad policing, we can be clear about who is likely to suffer: ethnic minorities and the poor.

To take another example, the Harm Assessment Risk Tool used by Durham Constabulary borrows from an older generation of algorithm, the COMPAS scale, marketed by Equivant, which is meant to calculate an offender’s likelihood of reoffending. In the US, tools of this sort are central to the process of judicial sentencing. In Wisconsin, for example, a convicted person’s score on the COMPAS scale is used to calculate a term of imprisonment.

In 2016, the investigative journalism group ProPublica assessed COMPAS by comparing the app’s predictions for 20,000 real-life defendants – half of them Black, half white – with what they actually did. “Black defendants,” the study found, “were far more likely than white defendants to be incorrectly judged to be at a higher risk of recidivism, while white defendants were more likely than Black defendants to be incorrectly flagged as low risk.”
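
The kind of comparison ProPublica made can be expressed very simply: for each group, work out how often people who did not in fact reoffend were nevertheless scored as high risk. The Python sketch below shows that calculation on a handful of made-up records; the data and the numbers are purely illustrative and are not ProPublica’s.

```python
# Purely illustrative records: a risk prediction, the actual outcome, a group label.
records = [
    {"high_risk": True,  "reoffended": False, "group": "A"},
    {"high_risk": False, "reoffended": False, "group": "A"},
    {"high_risk": False, "reoffended": False, "group": "B"},
    {"high_risk": False, "reoffended": True,  "group": "B"},
]


def false_positive_rate(recs, group):
    """Share of the group's non-reoffenders who were wrongly scored high risk."""
    non_reoffenders = [r for r in recs if r["group"] == group and not r["reoffended"]]
    return sum(r["high_risk"] for r in non_reoffenders) / len(non_reoffenders)


# A large gap between the groups is the kind of disparity ProPublica reported.
for g in ("A", "B"):
    print(g, false_positive_rate(records, g))
```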

Equivant, which markets COMPAS, disputes ProPublica’s findings and insists that its method is robust.

Critics argue that the data points employed by COMPAS are not neutral but reflect ‘common sense’ misconceptions about how crime comes about. Take, for example, the age at which a person first came into contact with the police. If you ask only that one question and treat the answer as authoritative, neutral data, there may well be a weak correlation between first contact and subsequent criminal history. But suppose one group of people comes into contact with the police at 13, while their contemporaries are first monitored at 18 or 19, because officers were operating on unspoken but real assumptions about racial proclivities towards crime. What the app is then picking up – and potentially reinforcing – is bias on the part of officers, not any understanding of why crimes happen.
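
The point can be made mechanically. If the age of first police contact is driven by how closely a group is watched, then any rule built on that feature reproduces the watching, not the offending. In the entirely synthetic Python sketch below, two groups are assumed to behave identically, but one is monitored from an earlier age, and a naive threshold rule flags only that group.

```python
# Entirely synthetic: two groups assumed to behave identically, but one is
# monitored earlier, so its recorded "age of first police contact" is lower.
people = (
    [{"group": "heavily policed", "first_contact_age": 13} for _ in range(5)]
    + [{"group": "lightly policed", "first_contact_age": 18} for _ in range(5)]
)


def risk_flag(person, threshold=16):
    """A naive rule: early first contact is treated as a sign of future offending."""
    return person["first_contact_age"] < threshold


flagged = {"heavily policed": 0, "lightly policed": 0}
for p in people:
    if risk_flag(p):
        flagged[p["group"]] += 1

# The heavily policed group is flagged every time; the other group never is.
print(flagged)
```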

The Lords call for a strengthened legal framework, note that the Council of the European Union is discussing the regulation of artificial intelligence, and suggest that a similar discussion needs to take place here. They also propose that the government should ban the deployment of certain applications unless they have been actively checked.

Their most important suggestion, however, is that there needs to be a register of algorithms, ideally covering the entire public sector, but at a minimum covering the algorithms whose use affects the criminal justice system. So far, the government appears to prefer a light-touch approach, in which the police can do what they like or, at the most, private companies are encouraged to provide such information as they choose to submit voluntarily.

We already have a Food Standards Agency, whose job is to ensure that foods sold in the UK do not damage the health of the people who eat them. We have a Medicines and Healthcare Products Regulatory Agency. You may not particularly like the thought of the continuous expansion of the regulatory state (and I, for one, do not). But if you do not have a system of checks and balances, then the practical reality is that police power expands and the courts do what they have always done: convict the innocent and remedy the worst abuses of power only after the event, when the public outcry is so loud that they have no choice.
