openJustice

Price and prejudice: automated decision-making and the UK government (podcast)

We spoke to lawyer Cori Crider about why automated decision-making in government is a major challenge of our time.

Oscar Rickett
4 March 2020
An algorithm can be so complex that it cannot be meaningfully comprehended by lawyers, judges or the public at large. This is one of several challenges in responding to built-in biases in automated decision-making.
Photo by Zoltan Tasi on Unsplash. Some rights reserved.

The first legal challenge against the government's use of an algorithm is currently going through the UK courts. The algorithm, which was rolled out in secret, is being used by the Home Office to “stream” visa applications according to their supposed level of risk.

In assessing the level of risk, the Home Office takes into account the applicant's nationality. This essentially amounts to "speedy boarding for white people", say the Joint Council for the Welfare of Immigrants and the not-for-profit Foxglove, who are bringing the litigation.

In the fourth and final episode of our podcast series, we spoke to Cori Crider, a lawyer and the director of Foxglove, about how algorithms can lead to poor-quality and biased decision-making. We also hear about the barriers civil society faces in scrutinising automated decision-making, and why this is a major challenge of our time.

This podcast is part of the Unlawful State series where we investigate unlawful decision making by the UK government and hear from pioneering NGOs using the law to tackle the problem. See here for more.

