This week, on Tuesday 14th November 2017, the parliamentary Science and Technology Committee took oral evidence on the use of algorithms in decision making.
BBC News reported on the session in this article – Police warned about using algorithms to decide who’s locked up:
Police should not keep suspects locked up because a computer program has told them they are likely to be offenders, a human rights group has told MPs.
Algorithms that predict whether someone is a criminal based on past behaviour, gender and where they live could be “discriminatory”, Liberty said.
The human rights group was giving evidence to the Commons science and technology committee.
The MPs are investigating the growing use of algorithms in decision making.
They are concerned that businesses and public bodies are relying on computer programs to make life-changing decisions – despite the potential for errors and misunderstandings.
Durham Police have already launched a system which uses algorithms to help decide whether to keep a suspect in custody.
The Harm Assessment Risk Tool (HART) uses historical data on offending to classify suspects as low, medium or high risk of offending.
The tool uses information such as offending history, the type of crime a suspect has been accused of, their postcode and gender.
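A tool of this kind can be thought of as mapping those inputs to a risk band. The sketch below is illustrative only: the feature names, weights and thresholds are invented for this example and are not HART's actual model, which has not been published in the article.

```python
# Toy risk-banding function, loosely modelled on the inputs the article
# describes (offending history, offence type, postcode, gender).
# All scoring logic here is hypothetical, for illustration only.

def risk_band(prior_offences: int, violent_offence: bool,
              high_risk_postcode: bool) -> str:
    """Return a 'low', 'medium' or 'high' band from a simple additive score."""
    score = prior_offences          # each prior offence adds one point
    if violent_offence:
        score += 3                  # violent offences weighted more heavily
    if high_risk_postcode:
        score += 1                  # postcode contributes a small amount
    if score >= 6:
        return "high"
    if score >= 3:
        return "medium"
    return "low"

print(risk_band(0, False, False))   # low
print(risk_band(2, True, False))    # medium
print(risk_band(8, True, True))     # high
```

Even this toy version shows why critics object: an input like postcode acts as a proxy for where someone lives, so two otherwise identical suspects can receive different bands.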
But Silkie Carlo, Senior Advocacy Officer for Liberty, warned that such systems should be seen as “at best advisory”.
She pointed to evidence from the US which suggested algorithms in the criminal justice system were more likely to incorrectly judge black defendants as having a higher risk of reoffending than white defendants.
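The US evidence she refers to turned on comparing error rates between groups – in particular, how often people who did not go on to reoffend were nonetheless flagged as high risk. A minimal sketch of that check, using entirely synthetic records (the data and field names are invented for illustration):

```python
# Compare false-positive rates across two groups: the share of people
# who did NOT reoffend but were still flagged high risk.
# All records below are synthetic, for illustration only.

def false_positive_rate(records):
    """Fraction of non-reoffenders wrongly flagged as high risk."""
    non_reoffenders = [r for r in records if not r["reoffended"]]
    wrongly_flagged = [r for r in non_reoffenders if r["flagged_high_risk"]]
    return len(wrongly_flagged) / len(non_reoffenders)

group_a = [
    {"flagged_high_risk": True,  "reoffended": False},
    {"flagged_high_risk": True,  "reoffended": False},
    {"flagged_high_risk": False, "reoffended": False},
    {"flagged_high_risk": True,  "reoffended": True},
]
group_b = [
    {"flagged_high_risk": True,  "reoffended": False},
    {"flagged_high_risk": False, "reoffended": False},
    {"flagged_high_risk": False, "reoffended": False},
    {"flagged_high_risk": False, "reoffended": True},
]

print(false_positive_rate(group_a))  # 2/3: two of three non-reoffenders flagged
print(false_positive_rate(group_b))  # 1/3: one of three non-reoffenders flagged
```

A gap like the one between these two groups is the kind of disparity the US analyses reported: equal overall accuracy can still conceal unequal error rates.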
Durham Police stress that the algorithm’s output is “advisory” and that officers can use their discretion.
But Professor Louise Amoore of Durham University warned it can be “difficult” for a human “to make a decision against the grain of the algorithm”.
Ms Carlo said Liberty had requested data from Durham Police about how common the use of human discretion was in decisions using algorithms, but the request was rejected.