Uncritical use of algorithms in justice can undermine public trust and individual rights
A Law Society report has raised concerns about the use of algorithmic systems in decision making in the criminal justice system
The “uncritical and unexplained” use of algorithms can put the integrity of the justice system and fundamental human rights at risk, a report by the Law Society has warned.
The use of algorithmic systems in tasks ranging from everyday purchasing decisions to the administration of justice can bring benefits, but biased or oversimplified data can lead to a series of negative outcomes, according to the Law Society's public policy commission report on algorithms in the justice system.
The report is based on evidence from more than 75 expert interviews and more than 80 written submissions from academics, police forces and civil society bodies.
The consequences of using biased or oversimplified data can include discriminatory decisions, a shallow understanding of complex issues and a lack of long-term analysis, according to the year-long investigation.
As a result, individuality and autonomy, as well as human rights such as privacy and freedom from discrimination, could be at risk, the report noted, adding that procedural flaws such as unfair trials become increasingly likely.
Reduced transparency in decision-making is another challenge stemming from the inappropriate use of algorithms, the report said, leading to a lack of proper scrutiny and greater potential for abuse of power.
The report called for significant investment to support the ability of government bodies to understand where it is appropriate to use algorithmic systems and how to deploy them responsibly.
According to the report, the technologies should also be managed under a legal framework, built on consensus across the supply chain, from suppliers through to the criminal justice agencies and other stakeholders using algorithms.
The protections related to algorithmic systems in the Data Protection Act, as well as existing regulations around fairness and transparency of activities in the justice system, should also be strengthened, the study added.
Algorithmic systems in the criminal justice system should be controlled, open to amendment and subject to public scrutiny, the report noted, and should be tested and monitored for relevant human rights considerations.
Other recommendations made in the study include improving oversight of algorithms in criminal justice through the creation of a range of new mechanisms and institutional arrangements.
The Law Society report echoes previous concerns about the use of algorithmic decision tools to support criminal justice.
Research by the independent think tank the Police Foundation argued that regulation of data-driven technologies should become the norm as volumes of digital forensic evidence rise and public demand to engage with the police online grows, all at a time of reduced budgets.
Read more about IT in the criminal justice system
- Use of iris scanning and facial recognition software is part of a wider crackdown on drug trafficking in UK jails.
- PAC doubts justice system transformation programme will be a success.
- Fully digital services are among the highlights of a world-first transformation initiative at the Crown Prosecution Service.