The acquisition and use of digital technologies at scale by UK police is one of three major priorities going forward, said policing minister Kit Malthouse, ahead of the Strategic Review publication.
Speaking during a webinar on the challenges and future of policing held by the Centre for Policy Studies on 22 February 2022, Malthouse agreed with the priorities set out by Michael Barber, chair of the Strategic Review of Policing in England and Wales, in his forthcoming report, which is due to be officially released in March.
“There is enormous potential in crime prevention – never mind detection – in the acquisition of technology, but like all areas of the public sector, police have not been very good at acquiring technology in a coherent way [or] at scale,” said Malthouse.
“There’s a lot of work going on across UK policing, a lot of it is driven by individual enthusiasms – by officers or Police and Crime Commissioners [PCCs] – for particular projects or particular bits of kit. We want to bring some coherence [and consistency] to that.”
Malthouse added that this work had already started, with efforts underway to centralise work around the use of artificial intelligence (AI) for crime prediction and detection, and to improve the police workforce’s data skills. Facial recognition was also highlighted by the minister as an area with “enormous potential”.
The greater focus on police technology going forward is, according to Barber – whose final report is more than 250 pages long and includes more than 50 recommendations – a response to changing patterns of crime.
“You’ll see that now 40% plus of all crime is fraud, almost all of it online. Sometimes online crime gets dismissed as a victimless crime, but actually these people are losing their pensions, their savings. It’s a devastating crime. And you can see that the traditional bobbies on the beat aren’t the answer,” he said.
Barber added that his report will set out different proposals for “police learning and training development”, which will help officers and staff develop skills in new areas, including data and technology.
On top of bringing in 20,000 extra police officers, Malthouse added that his department would look at “bringing in complementary skills” from outside law enforcement bodies to use some new technologies and deal with online or cyber crimes: “I think [that] will be critical to success.”
Malthouse previously told the House of Lords Home Affairs and Justice Committee (HAJC) in January 2022 that the use of advanced algorithmic technologies by police should be tested in court rather than defined by new legislation, arguing that new laws would be too restrictive and therefore “stifle innovation”.
During the webinar, Malthouse said that “critical to any innovation is allowing failure”, adding that allowing UK police to “experiment” and use technology “in a safe space…will help build resilience for the future”.
Speaking to the HAJC in October 2021, Karen Yeung – an Interdisciplinary Professorial Fellow in Law, Ethics and Informatics at Birmingham Law School – said a key issue with police deployments of new technologies is that authorities have started using them “just because we can…without clear evidence” about their efficacy or impacts.
On the trials of facial-recognition tech conducted by the Metropolitan Police Service (MPS) specifically, Yeung said the force’s scientific methodology was “very unrigorous”, noting that because procedures were tweaked every time a trial was conducted, “we do not have a stable and rigorous set of data on the basis of these experiments”.
Yeung further noted that the development of crime prediction tools – such as the MPS Gangs Matrix or Durham Constabulary’s Harm Assessment Risk Tool (Hart) – has been equally unrigorous, with historic arrest data being used as a proxy for who is likely to commit a crime.
Another major priority mentioned by Malthouse and Barber was setting a “clear and prioritising mission” to narrow the focus of police forces’ crime detection and prevention efforts, avoiding the need for police to deal with “social problems” that other services are better suited to handle.
Asked whether cuts to social services since the introduction of austerity in the wake of the 2008 recession could have played a role in reducing these services’ capacity, Malthouse said it was a question of closer working between different parts of the public sector.
“We are recognising that, so the PCSC [Police, Crime, Sentencing and Courts] Bill…does have this serious violence duty in there, which for the first time places a statutory duty on other public sector organisations – health, local authorities – to come alongside the police to try and prevent serious violence in any community,” said Malthouse.
“This is a recognition of the fact that you need a coalition of the willing in every geography to prevent these issues. The police will always have to deal with the consequences, but prevention is much better than cure.”
These measures – which essentially amount to new powers for police to gather and share data on people allegedly involved in “serious violence” – have been criticised by human rights and civil society groups as having the potential to undermine existing data rights and further entrench discriminatory policing practices.
There are also serious concerns, particularly among members of the medical profession, that the obligations placed on a range of public bodies, including healthcare providers, to share data with the police will erode people’s trust in those organisations and deter them from accessing essential public services, out of fear that the information will be unfairly used against them.
The final major priority mentioned was improving general leadership and professionalism, which it is claimed will help address “cultural” issues highlighted by recent events such as the murder of Sarah Everard by serving police officer Wayne Couzens, and the Independent Office for Police Conduct’s (IOPC) report on “disgraceful” misogyny, discrimination, bullying and sexual harassment at Charing Cross Police Station.
Read more about police technology
- UK police do not have the resources to properly scrutinise their deployments of new algorithmic technologies and receive very little training on how to operate the systems introduced, senior police officers and staff have told Lords.
- Criminal justice sector (CJS) bodies procuring artificial intelligence (AI) technologies should use their purchasing power to demand access to suppliers’ systems to test and prove their claims about accuracy and bias, an expert witness has told a House of Lords inquiry.
- Police facial recognition deployment resulted in four arrests, but questions remain about the necessity, proportionality and legality of the technology’s use.