Opinion

Thought for the day: Police must get facts right


The Soham inquiry has highlighted data-sharing problems within the police. They must now focus on existing data quality and bring it up to 100% accuracy, says Adrian McKeon. 

One of the most disturbing facts to emerge from the Bichard inquiry into the Soham murders is the apparently chronic inability of the 43 police forces in England and Wales to share data electronically and systematically, relying instead on fax, telephone and written correspondence.

Ian Huntley slipped through the gap created when police forces failed to cooperate, yet the Association of Police Authorities, Her Majesty's Inspectorate of Constabulary and the Association of Chief Police Officers could not explain why. Their failure to intervene - details of Huntley's sexual and other offences had passed through four force systems and the Police National Computer - was followed by the murders of Jessica Chapman and Holly Wells.

It remains to be seen whether the Bichard report will force the police to share data effectively. Witnesses cited a myriad of political and bureaucratic barriers to police data sharing. But are we being too hard on the police?

Probably not. They have had the budget and failed to deliver - Bichard was incredulous that nothing had been done. Questioning during the inquiry never got beyond issues of data sharing. The question, "Is the shared data accurate, and where is the evidence for this?", was never adequately explored.

Data-integration failures
Let's put that in context. Gartner estimates that more than 25% of critical data within Fortune 1000 businesses is inaccurate or incomplete. Even after identifying data quality problems, most of those businesses continue to invest in technology such as business intelligence, CRM or data warehouses, all of which are easily crippled by bad data. This probably accounts for the 50%-70% of data-integration projects that fail to deliver on client expectations.

Looking at the many IT initiatives mentioned in the inquiry, the police have persistently gone down the same route - all were to do with technology and none to do with data. Is there any reason to think things will be any different in future? And if that is the state of play in the private sector, why aren't alarm bells ringing all over the criminal justice system or the public sector in general? Will e-government just deliver well-distributed e-junk?

For example, government statistics indicate that about two million more patients are registered with GPs in England than the estimated population of 50 million, creating a strong chance that an NHS record will link via a duplicate NHS number to the wrong patient. NHS duplicates can be fatal, yet this issue has still not been properly addressed.
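
To see the scale of the problem, here is a back-of-envelope calculation in Python using the figures above. It is only a lower bound on the duplicate rate, since surplus registrations can also mask other errors such as people who are not registered at all:

    # Back-of-envelope sketch of the NHS duplicate problem, using the
    # article's figures: ~52m GP registrations vs ~50m people in England.
    registrations = 52_000_000
    population = 50_000_000

    surplus = registrations - population        # records with no matching person
    duplicate_rate = surplus / registrations    # lower bound on the duplicate rate

    print(f"Surplus records: {surplus:,}")
    print(f"At least {duplicate_rate:.1%} of GP registrations are duplicates")
    # -> At least 3.8% of registrations cannot map one-to-one to a patient,
    #    so any linkage keyed on NHS number alone inherits that error.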

A serious chance of error
A ministerial statement in 2001 revealed that there were about 83 million National Insurance numbers for an estimated UK population of 60 million. A 98% accurate multimodal biometric - ie, a combined iris, skin-texture, facial and fingerprint scan - authenticates who I am. But check me against the NI database and there is a serious chance the wrong NI/PAYE data will be linked to that biometric. Will the accuracy of the data government holds on us bring down the biometric ID card?
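
A crude calculation shows how the two error sources compound. The model below is illustrative only - the assumptions (mine, not official figures) are that the failure modes are independent and that every surplus NI record is a potential wrong match:

    # Crude illustration of how database duplicates compound biometric error.
    ni_numbers = 83_000_000          # NI numbers on record (2001 figure)
    population = 60_000_000          # estimated UK population
    biometric_accuracy = 0.98        # the multimodal figure quoted above

    surplus_rate = (ni_numbers - population) / ni_numbers   # ~27.7%
    p_correct_link = biometric_accuracy * (1 - surplus_rate)

    print(f"Surplus NI records: {surplus_rate:.1%}")
    print(f"Worst-case chance of a fully correct link: {p_correct_link:.1%}")
    # -> ~70.8%: even a 98% accurate biometric cannot rescue a database in
    #    which more than a quarter of the keys do not map to a live person.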

Gartner's message is clear. Throwing technology at data quality issues does not solve the problem. Before you can have effective data sharing you must have data of proven accuracy. Otherwise intelligence applications which might have prevented the deaths of Jessica and Holly will just deliver junk.

Right now, when bad data comes out of police IT, analysts cannot tell what the problem is, where it is or how to fix it. Yet that is the data they use to target resources. Bad data is not a problem if you know about it. It is lack of measurement of bad data which is the problem.

US computing pioneer Grace Hopper was right when she said, "One accurate measurement is worth a thousand expert opinions".

A key part of the police IT agenda must now focus on measuring existing data quality and factually determining the gap between a force's current data accuracy and 100% accuracy. Only then can you determine how this "chasm" translates to gaps in service provision and increased risk to the public. Only then will you stand a decent chance of preventing another Soham.
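
What might such measurement look like in practice? The sketch below is illustrative, not a description of any police system: the record layout and field names are invented. It samples records, verifies each field against a trusted source, and reports each field's gap to 100% accuracy:

    # A minimal sketch of measuring the data-accuracy gap by audit sampling.
    from dataclasses import dataclass

    @dataclass
    class FieldCheck:
        field: str
        checked: int = 0
        correct: int = 0

    def audit(sample, ground_truth, fields):
        """Verify each sampled record, field by field, against trusted data."""
        results = {f: FieldCheck(f) for f in fields}
        for record in sample:
            truth = ground_truth.get(record["id"])
            if truth is None:
                continue  # an unverifiable record is itself a finding
            for f in fields:
                results[f].checked += 1
                if record[f] == truth[f]:
                    results[f].correct += 1
        return results

    # Toy records standing in for a force's nominal data (names invented).
    sample = [{"id": 1, "name": "J Smith", "dob": "1970-05-02"},
              {"id": 2, "name": "A Jones", "dob": "1965-11-23"}]
    ground_truth = {1: {"name": "John Smith", "dob": "1970-05-02"},
                    2: {"name": "A Jones",    "dob": "1965-11-23"}}

    for r in audit(sample, ground_truth, ["name", "dob"]).values():
        accuracy = r.correct / r.checked
        print(f"{r.field}: {accuracy:.0%} accurate, gap to 100%: {1 - accuracy:.0%}")

Only once a force can produce numbers like these, field by field, can it say where the chasm lies and what closing it would cost.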

Adrian McKeon is managing director of Infoshare. He has 13 years' experience of data validation and matching issues in both private and public sectors


This was first published in June 2004
