Why government doesn't keep statistics on IT failures

A blog reader asks:

“Do you know if anyone has done a report on the following questions? Shouldn’t they have?

1. How many government computer projects intended to provide nationwide cover have been delivered on time and/or to budget and/or perform roughly to spec?

2. What proportion of the total relevant budget (i.e. the department or sub-department the money comes from – e.g. Libra came out of overall Department of Constitutional Affairs’ budget but may have been ring-fenced in magistrates courts’ budget) do the failed contracts represent?

3. Is there a correlation between overspend and/or non-performance and the company doing the IT and/or the type of company chosen and/or how it was chosen (e.g. preferred bidder, etc)?

Any information gratefully received.”

My response:

These are good questions that ministers should be asking but, to my knowledge, don’t; or if they do, they are likely to be told by senior civil servants that the information is not available.

One problem is that information doesn’t appear to be gathered centrally on which projects are doing well and which are not. A bigger problem is that, within Whitehall’s vocabulary, words such as “failure”, “disaster” and “problem” are considered rude unless they refer to the past, an external supplier or something that doesn’t involve central departments.

And it’s permitted for HM Courts Service to talk about Libra as a “failing” project only because it has now been rescued.

So there are no statistics on government IT failures, largely because failure is a concept dreamed up by the media. And stars are little lamps that sometimes fall into the sea.


IT projects – links to some of the more important reports on mistakes, incompetence and lessons learned

HM Courts Services turns around £447m “Libra” IT system

Statistics and the NPfIT “success”

Libra – lessons learned from its bounce-back