Government auditors last week published the results of a rare year-long investigation into what makes IT-based change programmes successful – and the report vindicates Computer Weekly’s three-year campaign for more open communications and accountability.
Among the successes cited by the National Audit Office was a £297m system built mainly by EDS to pay pension credits, using the slogan “Lifting pensioners out of poverty”.
The report said, “After initial difficulties the Pension Credit Team built strong open relationships with and between its main suppliers… Each side was open about the risks they perceived in the programme.”
It also emphasised the need to hold people and departments to account. “Many of our recommendations focus on the need for greater stewardship and accountability within individual government departments and across Whitehall more generally,” said the National Audit Office.
In another case study, summarising a £48m system to speed up criminal justice processes in Northern Ireland, officials took the innovative step of paying £250,000 to each shortlisted supplier to run pilot projects, so that suppliers' performance could be measured before large contracts were signed.
The report also highlighted the Clinger-Cohen Act – US legislation that compels the public sector to adopt good practice. Computer Weekly has campaigned for similar legislation to be enacted in the UK.
The National Audit Office said the US act mandates the employment of chief information officers who have a set of “key responsibilities enshrined in legislation and clearly defined duties to ensure a consistent approach across federal government”.
The need for a UK version of the act is underlined by the report. As well as highlighting the reasons for successes in projects in the public and private sectors, it identifies deep-rooted systemic weaknesses in the running of major IT-based programmes in central government, including failures of internal and external communications, not adhering to good practice, and skills shortages.
Such fundamental defects raise questions about whether anyone knows if the £2.5bn spent each year on IT by central government is a generally good or bad use of public money.
Auditors have struggled to find examples of success in central government – most case studies are of smaller-scale projects, which have clear benefits that can be described in one or two sentences. There is not a single example of a successful project at, for example, HM Revenue and Customs, which spends about £250m a year on IT.
In often diplomatic language, the report also picks out some anarchic practices. It finds that supervision is lacking, information about progress or failure is not passed to the people who need to act on it, ministers are not always adequately informed, and large numbers of senior responsible owners – those in charge of projects – spend less than 20% of their time in the role.
Indeed, about half of those owners are doing the job for the first time, often running what the government officially describes as “mission-critical” programmes.
The most striking findings do not end there. Auditors found that there are not enough people with the right skills to manage critical, high-risk programmes – particularly worrying since the government is currently running 91 mission-critical projects and programmes.
These include the ID cards scheme, estimated by government to cost about £4.5bn over 10 years, and the NHS’s £12.4bn National Programme for IT.
Concern about a lack of skills and business resources was the biggest worry for those carrying out Gateway reviews – regular independent reports on risky IT ventures by a team appointed by the Office of Government Commerce.
By July 2006, according to the latest available figures from the National Audit Office, four out of five mission-critical projects had received red or amber warnings in Gateway reviews. But a red light does not always stop a project – the failed Single Payment Scheme for farmers, run by the Rural Payments Agency, carried on despite three successive red lights at Gateway reviews.
The auditors also found that some business cases are less than professional, written solely as a means to secure funding, rather than setting out how business change will be achieved, what the benefits will be and, critically, what machinery will be put in place to ensure benefits are achieved.
One auditor said that business cases were sometimes approved on little more than the basis that they were written by people who are “good at writing business cases”.
And although Gateway reviews are supposed to be mandatory, some departmental executives pick and choose whether to do them. Only a tiny number of projects are subjected to Gateway level five reviews, which would establish whether a department or agency’s IT programme or project has made any real difference to public services.
Of all the projects that pass through Gateway reviews, only 5% are subjected to level five checks.
All this means that departments have no real idea if the money they are spending on IT is being used wisely. But the gulf in public accountability is not described as such by auditors, who prefer the language of diplomats. They say in their report that there is a “clear incentive – and need – for departments to be in a position to know how well they are using their assets”.
A further deficiency highlighted by auditors is that the Office of Government Commerce – whose chief executive, John Oughton, reports to the prime minister – does not “monitor the status of individual programmes and projects over time to discover any apparent shortfalls in performance and to ensure that the necessary interventions are identified and monitored”.
In contrast, successful projects were often characterised by an openness and a willingness to accept bad news.
In the case of a Payment Modernisation Programme, also involving EDS and the Department for Work and Pensions, ministers met representatives of voluntary and community groups to hear their concerns – which turned out to be more useful than ministers reading an optimistic briefing note on the programme’s benefits and risks prepared by civil servants.
The report’s researchers went to considerable lengths to make the NAO’s findings authoritative. They spent a year conducting research, including a visit to the Government Accountability Office, the National Audit Office’s US equivalent. They finally chose 24 success case studies.
John Bourn, head of the National Audit Office, said, “IT projects in the public sector have too often been associated with failure – this report provides an opportunity to change that.”
But it appears unlikely the report will be acted on. The National Audit Office has previously issued recommendations to pre-empt the failure of IT projects, in reports dating back to the 1980s, but, as auditors themselves have said, lessons tend not to be learned.
Why is the NAO reporting on government IT successes?
MPs on the House of Commons Public Accounts Committee have in the past examined many failures and identified what has gone wrong in government IT projects.
This report from the National Audit Office aims to draw out lessons from successful central government IT programmes.
In the NAO’s own words, “The purpose of this report is to demonstrate, drawing on a range of case studies from central government and elsewhere, how such successes have been achieved, to enable lessons to be transferred to new and existing programmes and projects in government.”
What are Gateway reviews?
All high-risk IT-based projects in central government are put through Gateway reviews to assess their progress and the risks they carry. The reviews are conducted by teams from the Treasury’s Office of Government Commerce.
There are five types of OGC Gateway reviews during the lifecycle of a project – three before contract award, a fourth just before going live, and a fifth that is supposed to confirm a project’s business benefits.
The National Audit Office has found that although all Gateway reviews are mandatory, some departments and agencies do not put their projects through all the necessary reviews.