Unanswered questions on NHS IT

The National Audit Office report into the National Programme for IT in the NHS leaves gaps over the cost and scope of the project, systems delivery, and the buy-in of clinical staff

The National Audit Office report into the National Programme for IT in the NHS has, for reasons that are unclear, omitted some of the most challenging aspects of the programme.

Some important subjects are mentioned in passing or not explored by the report, the contents of which were the product of prolonged negotiation between the National Audit Office and the Department of Health.

Questions not fully covered by the 60-page report range from the cost and scope of the project to systems delivery and oversight of the project’s progress.

Cost and scope

The National Audit Office’s report makes no mention that a Downing Street meeting, chaired by the prime minister Tony Blair in February 2002, gave the provisional go-ahead for an NHS national IT programme that would last two years and nine months. Nor does it mention how the NPfIT came to be a programme lasting at least 10 years.

Other key issues that the National Audit Office report did not mention include:

● A “Project Profile Model” completed by health officials in 2002 to assess the risks of the NPfIT, which scored the programme at 53 out of a maximum possible 72 and put the total whole-life project costs at £5bn. Any score of 40 or more represents a high-risk project. The National Audit Office report’s estimate for the total whole-life project cost is now between £12.4bn and £12.6bn.

● The business justification for the NPfIT was not produced in draft form until the summer of 2003, a year after the programme had successfully passed a Gateway review – an independent assessment by the Office of Government Commerce of the project’s feasibility.

● How much money the Treasury has committed to the programme so far and whether there is certainty that the remainder of the funds will be made available.

● Whether the £12bn estimated cost of the programme represents value for money, or whether the hundreds of millions of pounds spent so far represent value for money.

● Whether a potential £1.5bn for central management costs is reasonable.

● The costs to NHS trusts of the repeated deferrals and cancellations of dates for going live with new nationally-bought systems. In many cases old contracts have had to be renewed, sometimes at a higher cost than before, or interim systems acquired; staff have been stood down, planning time wasted and training repeated.

● There is concern among NHS trusts over the affordability of the programme in general over the medium or long term. Connecting for Health has itself said that the NHS funding shortfalls are a “hot topic”. But the report reflects only the concern within NHS trusts about whether they can afford new x-ray technology – picture archiving and communications systems. It does not mention uncertainty about the local funding for the national programme in general.

● The extent of financial commitments by the NHS to suppliers, or the level of fines faced by the NHS if it breaches the contractual promises made on its behalf by the Department of Health.

● Whether the NHS has paid any fines to suppliers or whether any penalties are due to be paid.

● Suppliers may be paid most of the contract price even if NHS staff do not use the new systems. The National Audit Office report commends Connecting for Health for ensuring that up to 15% of the supplier’s charges are based on usage. But it does not mention whether the supplier can be paid up to 85% of the contract’s value if the systems are not used.

● The cost of compensation paid to supplier EDS for cancelling its contract to provide e-mail services, or whether the replacement service from supplier Cable & Wireless has attracted many more active NHS users than EDS would have been likely to attract by now had its contract not been cancelled.

● Why exactly BT was paid an extra £69m for broadband “N3” connections, why a further £90m is necessary for integration, and why neither sum was included in the original contract.


Any major IT project needs thorough oversight. Its progress should be transparent to the sponsors of the project and, if new systems are to be adopted enthusiastically, to the eventual end-users of what is supplied.

The National Audit Office report is part of that process, but it does not refer to important studies that could allow others to evaluate the project’s progress. Missing from the report are:

● The results of Gateway reviews by the Office of Government Commerce. The NPfIT has been the subject of several Gateway reviews, which assess how a project or programme is progressing at various stages. In the past, the National Audit Office has reported details from Gateway reviews on, for example, IT systems to support the Criminal Records Bureau and tax credits.

● The results of independent reviews commissioned by the government, such as a report on the NPfIT by management consultancy McKinsey.

● Registers of the NPfIT’s risks and how each risk is assessed to reveal the effect on the future of the programme should the risk become fact.


The audit office reported on the successful deployment of some core parts of the NPfIT and noted delays to others, including the integrated NHS Care Records Service, and slow take-up of the politically sensitive Choose and Book system. The report does not mention:

● Most of the 261,983 bookings of hospital appointments made using a Choose and Book system up to 3 April 2006 were made on the telephone, in many cases because GPs were unable to arrange bookings online.

● A shift away from national or “strategic” systems towards “tactical” or standalone technology in local areas, in part because of the difficulties of providing a country-wide networked system. This move is reflected in the board papers of NHS trusts but not in the report of the National Audit Office.

● The difficulties of an implementation at Nuffield Orthopaedic Centre – although audit office staff visited the hospital earlier this year. The trust was the first in the south of England to go live in a roll-out of a basic version of the national programme’s Care Records Service – a version which does not share records with other hospitals across England.

The implementation led to some operations on patients being postponed. The go-live has also led to changes in the implementation plans for the south of England, in part to allow for extra testing. In May, the trust’s director of nursing and operations said Nuffield in some cases had been unable for nearly four months to “identify the patients who need to be brought forward for treatment within national access timescales”.

● The danger of systems for electronic health records being installed but not being used – an issue raised by government health officials in the US.

● Whether the long-term aim of a national care records service, the main component being a shared electronic health record, is likely to materialise.


The National Audit Office report pointed to criticisms by some clinicians and other NHS staff of the NPfIT, but there are important omissions in what it said. For example, there is no mention of:

● The existence and work of the National Clinical Advisory Board, which was disbanded after the premature departure of Peter Hutton, its chairman. Hutton and the board were in part responsible for ensuring that the NPfIT was what clinicians wanted and needed, but were critical of aspects of the programme.

● A “key message” from a Mori survey of NHS staff commissioned by the NPfIT. It found that those interviewed, including doctors, “are much more favourable towards the future goals of the programme than they are towards the programme in its current shape”.

The audit office said it may do another report on the NPfIT in the future.

The challenge of objectivity

The National Audit Office received a plethora of letters, papers and other documents on the NHS’s National Programme for IT (NPfIT), some of them containing strong criticisms of aspects of the scheme.

But few of the criticisms have made their way into the main body of the National Audit Office’s report on the programme. The report is generally positive about the management of the NPfIT, and the criticisms of those who submitted information have been diluted and summarised in a generalised form in an appendix at the back of the audit office’s report.

Those who provided information to the National Audit Office included members of the British Computer Society, the Worshipful Company of Information Technologists and people who have represented organisations set up by Connecting for Health, which runs the NPfIT.

National Audit Office staff are likely to have been helped in understanding what is important by the placement of one of their senior executives as an observer on the programme board of Connecting for Health for two years. But this has its pros and cons. The placement gives auditors a valuable insight into the programme’s vicissitudes. It also allows the audit office to give Connecting for Health’s executives an insight into the lessons from other government IT projects.

But it leaves the National Audit Office vulnerable to a perception of a potential conflict of interest. Could the National Audit Office criticise a programme that has been advised by one of its senior executives, even if he did not take part in decision making?

In the end, the audit office has published a report which contains less criticism of the programme than some speeches and media interviews given by Richard Granger, chief executive of Connecting for Health. Where the report does criticise, its comments are mild or tend to be offset by a positive statement. In contrast, the praise is effusive.

One of the biggest difficulties for the National Audit Office has been reporting on a project that is not yet halfway through its 10-year life.

Usually the National Audit Office conducts post mortems on projects that have ended, for good or ill. To what extent do auditors want to criticise a project that is continuing, and in which the support of the stakeholders is critically important to success?

Health officials in Whitehall are sensitive to criticism, in part because they do not want to feed widespread scepticism over the work carried out so far on the NPfIT. And auditors, by writing an uncritical report, cannot be accused of fuelling scepticism among clinicians.

But if the National Audit Office writes an unbalanced report, which reflects the strengths of a programme and few of its weaknesses, is it doing its job well?

More information: www.computerweekly.com/naoreport

Lessons from history

The National Audit Office has produced many reports into large, expensive government IT project failures, and its work has informed a number of reports and best-practice guidelines that are used today to improve the delivery of government projects.

Its report on the National Programme for IT in the NHS does not link the lessons learnt from earlier inquiries to what is happening on the NPfIT. Examples include:

● A recommendation on the importance of aligning working practices to new systems. This was a central lesson of a report on the failure of Libra, a project to provide a national case management system to magistrates courts. No mention is made in the audit office’s NPfIT report of the fact that contracts were signed, and software written, without agreement among clinicians over what changes doctors and nurses would need to make to their working practices to make best use of the new systems.

● A report of the Cabinet Office in 2000 which recommended that a project or programme’s senior responsible owner “should remain in place throughout, or change only when a distinct phase of benefit delivery has been completed”. The report on the NPfIT lists the changes of senior responsible owners but does not say this conflicts with good practice as set out by the Cabinet Office. Nor does the audit office report call on the NHS to settle on one senior responsible owner from now on.

● The National Audit Office’s report mentioned that local procurements have represented poor value for money – but it made no mention of the failure of national systems such as Read Codes 3 and Hospital Information Support Systems, both of which the audit office investigated in the 1990s.
