NHS IT project is dead, but why do large IT projects fail? Part 4

Following the unsurprising news that the NHS National Project for IT has been dropped I thought it apt to blog some of the views I have recently had provided to me for an unrelated feature I am working on. The feature, which will appear in two parts on Computerweekly.com soon, asks the question: Why do large IT projects fail?

I started with the comments made by Brian Randell. Randell is a professor in the School of Computing Science at Newcastle University. He was a member of the group of academics who became concerned about the UK National Health Service’s National Programme for Information Technology (NPfIT).

Then part two came from Anthony Finkelstein. He is professor of software systems engineering at University College London (UCL) and dean of UCL Engineering. Active in industry consulting, he is a fellow of the Institution of Engineering and Technology (IET) and the British Computer Society (BCS).

Part three was from Yann L’Huillier, group CIO at financial services giant Compagnie Financiere Tradition, who has also headed up IT at several of the world’s top stock exchanges.

Today in Part 4, James Martin, the former IT COO Europe at investment bank Lehman Brothers, gives us his views.

James is now managing director at start-up Firmrater. Over the past 17 years he has worked for several retail and investment banks, often in the IT COO role. He has been involved in hundreds of large IT projects, including three global Year 2000 programmes.

He says: “In my experience the key reasons for IT project failure have been consistent across firms and around the world. My top five pitfalls are: a lack of robust business requirements at the outset, leading to unrealistic IT project budgets and timescales; business sponsorship and participation starting off strong and then tailing off, leaving the IT project drifting; red herring stakeholders frustrating a project by raising numerous side issues and minor concerns; the world outside moving on, forcing a project to be redefined during its course so that it never really ends, it just runs out of steam; and the administrative burden imposed on the IT team eating more resource than technical development work.

“The large projects which have been most successful have tended to be externally visible to customers, regulators, the public and the media. I’ve also seen ‘best practice’ lead to ‘worst result’ projects far too often, and I believe that’s the root cause of the problem: process is given greater emphasis than outcome, and that’s not going to get a project over the line.”

Join the conversation


What is perhaps most baffling about all of the hindsight over NPfIT is that ministers and civil servants did not exercise more due diligence BEFORE committing our money.

While tools which measure code quality, measure performance against budget and align IT supplier contracts are not the sexiest of purchases, they can prevent project creep, missed deadlines and expensive contractual disputes.

The era of the UK Government sponsoring the world's largest civilian IT projects may be over, but the technology to stop mini NPfITs, which may add up to even more wasted tax revenue, remains available. It should be mandated for all government contracts.

The simple fact is that the bigger something is, the longer it takes to build. Even if it gets built, it is then more difficult to evolve and less agile to change. We know that technology does not stand still for long, so I can't believe anyone truly believed that a programme of this scale would ever deliver. There are economies of scale that can deliver efficiencies and benefits, but there is a point at which organisations and solutions go beyond the economies of scale and hit diseconomies of scale, due to the loss of manageability.

Our strategy at UNIT4 is to continue to support our NHS customers to grow "local" shared services and collaborations working in local health economies, while at the same time helping new hubs to become established, underpinned by leading-edge technology that embraces continued development and change. Local collaborations have over time proven to deliver benefits while staying in touch with, and understanding, local economies, priorities and demands. We are now working with a number of customers who are developing "hubs" for such service provision across the country.

Additionally, our licence models enable organisations to retain the agility to move in or out of shared services, or indeed from one service to another, encouraging a competitive environment in which good quality, efficient and cost-effective services can continue to develop. National agendas often stifle the development and innovation needed to drive such efficiency.

Steve Haines

Health & Emergency Services Sector Manager

UNIT4 Business Software Limited

Steve from UNIT4 seems to be talking along much the same lines as the BCS recommended in its 2006 report "The Way Forward for NHS Health Informatics".


There has never been a lack of understanding in the industry of the problems associated with large IT projects, especially in the public sector, and there were plenty of sceptical voices over the grand NHS IT project right from the start.

It's just that politicians don't want to hear that their grand pet project is destined for disaster, and civil servants don't want to acknowledge this either, while the bloated multinationals who dominate government IT are hardly going to discourage their customers from spending lots of taxpayers' money on ludicrously expensive "Big Bang" white elephants, are they?