In February 2002, a secret Gateway review of a huge IT project at the Criminal Records Bureau found all was not well - a month before systems were due to go live.
The supplier Capita had won its largest ever contract, worth £400m over 10 years, to develop and run systems for the bureau. As a result of a "Gateway 4" review in July 2001, the bureau had deferred the go-live date to allow more time for testing.
But a confidential Gateway 4a review a month before the rescheduled go-live date raised several concerns:
- The full end-to-end assembly of the IT production environment would be put in place for the first time just days before going live. This entailed a "substantial risk"
- There was a need to retain key staff for development and test work after the launch
- Training on the live system for staff at Capita and the bureau would be needed after the go-live date
- Progress was still needed on outstanding legislation
- There was a lack of contingency.
Even so, the review accepted that there was "now no turning back" and recognised that, on balance, the rescheduled March 2002 operational launch would go ahead, given the "confusion and bad publicity that would result from delay".
There was political pressure to go live. But the final decision lay with the civil servant who ran the bureau, taken on the recommendation of supplier Capita and after consultation with the team that had conducted the Gateway reviews.
It was decided to go live, a decision also supported by consultancy PA Consulting. Once operational, the systems failed to meet expectations for a combination of reasons. Backlogs of work built up, and some people were recruited to work with children without the bureau having checked their backgrounds.
None of this detail was known until public spending watchdog the National Audit Office published a report last month on the bureau's difficulties.
Computer Weekly believes that Gateway reviews should be published to strengthen accountability and help improve the success rate of IT-related projects in government. The publication is also campaigning for US-style legislation aimed at central departments which would further improve accountability on major projects. Last week's Computer Weekly explained in detail why legislation was needed and how the US Clinger-Cohen Act worked. This week the focus is on Gateway reviews.
What are Gateway reviews?
The Gateway process is the brainchild of the much-respected Peter Gershon, head of the Treasury's Office of Government Commerce. The main purpose of Gateway reviews is to identify high-risk IT-related projects that are heading for failure before it is too late. The process has since been extended to cover large-scale construction schemes.
The Gateway process involves six Gateway reviews in the lifecycle of the project:
Gateway zero: strategic assessment
Gateway one: business justification
Gateway two: procurement strategy
Gateway three: investment decision
Gateway four: readiness for service
Gateway five: benefits evaluation.
There may also be interim reviews and repeated reviews. For example, Gateway zero may be repeated to assess the project strategically at various stages.
Projects move to the next stage only when they are properly prepared and when it has been established that time, costs and risks are being properly controlled.
Departments and government agencies must notify the OGC of projects that are assessed as high-risk. These projects will be reviewed at each stage by an independent team of three to five people appointed by the OGC. Typically the reviewers are civil servants who have a record of success in IT-related projects.
Many reviews take only about three days each - and the report is ready about a week later. Often the projects studied are worth £100m or more.
In projects assessed as medium-risk, the OGC will appoint only a review team leader. For low-risk projects, the spending body will appoint its own review team.
Review reports are confidential and are addressed only to the project's senior responsible officer (SRO) in the spending agency. Only two copies of the report are routinely made: one for the SRO and the other for the Gateway team to extract generic lessons. The SRO, who is often the business head of a department, is under no duty to show the report to anyone; even the IT director and computer staff have no automatic right to see it.
The report will have a summary conclusion with the status of the project, a list of recommendations at the beginning and a note of interviewees and their roles.
An overall "traffic light" status for the project must be given:
Red: To achieve success the project should take remedial action immediately. It means "fix the key problems fast", not "stop the project"
Amber: The project should go forward with actions on recommendations to be carried out before the next review
Green: The project is on target to succeed but may benefit from the uptake of the Gateway's recommendations.
Has the Gateway review process been a success?
Three months ago, the National Audit Office said, "Our case studies have emphasised the value of Office of Government Commerce Gateway reviews as a means of providing assurance at key stages to help keep projects on course." But large-scale IT-related projects continue to fail to meet expectations.
It is difficult to assess the effectiveness of reviews in general because they are confidential and the OGC encourages departments to keep them secret.
The OGC's website gives departments details of legal devices that can be used to reject requests for reviews to be published. In practice, an SRO is likely to share the outcome of a Gateway review internally if it is positive; if it is negative, it may be shared with fewer people.
Richard Barrington, former director at the Office of the E-Envoy, said Gateway reviews should be published at least as synopses. He said that studies undertaken while he was working in the e-envoy's office showed that IT-related projects usually went seriously wrong for non-technical reasons such as inadequate re-engineering of business processes. It was rare for a project to fail because of any serious mistakes by IT directors and their teams, he said. This could be demonstrated if synopses of Gateway reviews were published.
Gateway reviews cannot stop a doomed project; they can only draw the attention of the SRO to weaknesses. A department can go through red and amber lights by saying it is addressing the project's flaws. Senior executives in departments can shape or mould the outcome of a review according to what they tell the reviewers.
MP Richard Bacon, a member of the Public Accounts Committee, said in a debate in the House of Commons last month that he hoped the OGC would "seriously consider publishing Gateway reviews; perhaps the Treasury will give it a nudge in that direction". He added, "The central point is this. If these matters were in the public domain and MPs, members of the public and journalists could read about them, the department might be helped to conclude that it should take a slightly more robust view, and perhaps occasionally stop projects in their tracks."
The House of Commons work and pensions subcommittee is considering Computer Weekly's call for Gateway reviews to be published.