The British Army’s data quality and governance programmes have improved cost control and manpower planning, and laid the basis for better business intelligence.
The Army’s experience of solving its data quality problems since the introduction of the Joint Personnel Administration (JPA) computer system in 2007 provides lessons for other large, non-military organisations.
“First, be prepared to recognise your problem,” said Col. Giles Baxter, assistant director, Manpower Systems, Directorate of Manning at British Army Headquarters Land Forces. “Second, the senior management that is most affected by the problem has to own it. An enterprise problem is not just an IT department problem. Third, solving [a data quality problem] will take resources, but you will recoup the investment many times over very quickly. And fourthly, while there are others out there who can help, they can’t do it for you.”
He contended that “any organisation that is dependent on having the right number and type of people with the right skills at the right place at the right time has a similar problem. The difference with the Army is that we can only recruit from the bottom -- we don’t recruit colonels from civilian life.”
Last year, the Army Personnel Data Management Organisation (APDMO), which today has 10 civil service members of staff, won the Data Governance Best Practice Award, given by The Data Administration Newsletter.
The Army’s data quality improvement programme began in earnest in December 2008 with a data management and governance programme headed up by Detica, a consultancy firm. Detica helped create the APDMO to provide an enduring focus on data quality. The consultancy, said Baxter, also got the Army to “look at its data properly,” something it had never before done. A project using Detica’s DQ Insight tool revealed much of the data to be poor, he said.
The Army began its efforts to solve data quality problems with two activities: designing the APDMO and formalising the process for managing data quality issues, including the appointment of “data champions” from each affected business area. A team of data analysts investigated the scale, complexity, root causes and potential solutions to data quality problems, and a Remedial Actions Programme (RAP) cleansed high-priority data elements.
Root cause of data quality problems
The Armed Forces have about 170,000 full-time servicemen, of whom about 100,000 are Army, in addition to over 30,000 part-time reservists. They include hundreds of thousands of pensioners, 50 different nationalities -- foreign and commonwealth as well as Irish -- 17 ranks from private to general, 250 trades and complicated career models. Naming conventions are complex, with a private soldier potentially being called a private, or a guardsman, gunner, sapper, driver, and so on. And the Army is spread across the world, often in areas inaccessible to the Internet or mobile phones.
But the essential problem for manpower planning is that Army manpower is akin to a control system with long lag times, Baxter said. “We are still living with decisions made five, 10 and 20 years ago.”
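That lag follows from the point Baxter makes elsewhere: the Army recruits only from the bottom, so today’s intake fixes the pool from which tomorrow’s senior ranks must be drawn. A minimal sketch of the effect (the ranks, timings and intake figures below are illustrative assumptions, not the Army’s actual career models):

```python
# Illustrative promotion pipeline: a recruiting decision takes years to
# reach the senior ranks, so current strength reflects old decisions.
# All figures and timings are invented for illustration.

YEARS_TO_RANK = {"private": 0, "sergeant": 8, "major": 16, "colonel": 22}

def eligible_pool(intake_by_year, rank, current_year):
    """Recruits who could hold `rank` this year must have joined at
    least YEARS_TO_RANK[rank] years ago."""
    cutoff = current_year - YEARS_TO_RANK[rank]
    return sum(n for year, n in intake_by_year.items() if year <= cutoff)

# A recruiting dip in 2005 only shows up in the colonel-eligible pool
# 22 years later -- the "long lag time" in the control system.
intake = {2004: 10_000, 2005: 6_000, 2006: 10_000}
pool_2026 = eligible_pool(intake, "colonel", 2026)  # 2004 intake only
pool_2027 = eligible_pool(intake, "colonel", 2027)  # 2004 + 2005 intakes
```

A shortfall recruited today cannot be patched laterally later, which is why, as Baxter notes, “we don’t recruit colonels from civilian life.”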
Baxter’s role over the past six years has been to manage changes to that system, at the heart of which lies the data. He reports to Brigadier Richard Nugee, director of Manning, who, in turn, reports to the adjutant general, Lt. Gen. Mark Mans, the member of the Army Board responsible for personnel. The adjutant general is also responsible for personnel budget, as well as human resources. “As a result of all the work we have done on data, it makes sense that the person who is responsible for policy and plans is also in charge of the money,” Baxter said.
Nugee and Baxter have sought to explode what they see as “myths” about the Army. One of these is that “the Army is a simple system, and so the data in the context of our management processes could be assumed to be simple. Not true,” Baxter said. Moreover, manpower accounts for a higher proportion of costs in the Army (74%) than in the Navy and Air Force; across Defence as a whole the figure is 27%. And while the RAF and the Royal Navy are configured around equipment, the Army is configured around manpower.
“Boots on the ground is our business,” said Baxter.
The second myth is if you tell people what to do, they do it. “It’s not like that – the organisation is too myriad,” he added. “We achieve by all functions and departments working together. The battlefield model does not translate simply elsewhere; for instance, the quality of decision is often less important there than is the speed with which the decision is made.” The third is that fixing data is just a matter of fixing the computer system: “Technology was just a part.”
Data quality problems triggered by new system
In 2006-07, the introduction of the tri-service JPA disclosed the data quality problems. The prior system, Unicom, “was a data store with none of the sophistication of the Oracle PeopleSoft that EDS implemented in the JPA,” Baxter said.
The JPA was built around the process layer, not the data layer, he said. And so “there was no single table of data about a soldier. The bit that we collectively forgot is that the data in JPA drives other strategic planning processes, and manpower planning is one of them.
“We had very little reliable data to even say how many people were in the Army,” Baxter said.
The key people who suffered were the manpower planners reporting to data customers such as government ministers and the top brass. Indeed, before the data quality and data governance programme took effect, the Army Manning Board “would spend three hours discussing the numbers,” Baxter said. “It is now down to 20 minutes being briefed.”
Recession leads to overstaffing
At the end of 2009, after over a year of recession during which few left and many wanted to join the Army, the service found itself moving rapidly from a position of being thousands undermanned to being over its endorsed limit of 103,000, and still heading upwards. That meant a cost pressure in excess of £100m.
“We realised we had overestimated the number of people who were going to leave because the data about ‘run-outs’ [soldiers coming to the end of their contract] was wrong,” Baxter said.
However, because of the data quality programme, the situation could be addressed. The Army scaled down recruitment in 2010 to compensate, saving £40m -- the entire data quality programme had cost £4m.
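The arithmetic behind that cost pressure can be sketched with a toy model (the figures below are illustrative assumptions, not the Army’s actual planning data): a forecast that overestimates “run-outs” understates end-of-year strength by exactly the error in the leaver data.

```python
# Toy manpower forecast: illustrative figures only, not actual Army data.
# Next year's strength = current strength + recruits - leavers ("run-outs").

def forecast_strength(current, recruits, expected_runouts):
    """Planned end-of-year headcount given an estimate of leavers."""
    return current + recruits - expected_runouts

current = 101_000
recruits = 12_000
expected_runouts = 11_000   # planners' estimate, from the run-out data
actual_runouts = 8_500      # recession: far fewer left than the data said

planned = forecast_strength(current, recruits, expected_runouts)
actual = forecast_strength(current, recruits, actual_runouts)

limit = 103_000
overshoot = actual - limit  # headcount above the endorsed limit
```

With clean run-out data the lever is obvious: the gap can be closed the following year by recruiting fewer people, which is what the Army did in 2010.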
The manpower planning directorate is now, Baxter said, moving from spreadsheets to an Oracle-based data warehouse with a series of tools sitting on top. These include Guildford-based Futura Simulations’ set of forecasting applications. And Baxter reported that the Army has signed a contract with the SAS Institute for a full set of data mining and reporting capabilities. “Why have we done that? Because our data is now good enough to answer the basic questions like, ‘How many people are in the Army?’ ‘How many of them are Royal Engineers?’ ‘How many are reservists?’ We will be able to do so much more.”
In relation to future plans for data management, he said, “the analogy we are using is that of Salisbury Cathedral, which is built on an undercroft. We now have, like the cathedral undercroft, columns and a vaulted roof. When we started we were building columns piecemeal -- the data work, the information systems, the business process changes. So now they have come together. We’ve built a really solid foundation to deliver information that is more fit for purpose for decision making.
“Previously, we simply daren’t have brought together our budgets and manpower planning because we didn’t trust our data. Now the adjutant general can do that, and has done from April 2010 onwards.”
2010 was the first year in which Land Forces did not have to reprogramme its budget because of mistakes in manpower forecasting, he said. A northern European army has been in touch to find out about data quality improvement. “And within the Army our training, logistics and equipment organisations are looking to us. We have trailblazed within the wider HQ. The issue is always who takes ownership.”
The original programme to fix data quality problems was done quickly and aggressively. “Flash to bang was very quick. It was good to do the data quality piece first, ahead of the data governance. I would absolutely recommend that; otherwise it is jam tomorrow.”
The major payoff may well be the quality of the Army’s contribution to the government’s Strategic Defence and Security Review, which plans to reduce the size of the Army so that by 2015 the structure is aligned. “We could not have achieved any of that planning in the timescales required by ministers without the data piece,” Baxter concluded.