Profiling tools can drastically reduce the need to clean data errors manually.
Pretty much every IT manager will support the notion that business data is valuable only if it is accurate and timely. It is also clear that the success of high-profile enterprise-wide initiatives such as customer relationship management, business intelligence and supply chain management depends on good-quality, well-integrated source data that is fit for purpose.
However, failure rates continue unabated. According to The Data Warehousing Institute, data quality problems cost US businesses £338bn a year. Analyst firm Gartner has predicted that through 2005, more than 50% of CRM deployments will suffer limited acceptance, if not outright failure, because of a lack of attention to data quality issues.
Will management, budget holders and users continue to tolerate this? A number of leading organisations have found a way to tackle the problem.
Abbey National confidently expects that its approach to data quality will save £15.2m over three years. The Ministry of Defence's Department of Logistics quotes £20m. The Carphone Warehouse expects to recoup its data quality investment in under 12 months. All these firms expect to enhance the benefits from data-dependent business applications, and to realise them more swiftly and at lower risk. And all are reporting that their IT teams are experiencing fewer problems. How are they doing it?
Most organisations manage data quality at a tactical level, typically by department. When data from across departments and systems needs to be brought together, for example in a CRM project, teams hurriedly attempt to find ways to integrate it. On discovering that the data from the various sources is of different formats, standards and quality, the exercise turns out to be far more complicated and risky than anyone had planned.
The resulting "integrated" data often leaves much to be desired in terms of integrity. User acceptance, business effectiveness and customer goodwill suffer, and the return on investment of the strategic new application is severely damaged.
Abbey, the MoD and the Carphone Warehouse are different because senior business and IT management have taken the approach that IT teams will tackle data quality enterprise-wide as a strategic, rather than tactical, project.
Using the latest breed of powerful and scalable data profiling tools, IT teams in any large organisation can quickly understand data structures and identify issues they did not even know to look for. Data profiling tools can locate missing data, errors, duplicates and inconsistencies, and can drill down into the data itself to hunt out trouble spots. Profiling tools can also create rules for improving and supplementing data from external sources. The best data profiling and data quality tools are integrated to exchange information, and can cut manual effort by as much as 90%.
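To make the idea concrete, the checks a profiling tool runs can be sketched in a few lines of code. This is only an illustrative toy, not how any commercial product works: the dataset, field names and format rules below are invented assumptions, and the simplified patterns stand in for the far richer rule libraries real tools apply at scale.

```python
# Minimal sketch of a data profiling pass: count missing values,
# flag duplicate keys and spot format inconsistencies.
# All records and rules here are illustrative assumptions.
import re
from collections import Counter

records = [
    {"customer_id": "C001", "postcode": "SW1A 1AA", "phone": "020 7946 0000"},
    {"customer_id": "C002", "postcode": "sw1a1aa",  "phone": ""},
    {"customer_id": "C002", "postcode": "EC1A 1BB", "phone": "020 7946 0123"},
    {"customer_id": "C004", "postcode": None,       "phone": "not known"},
]

def profile(rows, key_field, patterns):
    """Report missing values, duplicate keys and format violations."""
    report = {"missing": Counter(), "bad_format": Counter()}
    keys = Counter(r[key_field] for r in rows)
    report["duplicate_keys"] = [k for k, n in keys.items() if n > 1]
    for row in rows:
        for field, pattern in patterns.items():
            value = row.get(field)
            if not value:
                report["missing"][field] += 1
            elif not re.fullmatch(pattern, value):
                report["bad_format"][field] += 1
    return report

# Deliberately simplified UK postcode and phone formats.
rules = {
    "postcode": r"[A-Z]{1,2}\d[A-Z\d]? \d[A-Z]{2}",
    "phone": r"0\d{2,4} ?\d{4} ?\d{4}",
}
result = profile(records, "customer_id", rules)
print(result)
```

Run against the toy data, this reports the duplicated key C002, one missing postcode and one missing phone, and one badly formatted value in each field. A real profiling exercise differs mainly in scale and in the richness of the rules, not in the underlying idea.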
Approaching data quality product suppliers with a test project to identify issues across a set of complex and large data sources will almost certainly uncover a few nasty surprises, but these are better known in advance.
One company I worked with thought its data was at least acceptable. A test quickly revealed that it had tens of thousands fewer customers than it was reporting, along with integrity problems in the data for customers that did exist. Luckily, it had not yet relied on that data to support an expensive CRM investment.
Ed Wrazen is vice-president of operations EMEA at data quality specialist Trillium Software
This was first published in May 2004