Without greater software IQ, analysts' rhetoric about real-time business intelligence is far in advance of software's ability to deliver, says Andy Hayler.
The idea of "real-time" business intelligence sounds appealing: as soon as someone in Brazil types in a new sales order, the business intelligence system in central office knows and reacts immediately.
Those who have worked in large corporations will be entertained by the naivety of this, since most large companies would be grateful just to know which are their most profitable global accounts.
The mismatch between fantasy and reality is driven by two factors. The first is that business rules and structures, such as general ledgers, product classification, asset hierarchies, and so on, are not in fact uniform, but are spread out among disparate transaction system implementations.
One survey found that the average Global 2000 company has 38 different sources of product master data alone. And this is after all the money spent on enterprise resource planning.
The second problem is that the landscape of business structures is in constant flux, as groups reorganise, subsidiaries are sold or new companies acquired.
Today's business intelligence and datawarehouse products try to sweep this under the carpet, providing tools that convert the source data into a lowest-common-denominator consistent set for loading into a central datawarehouse. The simplification is understandable, but it means local variations are lost and many types of analysis become impossible. Worse, if the business structures change in the source systems, the datawarehouses and reports built on top of them are undermined, and restructuring the datawarehouse can take months.
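As a minimal sketch of the problem (all product IDs, names and categories below are invented for illustration): two regional systems classify the same kinds of product in locally rich hierarchies, but the central warehouse keeps only the coarse level every source can agree on, so the local detail is gone.

```python
# Hypothetical regional product classifications (invented data)
brazil_source = {"BR-1": "Beverages/Soft Drinks/Guarana"}
uk_source = {"UK-1": "Beverages/Juices/Orange"}

def to_common(category: str) -> str:
    # Lowest common denominator: keep only the top-level category
    return category.split("/")[0]

warehouse = {}
for source in (brazil_source, uk_source):
    for product_id, category in source.items():
        warehouse[product_id] = to_common(category)

print(warehouse)
# Both products now read simply "Beverages": an analysis of soft drinks
# versus juices is no longer possible from the warehouse alone, because
# that distinction survives only in the source systems.
```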
What is needed, and what the industry has generally failed to deliver, is software that is comfortable dealing with business change: "smarter" software.
Today few IT systems can cope with a change in the structure of incoming data without significant reworking. The reason lies in the way databases are designed: they are usually implemented to reflect how the business is structured today, with relatively little regard for future change. Introductory courses on data modelling show "department" and "employee" with a "one-many" relationship between them: a department can have many employees, but an employee can be in only one department.
This is typical of the way data models are built, yet even this basic model is flawed: in real organisations people are seconded between departments, work across several at once, and see their departments merge and split. Of course it is hard to cater for future, and hence largely unknown, change, but without greater "software IQ" we will be forever patching our systems and discovering with each upgrade what a costly process that is.
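The contrast can be sketched in a few lines of SQL (all table, column and data names here are invented for illustration, not drawn from any particular product). Hard-wiring the department into the employee record bakes today's structure into the schema; recording membership as a dated fact in its own table makes a reorganisation an ordinary insert rather than schema surgery.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Rigid "textbook" design: the department is a column of the employee,
# so an employee can never belong to two departments, and a
# reorganisation means rewriting rows (and often the schema) in place.
cur.execute("CREATE TABLE employee_rigid (name TEXT, department TEXT)")

# More change-tolerant design: membership is a dated relationship,
# so moves, secondments and reorganisations just add rows.
cur.execute("CREATE TABLE employee (id INTEGER PRIMARY KEY, name TEXT)")
cur.execute("CREATE TABLE department (id INTEGER PRIMARY KEY, name TEXT)")
cur.execute("""CREATE TABLE assignment (
    employee_id INTEGER, department_id INTEGER,
    valid_from TEXT, valid_to TEXT)""")

cur.execute("INSERT INTO employee VALUES (1, 'Ana')")
cur.execute("INSERT INTO department VALUES (1, 'Sales'), (2, 'Marketing')")
# Ana moves from Sales to Marketing: history is kept, nothing rewritten.
cur.execute("INSERT INTO assignment VALUES (1, 1, '2004-01-01', '2004-06-30')")
cur.execute("INSERT INTO assignment VALUES (1, 2, '2004-07-01', NULL)")

rows = cur.execute("""SELECT d.name FROM assignment a
    JOIN department d ON d.id = a.department_id
    WHERE a.employee_id = 1 AND a.valid_to IS NULL""").fetchall()
print(rows)  # Ana's current department(s)
```

The second design costs a join, but a merger of two departments, or an employee serving two at once, no longer requires any change to the schema itself.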
Some techniques in software are emerging that tackle the problem in a more future-oriented way, but these are the exception. Unfortunately, the supplier community finds it easier to sell appealing dreams than to build software to deliver them.
Back in reality, where it takes months to reflect a reorganisation in the IT systems, and many months more to upgrade a core ERP system to a new version, real-time business intelligence remains a pipe dream.
Andy Hayler is chief strategist at datawarehousing specialist Kalido