Thirty-three per cent of organisations are unable to calculate the financial cost of their IT systems, according to a report from research group Vanson Bourne.
The survey of 113 senior business executives and 170 IT professionals was commissioned by business service management firm Managed Objects. It found that 53% of respondents lacked accurate cost data, and 48% were unable to allocate costs by service.
Some 35% of business managers said their current method of measuring the cost of IT applications was not very accurate.
Although 55% of IT managers said they did a sufficiently effective job of controlling costs, 53% of business managers said that IT did a somewhat effective job but could use improvement. Only 7% of the overall sample agreed that IT did an extremely effective job of controlling IT costs.
Sixty per cent of respondents said that more granular measurement of IT costs would improve decision making on IT cost management, and 54% said it would better align IT spend with corporate goals.
The problem with measuring IT costs becomes apparent when users try to cost any form of shared service, web service or even a shared IP-based network.
Will Cappelli, research vice-president at Gartner, said, "There is a huge gap between the way IT is bought and consumed by organisations."
In Cappelli's experience, the financial model used to measure return on investment has been too simplistic: businesses have generally focused on the cost of the IT asset without linking it to any improvement in business performance. As a result, he said, users have found it difficult to measure the cost benefit of buying additional functionality.
Sean Larner, managing director, Europe, at Managed Objects, said that given the industry average cost of a server is £40,000, "Businesses want a means to measure return on investment."