Utility computing is on its way, but not until 2006, says Rakesh Kumar.
During the next six years, stricter financial controls and tougher end-user requirements will mean IT departments must find innovative ways of procuring new technology, balancing just enough infrastructure with appropriate payment methods.
The IT department must balance what users need, and how they should be charged for it, against what it delivers to them and how it pays suppliers.
Utility-based computing enables the IT department to behave like a utility middleman to users by procuring technology in a fluid manner, aggregating and packaging the services, and charging on a usage basis.
Delivering on this utility and grid data centre vision is the cornerstone of the server strategies for major hardware suppliers such as IBM, Sun, Hewlett-Packard and Dell.
However, while utility computing models provide a compelling vision, they will not become real until 2006 and beyond.
Analyst firm Meta Group believes fundamental weaknesses, such as forward pricing, software pricing and price/performance improvements, will remain for the next two or three years.
The organisational processes that will enable users to handle the models, such as granular accounting and reporting, project cost allocation and operational processes, are not yet in place.
Users could find they have inappropriate contracts, paying too much for the wrong technology and not being able to account for all costs.
Initial evidence suggests that more than 80% of global organisations, even those with sophisticated accounting tools, will struggle to move from an annual, project-based IT budget model to a cost apportionment model. The fluid, utility-driven environment brings irregular cash flow, granular payment schedules and complex internal cross-charging.
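The gap between the two charging models can be sketched in a few lines of Python. All department names, rates and figures below are hypothetical, chosen purely to illustrate why metered, usage-based cross-charging produces more variable numbers than a flat annual split:

```python
# Illustrative sketch only: comparing a flat annual budget split with a
# usage-based cost apportionment model. All figures are hypothetical.

annual_it_budget = 1_200_000  # total yearly IT spend, in pounds

# Flat annual model: each department is charged an equal share,
# regardless of what it actually consumes.
departments = ["sales", "finance", "manufacturing"]
flat_charge = {d: annual_it_budget / len(departments) for d in departments}

# Utility model: charges follow metered consumption (here, CPU-hours),
# so the amounts vary with usage rather than being fixed up front.
rate_per_cpu_hour = 0.12
usage_cpu_hours = {"sales": 180_000, "finance": 60_000, "manufacturing": 420_000}
usage_charge = {d: hours * rate_per_cpu_hour for d, hours in usage_cpu_hours.items()}

for d in departments:
    print(f"{d}: flat {flat_charge[d]:,.0f} vs usage-based {usage_charge[d]:,.0f}")
```

Under the flat model every department pays the same; under apportionment a heavy consumer such as the hypothetical manufacturing unit pays far more than a light one, which is exactly the kind of internal cross-charging many finance processes are not yet set up to handle.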
Because utility computing models will not arrive until 2006 or later, suppliers will attempt to repackage existing products, such as partial outsourcing and buy-lease models, as utility packages.
Users will be promised instant power, seamless integration and access to new technologies. Storage and CPU/server products and fully loaded data centres will be bundled with business process design and application integration services, giving an oversimplified view of what is achievable. Users will have to evaluate the financial return on such contracts and the impact on internal processes.
Fundamental technical issues in the utility computing model will be camouflaged, particularly integration with existing systems, virtualisation, application integration and resource management.
IT architecture and infrastructure groups must evaluate the technical limitations and benefits of the models on offer, and operations groups must factor in the cost of process redesign to gain benefits.
Through 2006 more than 60% of IT departments will adopt a cautious approach to utility computing. The remainder will embark on ambitious programmes, resulting in expensive undelivered projects.
What do you think?
Will your IT department make the move to utility computing?
Rakesh Kumar is vice-president at Meta Group