News Analysis

VMworld 2010: Elephants in the server room – challenges to virtualisation

Ian Grant

To hear the sales guys talk, one would believe there is no limit to the cloud's applicability to information processing. There is certainly no limit to their ambition, but VMworld 2010 in Copenhagen revealed some areas that have been glossed over.

A big one is the legacy issue. Virtualisation, at least in the VMware world, depends on systems running on x86 technology. Yet thousands of applications that were never written for Wintel or Linux platforms are still in daily use in banks, manufacturing companies, retailers, utilities and scientific research organisations.

So far, there is no way to lift such applications and re-site them in a virtualised environment, and there appear to be no plans to provide one. David Wright, VMware's head of technology for EMEA, says that unless an application runs on x86 technology, there is not much VMware can do.

Of course, these ageing systems have their own lifecycle. Even though many are mission-critical, people with the skills needed to maintain them, such as Cobol and Fortran programming, are retiring, dying or leaving the industry.

Firms that have stuck with their legacy systems because they are mission-critical face a dilemma: should they redevelop the application for x86 platforms, with all the business risks that systems development entails, or should they persevere, training young people in old technologies, and risking system collapse because of the mechanical failure of an irreplaceable hardware component?

Clearly the answer must be to redevelop the system, but to do so in the light of the best available modern practices and opportunities. Of course, accounts must still balance, the data must be secure, and garbage in still means garbage out.

There are plenty of new high-level programming tools that make systems development quicker and more secure. There are also plenty of tools that will check code for logical consistency and for security threats such as buffer overflows. And there are tools that let companies keep their discretion about where to site the processing and the data: in-house, in a private or public cloud, or some combination of all three.
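To take a hedged example of the sort of defect such code-checking tools catch (the snippet below is hypothetical, not drawn from any product mentioned here), the classic buffer overflow in C arises when a fixed-size buffer is filled without a bounds check:

    #include <stdio.h>
    #include <string.h>

    /* Hypothetical fragment: the classic defect that static
       analysis tools are designed to flag. */
    static void copy_name_unsafe(const char *input)
    {
        char name[16];
        strcpy(name, input);   /* no bounds check: any input longer than
                                  15 characters overflows 'name' */
        printf("unsafe: %s\n", name);
    }

    static void copy_name_safe(const char *input)
    {
        char name[16];
        snprintf(name, sizeof name, "%s", input);  /* bounded copy:
                                  truncates instead of overflowing */
        printf("safe:   %s\n", name);
    }

    int main(void)
    {
        copy_name_safe("a deliberately over-long input string");
        /* calling copy_name_unsafe() with the same input
           would corrupt the stack */
        return 0;
    }

Most modern analysers will typically flag the unchecked copy at compile time, long before the code reaches production.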

Many firms with legacy systems will have gone through the decision tree when client-server systems first appeared. What has changed since then is the commoditisation of hardware and the availability of robust cheap or free applications from the open source community. Frequently these are more than adequate for general purpose office work such as documents and spreadsheets, and often for mission-critical systems such as website content management.

But lower costs lead to "server sprawl", inefficient purchasing, administrative complexity and lower returns on investment. Virtualisation aims to reduce that.

Cost savings

The headline advantage of the virtualised environment is the immediate cost saving in capital expenditure, says Mark Newton, VMware's head of marketing for EMEA. Typical savings are 80% on hardware and power usage, and 30% to 50% on software.

But there is another opportunity to revamp the way a company does IT. Traditionally, a user had to make a business case for a system. Then the company procured it, and the cost was absorbed or charged back to the user.

According to Newton, only around a quarter of the firms he knows are disciplined enough to use charge-backs. There are two reasons to do so: one is to alert users to the true cost of their information processing; the other is to facilitate a pay-as-you-go system that ultimately lets users choose where they want to process their information.
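As a minimal sketch of how such a pay-as-you-go charge-back might be computed (the rates and usage figures here are entirely hypothetical, not VMware's or anyone else's), the arithmetic is simply metered usage multiplied by internal unit rates:

    #include <stdio.h>

    /* Hypothetical charge-back calculation: rates and usage
       figures are illustrative only. */
    int main(void)
    {
        const double cpu_rate_per_hour   = 0.05;  /* assumed internal rate */
        const double storage_rate_per_gb = 0.10;  /* assumed, per month */

        double cpu_hours  = 1200.0;  /* a department's metered monthly usage */
        double storage_gb = 500.0;

        double charge = cpu_hours * cpu_rate_per_hour
                      + storage_gb * storage_rate_per_gb;

        printf("Monthly charge-back: %.2f\n", charge);  /* prints 110.00 */
        return 0;
    }

The hard part, as Newton's remarks suggest, is not the arithmetic but setting unit rates that recover the full in-house cost.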

Market prices can be misleading. Amazon and Google make their money from retail sales and serving advertisements respectively. The prices they quote for storage and processing are more likely to reflect the marginal cost of the extra processing clients bring than full cost recovery.

This means business managers may not properly account for all the costs of security, administration and infrastructure an in-house IT system must recover. "But then life isn't fair," Newton says.

The important point is that education is essential on both sides, and it is as much about what data is vital to the organisation and how to maximise its value as it is about the cost of processing it.

Procurement

Another area for CIOs to look at is their infrastructure procurement processes, which need adjusting to take advantage of the increasing number of options that virtualisation and cloud computing offer. "We are now in a position to deliver against that increased elasticity," Newton says.

Whereas the make-or-buy decision used to be fairly clear-cut, the options now include renting, and even free software. Each has its own risk-reward profile that needs to be evaluated, adding to the complexity of the CIO's job.

As a result, the role of the CIO is changing.

In Wright's view, the IT function has three levels of responsibility. The first is to provide the hardware, software and communications infrastructure using a combination of purchased and rented resources. The second is to manage the applications, systems development and interfaces to user devices. The third is to support the user devices themselves.

Some would regard the first two layers as a return to the good old days when users were kept outside the computer room. What is different now is that the CIO is more like a maître d'hôtel, providing menus of applications and devices to delight his discerning patrons, and hiding the hot, steamy work in the kitchen.

Whether that is a promotion in the ranks remains to be seen.

