The need to test packaged applications on virtualised systems could hold up the deployment of server virtualisation software, according to attendees at Gartner's datacentre conference in Las Vegas this week.
Gartner analysts and IT managers at the conference said the issue could strain relationships between users and their software suppliers if suppliers proved reluctant to troubleshoot their applications on servers running virtualisation software, which is expected to play a big role in the next few years in raising CPU usage rates on x86-based servers.
Tony Fernandes, vice-president of technology infrastructure at Inventure Solutions, the internal IT arm of Vancouver City Savings Credit Union, plans to begin testing Microsoft's Virtual Server software next year.
He said he expected to have to train his staff to perform some application troubleshooting tasks, and thought the process would prove whether his suppliers lived up to their partnering claims. "Partner is this great word, but how many spell it correctly?" he asked.
Users have two strategies for overcoming supplier resistance to testing. One involves training internal IT staffers to do the necessary work on virtualised servers. The other is to threaten to take their business elsewhere. Fernandes said that, if necessary, he could find an alternative supplier for 95% of the applications at his company.
Conference attendees said the software most likely to need troubleshooting would come from point-solution suppliers that developed specialised applications, often for specific vertical industries.
But many such suppliers are relatively small and lack the funding and expertise to test their software in virtualised environments, according to William Miller, manager of computing services at medical equipment maker Roche Diagnostics. He plans to conduct his own tests of third-party software on virtualised servers and then seek help from suppliers if problems emerge.
Application support by third-party suppliers is "the main issue" in adopting virtualisation technology, according to Luis Franco, vice-president of technology at Banesco Bank. Some suppliers "don't want to assure the quality of their applications" on virtualised servers, he said, adding that his bank needed to increase the skills of its own personnel as a result.
But in the long run, application suppliers may have little choice other than to make the adoption of server virtualisation software as easy as possible for users.
Gartner analyst Tom Bittman predicted that the average CPU utilisation rate on two-way Wintel servers would increase to 40% by 2008, up from about 25% now. He said the rise would be partly driven by an increase in virtualisation products, including Microsoft's Virtual Server.
Bittman said some users believed that x86-based servers were so cheap there was no point in buying virtualisation software for them, but argued that users might be spending more on x86-based servers as a whole than on mainframes or Unix systems.
He also pointed out that low-end servers generated a lot of heat because of their increasing CPU power and density, contributing to cooling problems in many datacentres.
Dave Mahaffey, technical systems administrator at the Santa Clara Valley Water District, agreed. "We've got a server for every application and it's getting out of hand," he said, adding that he wanted to consolidate his servers and increase CPU usage rates from a current 15-20% to as much as 50%.
Patrick Thibodeau writes for Computerworld