Server virtualization may have created ripples in Indian enterprises, but there still remains a definite need to analyze it critically before deployment. As Galen Schreck, the principal analyst of Forrester Research rightly puts it, "Getting the most out of server virtualization requires firms to overcome cultural divides that exist within even the most functional of IT departments."
Adopting server virtualization requires a phase-wise approach: planning the need for virtual servers, choosing the best approach, determining the needs of the host servers, and then managing them. "It depends on the goal you have about where you want to take your current IT infrastructure. Only then can you chart out a plan. It is important that the organization know the architecture and the data center environment," says Saji Thoppil, the general manager of Wipro Infotech's platform practice. Wipro Infotech has been experimenting with server virtualization for the past two years, and it has been in use for five years at the company's outsourced development center.
According to Ravindra Ranade, the head of presales and global consulting for Red Hat India, "While determining the requirements of host servers, one should look at the business applications' utilization, their data requirements, and the applications that demand peak loads." You should segregate these applications and then run a tool to determine the load each one demands. With that data, you can decide how much server virtualization is appropriate and how many host servers the data center needs.
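As a rough illustration of that sizing step (the figures, headroom factor, and function below are assumptions, not any particular vendor's method), the measured peak loads can be summed and divided by the usable capacity of one host:

```python
import math

def hosts_needed(peak_loads_ghz, host_capacity_ghz, headroom=0.25):
    """Estimate how many host servers a set of measured peak loads requires.

    A headroom fraction is held back on each host so that simultaneous
    peaks do not saturate the box (25% is an illustrative default).
    """
    usable = host_capacity_ghz * (1 - headroom)
    return math.ceil(sum(peak_loads_ghz) / usable)

# Twelve applications peaking at 1.2 GHz each, hosts with 16 GHz of raw CPU:
print(hosts_needed([1.2] * 12, 16.0))  # 14.4 GHz over 12 GHz usable -> 2 hosts
```

A real sizing exercise would repeat the same arithmetic for memory, disk and network I/O, and provision for whichever resource demands the most hosts.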
Organizations can always work with consultants for capacity planning to determine the physical server's architecture and utilization, along with the traffic pattern. Zoeb Adenwala, the global CIO of Essel Propack, says, "The hardware should be scalable. If your organization has an enterprise systems management tool, then you can dig out the historical trends from that, and look at determining architecture for the host servers."
Shamrao Vithal Co-operative Bank Limited (SVC) has opted for server virtualization in the process of upgrading its data center. To evaluate and determine the needs of its host servers, SVC ran a tool (from C-DAC) on its servers, which showed the utilization levels of its processors, power and hard disks. For such exercises, there are three automated tools available in the market at the moment. Once you run the tool, its output is uploaded directly into the software. Based on this output, the principal company decides the ideal hardware.
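The placement decision that follows from such tool output can be sketched as a simple first-fit packing of virtual machine loads onto hosts (a deliberate simplification, assuming CPU is the only constrained resource):

```python
def first_fit(vm_loads, host_capacity):
    """Place each VM on the first host with spare capacity, opening new hosts as needed."""
    free = []        # remaining capacity per host
    placement = []   # host index chosen for each VM, in input order
    for load in vm_loads:
        for i, spare in enumerate(free):
            if load <= spare:
                free[i] -= load
                placement.append(i)
                break
        else:  # no existing host fits: provision another one
            free.append(host_capacity - load)
            placement.append(len(free) - 1)
    return placement, len(free)

placement, hosts = first_fit([4, 3, 5, 2, 6], 8)
print(hosts)  # these five loads pack into 3 hosts of capacity 8
```

Production tools weigh memory, I/O and power alongside CPU, but the underlying packing idea is the same.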
Deciding on the platform
In the era of x86 platforms, certain technologies determine the success of server virtualization. Earlier, companies chose x86 as the hardware platform since it allowed horizontal virtualization.
According to Thoppil, "The CPU in the machine is the key. Some of the newer versions of server virtualization software come with specialized acceleration at the CPU level. Therefore, many instructions are offloaded by the virtualization layer or hypervisor at the processor level. Since there are multiple processors available at present, one has to look for the right one."
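On Linux, the hardware assistance Thoppil refers to is visible as the vmx (Intel VT-x) or svm (AMD-V) flag in /proc/cpuinfo. A small check, written here as a sketch rather than a vendor tool, could look like this:

```python
def has_hw_virt(cpuinfo_text):
    """Return True if the CPU flags include Intel VT-x (vmx) or AMD-V (svm)."""
    flags = set()
    for line in cpuinfo_text.splitlines():
        if line.startswith("flags"):
            flags.update(line.split(":", 1)[1].split())
    return bool(flags & {"vmx", "svm"})

# On a Linux host:
# with open("/proc/cpuinfo") as f:
#     print(has_hw_virt(f.read()))
```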
Server virtualization solutions that offer CPU-level acceleration are available from various OEMs. The predominant players on this front are VMware, Citrix and Microsoft. RISC server vendors, however, offer built-in server virtualization: Sun (on the SPARC platform), HP (on the PA-RISC platform) and IBM (with LPARs).
"When determining the ideal x86 platform, blade servers are ideally suited for setting up virtual server farms, as they deliver good power and scalability while occupying less space," says Adenwala.
An organization may opt for a conservative or an aggressive approach, but should always ensure careful capacity planning. Thoppil says, "Enterprises tend to ignore factors like storage technology — especially the storage platform. Sometimes you also have to bring in networking components, which supplement the virtualized platform."
In critical environments, servers running in high-availability or active mode are connected to shared storage. The same storage arrangement now needs to be achieved in the virtual server world.
There will always be applications with compatibility issues. For instance, Oracle does not support server virtualization on a non-Oracle platform.
Adenwala says, "Each layer right from computing to storage to network has to be architected properly. One has to consider competency of the implementation partner, exact determination of server loads, and compatibility of applications to be run on the virtual server environment."
Another major problem is support from the vendor. According to Thoppil, "Most of the server virtualization vendors operate with small partners. The big vendors will provide good presales support, but the actual sale is done by a small vendor."
And what about redundancy? Virtual environments typically include features that automatically protect against server hardware failure, since applications are not tied to any particular piece of hardware. If a physical server fails, the server virtualization software moves its applications to a working server.
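A minimal sketch of that failover behavior, with hypothetical host names and a caller-supplied health check standing in for a real hypervisor manager:

```python
def fail_over(hosts, vms_by_host, is_alive):
    """Reassign the VMs of any dead host to the first healthy host (illustrative)."""
    live = [h for h in hosts if is_alive(h)]
    if not live:
        raise RuntimeError("no healthy hosts available")
    for host in hosts:
        if not is_alive(host) and vms_by_host.get(host):
            # A real cluster manager would restart each VM from shared
            # storage; here we only move the bookkeeping entry.
            vms_by_host.setdefault(live[0], []).extend(vms_by_host.pop(host))
    return vms_by_host

cluster = {"hostA": ["erp"], "hostB": ["mail"]}
print(fail_over(["hostA", "hostB"], cluster, lambda h: h != "hostB"))
```

This is also why the shared-storage point raised earlier matters: a surviving host can only restart a virtual machine whose disk it can reach.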
Adopting server virtualization across the enterprise should naturally follow a maturity curve. Usually, IT organizations first focus on non-business-critical workloads before venturing into production environments.
As their processes and expertise mature, a more proactive consolidation strategy that includes business-critical applications makes sense. Again, it is better to first virtualize applications that may need more CPU power in the near future, so that they can be scaled up easily.
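That prioritization can be sketched by fitting a growth trend to each application's CPU history and virtualizing the fastest-growing ones first (the sample data and least-squares fit below are illustrative assumptions):

```python
def growth_slope(samples):
    """Least-squares slope of utilization samples taken at equal intervals."""
    n = len(samples)
    mean_x = (n - 1) / 2
    mean_y = sum(samples) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in enumerate(samples))
    den = sum((x - mean_x) ** 2 for x in range(n))
    return num / den

# CPU utilization (%) sampled monthly for two hypothetical applications:
history = {"billing": [30, 35, 41, 48], "intranet": [20, 21, 20, 22]}
ranked = sorted(history, key=lambda app: growth_slope(history[app]), reverse=True)
print(ranked)  # billing's demand is rising faster, so virtualize it first
```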