The IT virtualisation product market is expanding rapidly in both reach and range. On the client side, established desktop and application virtualisation products are being joined by new offerings that are pushing the boundaries of what can be achieved in this space.
Storage virtualisation is another market reaching maturity, with standards agreed and leaders identified. However, the server virtualisation market is still in a state of flux, even with its well-known suppliers.
Here, interoperability may be a major hurdle to overcome for early adopters. Moving from physical to virtual might appear to be straightforward on paper or in the lab, but the migration of live applications into a virtual environment presents a very real set of challenges.
Virtualisation now operates across a much wider spectrum than just x86 server virtualisation. It encompasses networks, storage, applications, desktop and servers. By combining these technologies, organisations can realise significant benefits, and transform their IT infrastructure from a static to a dynamic resource that can help enable the business to meet its strategic goals.
Beyond server consolidation
This new approach extends beyond the traditional view of IT virtualisation - meaning server consolidation - and CIOs increasingly see it as a way to break the rigid links between applications, hardware, platforms, middleware and end-users. Freed from those links, the IT department is better able to meet the demands the business places upon it.
Many organisations initially adopt IT virtualisation in order to save costs through server consolidation, but soon realise that other benefits are possible when operating with a virtualised infrastructure, and so IT virtualisation then becomes a strategic part of the overall IT plan.
Server virtualisation technologies operate by using one of three basic methods:
● Emulation, where one resource imitates another resource.
● Partitioning, where a large resource appears as many smaller resources.
● Clustering, where many resources appear as one large resource.
These capabilities and technologies have been put to widely differing uses, and many organisations will find several examples of them running in their datacentres.
Extract maximum benefit
Understanding these capabilities and how their usage differs is key to organisations extracting the maximum benefit from their server infrastructure. The low- and mid-range server market is the area that receives most coverage in terms of the different techniques used in server virtualisation.
Emulation is the approach taken in this market because the IA-32 instruction set used in the x86 architecture was not designed with virtualisation in mind. There are two principal approaches to virtualising it: hypervisors - "bare metal" virtualisation, which runs directly on the hardware - and hosted virtualisation, where the guest operating system sits atop an existing host operating system.
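Both Intel and AMD later added hardware-assist extensions (Intel VT-x and AMD-V) to address these gaps in the x86 architecture. As an illustrative sketch - the helper function name is ours, not from any particular product - a Linux administrator can tell whether a processor advertises these extensions by looking for the "vmx" or "svm" flags that appear in /proc/cpuinfo:

```python
# Sketch: detect x86 hardware virtualisation support from CPU flag text.
# On Linux, Intel VT-x shows up as the "vmx" flag and AMD-V as "svm"
# in the "flags" line of /proc/cpuinfo.
def has_hw_virtualisation(cpuinfo_text):
    flags = cpuinfo_text.lower()
    return ("vmx" in flags) or ("svm" in flags)

# Example against a fabricated /proc/cpuinfo excerpt:
sample = "flags : fpu vme de pse tsc msr pae vmx sse2"
print(has_hw_virtualisation(sample))  # True
```

On a real system the same check would be run against the contents of /proc/cpuinfo itself; an absent flag means the hypervisor must fall back on software techniques such as binary translation.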
Partitioning is a proprietary technology that allows enterprise-class servers to be transformed from large, monolithic computing engines into multiple, smaller resources, thereby increasing utilisation figures and enabling organisations to obtain better returns on their IT investments.
Clustering technologies can be used for a number of different purposes depending on the organisation's exact requirements. The primary reason behind the deployment of clusters is to provide high availability of systems, but there are also other reasons.
These include:
● Network load balancing, which enables scaling for web applications.
● High-performance clustering, which was traditionally used in scientific and research scenarios, but is increasingly becoming accepted in the commercial market.
● Internal grid computing, which uses the organisation's existing resources to perform many parallel actions.
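The "many parallel actions" idea behind internal grid computing can be sketched in a few lines: independent work units are farmed out across a pool of workers and the results gathered back. The task below is a stand-in for a real workload, and the worker-pool approach is a simplified illustration of the principle rather than how any particular grid product operates:

```python
# Sketch: distribute independent work units across a pool of workers,
# mirroring how grid computing spreads a job over spare capacity.
from concurrent.futures import ThreadPoolExecutor

def work_unit(n):
    # Placeholder for one independent piece of a larger job.
    return n * n

with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(work_unit, range(10)))

print(results)  # units run concurrently; results reassembled in order
```

In a real grid deployment the workers would be separate machines across the organisation rather than threads in one process, but the pattern - split, distribute, recombine - is the same.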
Utilising storage virtualisation
Storage virtualisation helps organisations manage their storage resources more efficiently in order to achieve higher utilisation rates.
It was developed predominantly to work in storage area network (SAN) environments, where the high cost of implementing a SAN in the early days of the technology was compounded by poor utilisation rates.
The technology has developed rapidly since it was first introduced by early adopters, and it now offers organisations much more flexibility, as it allows them to use virtualisation as part of their information life-cycle management strategy.
Desktop virtualisation is an area that has witnessed significant advances in recent years, evolving from the thin-client technologies first introduced in the late 1990s to the server-hosted virtual desktop infrastructure (VDI) systems of today. These systems provide most of the benefits of a traditional PC, but with greatly reduced management needs and total cost of ownership.
VDI will appeal greatly to those organisations seeking a more manageable yet flexible desktop environment. However, the removal of desktops and laptop computers requires a significant cultural change, and so it will take some time before VDI becomes pervasive.
● Roy Illsley is a senior analyst at research firm Butler Group