Based on the media headlines, the activity from vendors and the advice from analysts, you’d be forgiven for thinking virtualisation is a done deal.
But research from Quocirca – carried out as two cycles of interviews for Oracle in February and November 2011 – shows this is not the case.
In the first round of research, 25% of respondents had less than 10% of their server hardware virtualised and only 30% had more than half their hardware virtualised.
Ten months on, it is apparent there is a lot more virtualisation taking place: the number of respondents with less than 10% of servers virtualised dropped to 13%, but those with more than 50% virtualised still accounted for only 33%. Virtualisation is still not moving at the speed many would like to believe.
Why don't more organisations virtualise applications?
For many, virtualisation sounds too good to be true, with its promise of consolidation down to around 20% of existing hardware and the consequent reduction in energy bills, need for fewer system administrators, licensing fee savings and reduced datacentre space. But while some low-end, non-mission-critical services can easily be moved to a virtualised environment, many organisations have baulked at the prospect of moving enterprise applications such as SAP or Oracle onto virtual hardware – either because of the perceived complexity and possible impact on the organisation, or simply from a feeling that they should wait until the process is proven elsewhere.
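The consolidation arithmetic behind that promise can be sketched with illustrative numbers. The sketch below assumes a hypothetical 5:1 consolidation ratio (retained hardware at roughly 20% of the estate); neither the ratio nor the server counts come from the Quocirca research.

```python
# Illustrative consolidation arithmetic only - the 5:1 ratio and the
# server counts are assumptions for the sketch, not research figures.

def consolidation_estimate(physical_servers, ratio=5):
    """Estimate hosts retained after virtualising at the given
    consolidation ratio (workloads per physical host)."""
    # Ceiling division: a partially filled host still counts as a host.
    hosts_after = -(-physical_servers // ratio)
    servers_removed = physical_servers - hosts_after
    return hosts_after, servers_removed

hosts, removed = consolidation_estimate(200, ratio=5)
print(hosts, removed)  # 40 hosts retained, 160 decommissioned
```

Even this crude sum shows why the promise is attractive: decommissioning 160 of 200 servers carries through directly to energy, space and administration costs.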
Others are waiting for the financial situation to sort itself out a little. Although virtualisation promises ongoing savings, there are sizeable upfront costs involved in carrying out a full audit of what is already there, planning for broad virtualisation adoption and carrying out the project. Many simply prefer to wait, as existing systems are working as far as the organisation is concerned.
Others are waiting for the technical side of things to stabilise. After a period in which each year in turn was dubbed “the year of virtualisation”, everything now seems to be moving towards “the year of the cloud”. As virtualisation is a key underpinning of cloud, many have chosen to wait until they have seen successful implementations of cloud elsewhere – when the concept becomes more real to them – before embarking on full virtualisation. There is also the need to fully understand how the different flavours of cloud can be used, and so just where virtualisation fits in the private/public cloud mix.
However, there are drivers beyond datacentre efficiency that mean virtualisation has to be seen as part and parcel of the future for any organisation. One is the explosion in different types and sizes of user end points, combined with the consumerisation of these devices as they find their way into the organisation. This has led to a need to control device usage. It has been shown, many times over, that trying to dictate which devices users can and cannot use does not work. Many organisations are therefore looking to run virtual desktops in a controlled datacentre, where an access device allows the end user to work without the risk of uncontrolled and unmanaged devices wreaking havoc, and without device loss or theft becoming a major issue.
Another driver is the increasing number of software packages presented as “virtual appliances” that are easier to set up and run than traditional install kits. Such appliances have become commonplace for IT security and are increasingly seen in other areas such as e-mail and databases. Virtual appliances have the benefit of being easily deployable while coping with peaks and troughs in workload. In a well-implemented virtualised environment, resources can be borrowed as required and new appliances invoked as necessary.
However, the biggest driver for server virtualisation will be cloud computing – despite all the current misunderstandings and misperceptions. Respondents to Quocirca's survey were still unsure about cloud, but there has been a big change in how it is viewed. In the first cycle of the research, 13% stated that cloud had no part in the future of their organisation, falling to 6% ten months on. Those seeing cloud as a complete game changer had risen from 12% to 21%. This growing interest in cloud will undoubtedly drive the implementation of virtualisation to support it, even if 2012 is not quite the “year of the cloud” many hope for.
Cloud will also drive other forms of virtualisation, as cloud computing depends not just on servers. Storage virtualisation is coming of age, with different approaches becoming apparent from the likes of EMC, HDS, NetApp and Dell. Relatively new vendors such as Coraid, Fusion-io and Egenera are making a play to become cloud-specialist storage vendors, or at least able to provide differentiated cloud storage plays to plug holes in existing storage vendors’ portfolios. The need for different types of data to be stored on different types of device in different ways, but with full access across all the data (rather than each storage device being a data silo) so big data can be embraced, is driving an increasing need for virtualisation at the storage layer.
Similarly, network virtualisation – using devices such as Cisco’s Nexus, IBM’s Blade Systems and Dell Force10 top-of-rack (ToR) highly virtualised switches – enables networks to be optimised for different traffic types while flattening network fabrics and reducing the latency introduced by multi-level hierarchical network approaches. As with storage, the network is struggling with high growth, driven not only by new data types such as voice and video, but also by the need to prioritise streams in real time to meet the needs of the business.
What this really means is that the future for virtualisation is good – but it remains complex at a component level and will require better messaging and education from vendors and influencers to help organisations move towards the promised land in a fully managed manner, where the business and cost benefits are recognisable.
This will require all areas of a virtual environment to be considered in the round – talking about one type of virtualisation in isolation will not help organisations get the message. To this end, there has been noticeable growth in pre-configured systems from vendors. IBM, HP, Oracle, SGI, Dell and others are already providing fully engineered modules for building highly virtualised environments. These modules can range from a single equipment rack pre-configured with servers, storage and network equipment in an optimised manner, through full datacentre rows, to complete systems housed in standard shipping containers. Taking such an approach avoids the complexities and possible pitfalls of a build-it-yourself system.
For most, the future of virtualisation will be a hybrid approach – some older applications remaining on dedicated physical servers, due to issues with moving them to virtualised environments; some applications on dedicated but virtualised platforms where higher levels of availability can be guaranteed along with better overall system utilisation; certain applications, services and functions being provided from highly virtualised private clouds and other services and functions being served from external public cloud systems.
However, it does mean that organisations should ensure they understand at least the basics of the different approaches to virtualisation, the different types of workload (application, service and function) that they could be looking at putting into virtual environments – and what cloud means to their organisation.