Dale Vile, Contributor
I remember reading an analyst prediction at the end of 2008 that 2009 was the year in which desktop virtualisation was going to go mainstream. I challenged this at the time, on the basis that we had recently completed a number of studies at Freeform Dynamics suggesting that IT pros in the mainstream weren't even clear on what terms like desktop virtualisation and virtual desktop infrastructure (VDI) actually meant, let alone why they should be investing in this area.
As it turns out, things didn't take off in any big way in 2009; we simply saw the ongoing creep of early adopter activity, as anticipated. The good news, though, is that the past year has seen quite a bit of real-world experience gained, and both awareness and skills have been building.
While we are still in the early market, it's clear that some of the technologies and specific offerings have reached the status of "mainstream readiness," and there are enough examples of deployments in various real-world scenarios to provide a degree of confidence about what can be expected. This all bodes well for the future.
Confusion confounding the desktop virtualisation market
The biggest challenge at the moment is confusion. Just like in other areas, such as cloud computing, the same or similar terminology is routinely being used to describe some quite different things.
I am not going to go into the different flavours of desktop virtualisation here, but if you want a very accessible overview and explanation of the main options, see the paper Tony Lock and I put together last year entitled Evolution of Dynamic IT. This was originally intended as a primer for IT pros working in medium-sized businesses, but the walkthrough of various desktop delivery architectures is pretty generic.
In the meantime, while you won't catch me making simplistic predictions about the market, I would say that 2010 is not a bad time for many businesses to start getting up to speed on desktop virtualisation and VDI if they haven't done so already. Here's why:
- The desktop is integral to IT service delivery. It is the most common point of access for users, and as such, how well it performs has a disproportionately high bearing on user satisfaction and the perception of IT, even without considering productivity impacts. Beyond this, we know from research that allowing the desktop environment to drift and become too out of date has significant cost and risk implications, as well as representing a huge distraction to IT staff. Whichever way you look at it, the way in which the desktop is enabled is something to be taken very seriously.
- From a timing perspective, a combination of the economic downturn and the negative reception to Windows Vista led many organisations to put their normal desktop modernisation and refresh cycles on hold, often for a year or more. As economic conditions improve, however, and the positive response to Windows 7 neutralises the Vista effect, it's only natural that desktop virtualisation and modernisation will find themselves back on the agenda. With a whole bunch of new delivery solutions that have been knocking around the edges of IT for a while becoming "mainstream ready," as mentioned earlier, there are now quite a few viable alternatives to simply moving forward with the next iteration of the Windows 'fat client' desktop.
If this line of thinking makes sense to you and you do start to investigate alternative options, the one piece of advice I will leave you with is not to assume that a single approach, whether the traditional desktop or a virtualised model, will be appropriate for all of your business and end-user needs.
Experience tells us that most organisations are probably best served by a blend of desktop delivery mechanisms, so as well as getting up to speed on the technology, it is also important to analyse and segment your users and make sure you fully understand their needs and constraints.
Dale Vile is founder and research director of analyst firm Freeform Dynamics and a Contributor to SearchVirtualDataCentre.co.uk.