It's difficult to read anything these days on what's happening in the world of ICT without tripping over a virtual this and a virtual that. Indeed, Quocirca has researched and reported on virtualisation for some years now, and has identified many of the problems that users are now finding as they make their way through the virtual minefield, writes Clive Longbottom, service director at analyst Quocirca.
One of the issues that comes up with virtualisation is how to manage the software that is needed within such virtual environments. Initially, this was done through the use of "golden images", where all the component parts of an application or service (operating system, application server, the application itself, plus any device drivers and so on) are built and stored ready for use as a single, complete image. As time goes by, a library of these images grows, and some form of library management is required. This library manager not only has to be able to identify all the images that are available, but also has to understand all the versions of all the component parts, so that it knows which images need patching or upgrading when a new patch comes through.
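The library-management problem can be sketched in a few lines: each golden image records the versions of its component parts, and the manager must find every image affected by a new patch. All image names and version strings here are illustrative, not taken from any real product.

```python
# A hypothetical golden-image library: each image lists the versions of
# the components baked into it.
GOLDEN_IMAGES = {
    "web-frontend-01": {"os": "server2003-sp1", "appserver": "tomcat-5.5", "app": "shop-2.1"},
    "web-frontend-02": {"os": "server2003-sp1", "appserver": "tomcat-5.5", "app": "shop-2.2"},
    "db-backend-01":   {"os": "server2003-sp2", "appserver": None, "app": "inventory-1.0"},
}

def images_needing_patch(library, component, patched_version):
    """Return the images whose copy of `component` is not yet at `patched_version`."""
    return [name for name, parts in library.items()
            if parts.get(component) not in (None, patched_version)]

# An operating-system patch arrives: every image still on the older
# service pack now needs rebuilding or patching.
stale = images_needing_patch(GOLDEN_IMAGES, "os", "server2003-sp2")
```

With thousands of images rather than three, this lookup is exactly why the library manager needs to track component versions, not just whole images.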
If you only have a few images, this is not a major problem. However, many organisations suddenly find themselves with thousands of images, many of which have the same operating system underlying the image. In comes a patch, and several hundred images suddenly need patching. Also, each of these images is carrying gigabytes of the same information within it, which is wasteful in terms of storage space.
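The scale of the duplication is easy to see with some back-of-the-envelope arithmetic. The figures below are assumptions chosen only to illustrate the point, not measurements:

```python
# Illustrative figures: hundreds of images, each embedding the same
# multi-gigabyte operating system stack.
image_count = 500
os_size_gb = 4          # identical OS payload duplicated in every image
unique_payload_gb = 1   # the part that actually differs per image

# Storing full images versus storing the shared OS payload once.
naive_total_gb = image_count * (os_size_gb + unique_payload_gb)
deduplicated_gb = os_size_gb + image_count * unique_payload_gb
savings_gb = naive_total_gb - deduplicated_gb
```

Under these assumptions, full golden images consume 2,500GB where roughly 500GB of it is genuinely unique - the rest is the same operating system repeated hundreds of times over.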
Quocirca has written before about how Microsoft is bringing a different approach to the market with its Project Oslo procedural engine, which will enable images to be built on the fly. However, it still builds each image from complete stacks of components, which in itself can be slow and wasteful.
A US-based company, FastScale, is taking a different approach. While still based on just-in-time construction of images from a procedural model, it takes an Occam's Razor view of how this should be done. Most operating systems have grown into bloatware, carrying around heaps of excess data, information and services that are not required for any specific task. For example, the "inf" folder in my (somewhat battered) implementation of Windows Server 2003 contains over 150MB of "stuff" - some of which may be needed, some of which may not. Similarly, the operating system starts up all sorts of services on booting - just in case.
FastScale takes a thin approach - it holds a database of required components, and comes provided with a set of rules that contain the contextual dependencies between most enterprise applications and the operating system that will need to sit underneath them. In this manner, it can ensure that only the bits of the operating system and application that are required are put into the image. Not only does this mean that you end up with an image that can be put together far faster, but it is also highly tuned, and so runs faster. Also, as it is a slimmer, more efficient image, you can place more of these images onto an equivalent platform than you could with full-fat golden images.
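The dependency-driven assembly described above can be sketched as follows. A rules database maps each component to the pieces it needs, and the image becomes the transitive closure of the application's requirements rather than a full operating system stack. The component names and rules here are hypothetical, not FastScale's actual database.

```python
# Hypothetical rules database: each component maps to the components it
# depends on. The full OS contains far more than any one application needs.
RULES = {
    "shop-app":      ["libhttp", "runtime"],
    "libhttp":       ["net-stack"],
    "runtime":       ["core-libs"],
    "net-stack":     ["core-libs"],
    "core-libs":     [],
    # Present in the full OS, but never pulled into this application's image:
    "print-spooler": [],
    "media-codecs":  [],
}

def build_thin_image(app, rules):
    """Collect only the components the application transitively requires."""
    needed, stack = set(), [app]
    while stack:
        component = stack.pop()
        if component not in needed:
            needed.add(component)
            stack.extend(rules[component])
    return needed

image = build_thin_image("shop-app", RULES)
```

Because the unused services never make it into the image at all, the result is both smaller to store and quicker to assemble and boot - which is the density and speed advantage the approach claims.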
FastScale also shares the benefits of any procedural approach to building images on the fly: any patches or upgrades are applied to the image held in the database - and even then, with FastScale, these are only "applied" virtually. Therefore, any applications that are incompatible with the patch or upgrade can still load the original base image and run without any issues. This also enables organisations to apply patches and upgrades only once they have completed any regression testing - and the use of thin, efficient virtual images can even speed that testing up.
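The "virtual" patching idea can be illustrated with a simple overlay model: the base components stay untouched in the database, a patch is recorded as a layer on top, and each application's image is assembled either with or without that layer. This is a sketch of the concept under assumed component names, not FastScale's implementation.

```python
# Base components stay untouched in the database.
base_components = {"os-kernel": "5.0", "libssl": "1.0"}

# A patch is recorded as an overlay rather than applied in place.
virtual_patches = {"libssl": "1.0.1"}

def assemble(components, patches, use_patches):
    """Build a component set, optionally layering the virtual patches on top."""
    image = dict(components)
    if use_patches:
        image.update(patches)
    return image

# A patch-compatible application gets the patched image; an incompatible
# one simply assembles from the unchanged base.
patched_image = assemble(base_components, virtual_patches, use_patches=True)
fallback_image = assemble(base_components, virtual_patches, use_patches=False)
```

Because nothing is overwritten, "rolling back" an incompatible patch costs nothing - the original base image was never modified in the first place.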
We think FastScale's technology is pretty impressive, but it has the likes of Microsoft, Oracle and Procession hot on its heels. It is only a small organisation, but it already has some large customers. However, FastScale could end up going the way of another interesting virtual technology company - PlateSpin, which was acquired by Novell, and so lost a lot of the independence that had been associated with it.
Also, FastScale is currently focused only on server computing - which is a pity. Its technology could revolutionise thin-client computing, as it could improve image densities by three or four times. But this would require a concentration of effort from its R&D group that could run the risk of defocusing the company completely.
However, for those looking at large virtual ICT estates, FastScale looks like it could simplify the management of images, while also speeding up provisioning and improving image densities. As far as Quocirca is concerned, it is a company well worth looking at.