- Thin-client computing from green screens to tablets
- Thin client challenges: Latency and energy savings
- Mitigate data security risk among mobile users
- Stream data and applications kept in the datacentre
- Server-based computing offers a variety of data management methods
The concept of thin-client computing goes back a long way. The first computers, with their green-screen monitors, carried out all the business logic at the centre and used the monitors purely to present results to the user. As distributed computing took over and PCs proliferated, organisations soon saw information disseminated across a multitude of devices. Through the use of centralised stores masquerading as virtual local drives, this was brought back under some degree of control.
Then mobility raised its head, with users working out of the office on laptops or home computers. There seemed to be nothing for it except to provide full systems for these people, with both application and data held on the device itself. This approach is still the de facto means of providing access to personal productivity applications (such as office tools). Many enterprise applications also download data, so that a specific client application can be used away from the office. But this means that what should be just a £700 access device suddenly becomes far more valuable and dangerous - it now contains data that may have value in itself, and it can be lost through theft or carelessness.
Even in the office, having hundreds or even thousands of devices configured differently causes a management headache, and a failure in any one device can lead to considerable productivity impact while a replacement device is sourced and provisioned to the same state as the failed device.
To the rescue came thin-client, or server-based, computing. In its most common form, a user's entire desktop runs on a server in the datacentre and only the graphical interface is presented at the access device. This is great in theory: all data is kept central, the device type becomes immaterial (provided it can support an access session) and all control is placed back in the hands of the business via the datacentre.
The problem is that the user experience is not always what is hoped for. Network latency and bandwidth can have a big impact on how the user perceives the experience, and even small issues can make regular use of such a system tiring to the point where users start to try to find ways around such centralised systems.
Organisations also find many of the promised gains just don't materialise. Energy savings based on replacing desktop PCs with low-energy thin clients don't always add up: a PC can serve as an access-only device just as well as a thin client can, so many organisations carry on using these relatively energy-hungry devices and actually see their overall energy usage increase.
So, are completely virtualised desktops (virtual desktop infrastructure, or VDI) the way forward? Certainly, for task workers tied to a specific desktop device it can work well. LAN speeds mean response times are adequate, the VDI images can be defined and managed centrally to serve hundreds or thousands of users, and low-energy access devices can be rolled out as and when required.
How about the more mobile user? A completely centralised system may not be the right approach for them. However, these are the users who present the greater security risk, so providing them with a completely mobile-based system introduces too much risk. Recent research carried out by Quocirca for Trend Micro demonstrates that the increasing consumerisation of IT means more people will be coming to IT with non-standard devices, expecting to be able to use them to access corporate systems.
The same research found growing use of tablet devices among more remote workforces. The key here is to abstract the device from the function. It is possible to maintain a fully centralised image, as in a standard VDI approach, but to copy that image to the device. The image can be held in a virtualised space, so that no interaction is possible between the device (its basic operating system and applications) and the corporate image, which runs in its own secure environment or "sandbox".
Such systems do still present a degree of security risk - data is still stored within the virtual image - but this can be mitigated by flushing any data back to a central store as soon as network connectivity is available, and by ensuring data is deleted if anyone attempts to access the sandboxed environment by non-preferred means. For those who spend a great deal of time disconnected, this provides the best balance between centralised control and the capability to carry out their corporate function.
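The flush-and-wipe policy described above can be sketched in a few lines. This is a minimal illustration, not any vendor's API: all class and method names here are hypothetical.

```python
# Hypothetical sketch of the sandbox data policy: queue changes
# locally while offline, flush them to the central store when
# connectivity returns, and wipe the local cache if the sandbox
# is accessed by non-preferred means.

class SandboxCache:
    def __init__(self):
        self._pending = []        # data created or changed while offline
        self.central_store = []   # stands in for the datacentre copy

    def write(self, record):
        """User saves data inside the sandboxed corporate image."""
        self._pending.append(record)

    def on_connectivity_restored(self):
        """Flush all pending data to the central store, then clear locally."""
        self.central_store.extend(self._pending)
        self._pending.clear()

    def on_tamper_detected(self):
        """Non-preferred access attempt: delete the local data outright."""
        self._pending.clear()
```

The point of the design is that the device never holds data any longer than connectivity forces it to, so a lost or stolen device carries as little value as possible.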
It is further possible to stream applications down to a device on the fly. Here, the intelligence of the end device (its CPU, operating system and so on) is still used to run the application, but the application is kept updated in the datacentre. On logging out, any data created or changed is flushed back to the server, and the associated data footprint is wiped completely from the device, removing security concerns.
This approach is very useful for those who are connected to the network most of the time but may be accessing systems over a low-bandwidth or high-latency connection. The initial application stream should be provided over a high-bandwidth connection; from there onward, only the delta changes need to be passed down, using very little bandwidth. Data access times can be improved through the use of wide area network acceleration tools (such as those from Expand Networks, Riverbed, Blue Coat or Silver Peak), providing a very fast experience for the user.
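The delta-only update idea can be illustrated with a toy rsync-style scheme: split the application image into fixed-size chunks, compare chunk hashes against what the client already holds, and send only the chunks that differ. This is a simplified sketch under those assumptions; real streaming products use far more sophisticated protocols.

```python
# Toy illustration of delta-only application updates: only chunks
# whose hashes differ from the client's local copy cross the wire.

import hashlib

CHUNK = 4  # tiny chunk size for illustration; real systems use kilobytes


def chunks(data: bytes) -> list:
    return [data[i:i + CHUNK] for i in range(0, len(data), CHUNK)]


def digest(chunk: bytes) -> str:
    return hashlib.sha256(chunk).hexdigest()


def delta_update(client: bytes, server: bytes):
    """Rebuild the server image on the client, reusing unchanged chunks.

    Returns the rebuilt image and the number of bytes that had to be sent.
    """
    client_chunks = chunks(client)
    rebuilt, sent = [], 0
    for i, server_chunk in enumerate(chunks(server)):
        if i < len(client_chunks) and digest(client_chunks[i]) == digest(server_chunk):
            rebuilt.append(client_chunks[i])   # reuse the local copy
        else:
            rebuilt.append(server_chunk)       # only this chunk is transferred
            sent += len(server_chunk)
    return b"".join(rebuilt), sent
```

With an eight-byte image where only the second four-byte chunk changed, only those four bytes would need to be transferred, which is why subsequent updates cope well with low-bandwidth links.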
There are many ways of providing a virtualised environment that improves security, ensures data is centralised and improves the capability to manage a disparate, decentralised and increasingly consumerised user-device environment. Indeed, just centralising data can solve some issues, with read-only application clients and restricted capabilities such as cut and paste enabling users to continue working without a complex approach to virtualised desktops and/or applications.
Along with the main players of Citrix and VMware, smaller players such as Centrix, RES Software and AppSense provide tools and systems to manage the user experience through a seamless aggregation of a hybrid system, where parts of the virtual system will be based around VDI, some around desktop streaming and some around application streaming. Likewise, those who have emerged from a hardware world, such as Igel, Wyse and ChipPC, offer software that makes the most of these increasingly intelligent devices, enabling users to gain the most from a more centralised system.
The answer may not always be VDI - but it is increasingly apparent that server-based computing will provide the information control that today's organisations are striving to find.
This was first published in June 2011