The secret of successful desktop management is a precise and focused audit of all installations. Mark Vernon reports
The corporate desktop is spinning out of control. Knowing what PCs are where, and how better to manage resources, are among the greatest concerns of the IT department. In theory, it is simple. In practice, it becomes intractable. And frequently embarrassing. "The desktop tends to be ignored, but it is the one area that has most impact on competitiveness," says Gary Cooper, research manager at Butler Group. "All or most employees use a desktop to do their work and if it is not working properly then they are not working properly."
Ironically, it is the success of the PC that has increased the burden of management. "For years, IT staff could keep a tight control on purchasing through centralised procurement," says Peter Richardson, director of computer desktop management specialist CRC.
"As PCs became commonplace, hardware and software purchasing has been devolved to departments, and equipment and software have become mobile. IT managers are losing their role as gatekeepers of PC asset inventories."
It is for this reason that desktop management begins with the audit.
The experience of Aberdeen City Council is testimony to the value of a precise and focused audit of all desktop installations. "The area around Aberdeen is predominantly rural, and travelling to the 250 council offices to maintain PCs and to deploy regular software upgrades is our IT manager's nightmare," says Andrew Mein, IT project manager at Aberdeen City Council. Audit software helped keep the project well ahead of schedule.
Knowing what is out there is the first step in desktop management. But it is only a start when it comes to managing the desktop asset. It is, for example, well known that a PC costs much more than the purchase price: GartnerGroup says £25,000 over five years. A software purchase can commit a company to multiple software and system upgrades, as well as additional training and support. Even if a company leases PCs, users may add software and hardware, an act that, apart from possibly rendering a lease void, tampers with an asset the company does not own.
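The gap between purchase price and lifetime cost is easy to show with a back-of-an-envelope calculation. The sketch below is illustrative only: the cost categories and every figure in it are hypothetical examples chosen to hit the £25,000 headline number, not GartnerGroup's actual model.

```python
# Illustrative five-year total-cost-of-ownership calculation.
# All figures are hypothetical examples, not GartnerGroup's model.

def five_year_tco(purchase, annual_support, annual_upgrades,
                  annual_training, years=5):
    """Sum the up-front price and the recurring costs over the PC's life."""
    recurring = (annual_support + annual_upgrades + annual_training) * years
    return purchase + recurring

# A £1,500 PC with modest recurring costs quickly dwarfs its sticker price.
total = five_year_tco(purchase=1500, annual_support=2500,
                      annual_upgrades=1200, annual_training=1000)
print(f"Five-year cost: £{total:,}")  # £25,000 in this illustrative scenario
```

The point of the exercise is that the recurring lines, not the hardware, dominate the total, which is why desktop management focuses on support and upgrade costs.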
The extent of the problem is highlighted by David Rossiter of Peregrine Systems. He points out that most large organisations do not track assets valued at less than £1,000. "If that's the case, what chance have they got of knowing what software is installed? Or of managing the desktop?" he asks.
The situation has improved in the past 12 months. "The Y2K issue helped most IT directors in this sense because every IT asset had to be identified and fixed," continues Rossiter. "This means that many organisations, for the first time, have a complete inventory of what they own and lease. If this is maintained, it could deliver a significant business advantage."
The helpdesk is another, often hidden, cost. An average desktop query call lasts 17 minutes, nine of which are spent simply identifying the hardware and software involved.
ICL is one company trying to tackle this problem. Whitbread Inns retail systems has been piloting ICL's NetDesk Online Service in 260 UK outlets. Melissa Williams, helpdesk manager at Whitbread Inns retail systems, says, "One of the most important benefits is that we can structure our enquiries so that important details, such as parts required, are clearly included. Also, if we get new information and need to change the details of our enquiry, we can do that easily through a PC, rather than have complicated discussions over the telephone. The service saves staff a lot of time, especially time that used to be spent in telephone queues."
A further issue is that of maintaining control of the desktop. The trigger-happy user will be well known to the IT department. Deleting or corrupting files is the common problem here, although users cannot always be blamed: the PC might well prompt them to erase files when their storage quota is exhausted, without guidance as to which files are still needed. File protection systems can prevent erroneous deletion, or re-install files automatically when problems occur, thereby reducing helpdesk calls.
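The principle behind such automatic re-installation tools can be sketched simply: keep a list of protected files and restore any that go missing from a backup copy. The sketch below is a generic illustration, not any vendor's product; the function name, file names and backup location are all assumptions for the example.

```python
# Minimal sketch of automatic file restoration: protected files that a
# user deletes are copied back from a backup directory. Illustrative only.
import shutil
from pathlib import Path

def restore_missing(protected: list, live_dir: Path, backup_dir: Path) -> list:
    """Re-install any protected file missing from live_dir, using backup_dir."""
    restored = []
    for name in protected:
        live, backup = live_dir / name, backup_dir / name
        if not live.exists() and backup.exists():
            # Put the file back before anyone needs to call the helpdesk.
            shutil.copy2(backup, live)
            restored.append(name)
    return restored
```

A real product would run such a check continuously, or hook filesystem events, rather than polling on demand.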
Controlled environments require tight security too. The installation or removal of unauthorised programs is one issue. Protecting networked environments from destructive viruses is another.
It is with these issues in mind that Hewlett-Packard has introduced its new e-PC. "The first model, the e-Vectra, has no disk drives and will not support a modem for direct Internet access, for example," says European PC lifecycle programme manager Nicholas Barreyre. "Smartcard facilities provide tougher user authentication security," he says. The Hewlett-Packard e-DiagTools diagnostics agent runs in the background and scans the e-Vectra for problems. If it detects an irregularity, it immediately isolates the problem, sends Hewlett-Packard or the preferred support provider an e-mail message describing it, and requests a replacement part.
If desktop management can ever be said to have an aesthetic solution, it is Computer Associates that provides it. Users of CA's Unicenter will be familiar with early attempts at providing this 3D network management environment, undoubtedly a substantial advance on tree diagrams and icons for certain kinds of problem.
But visualisation as a concept is central to the company's plans for improving services, not just 'eye candy', but real usability and sophistication. The point is that visualisation of information, processes and systems aids comprehension and retention, important elements for understanding and managing networks. CA's Unicenter TNG Desktop Edition is its latest solution.
With all these issues in mind, it will come as no surprise to learn that outsourcing desktop management is increasing. IDC expects the worldwide market to skyrocket to $30.7bn (£19.5bn) by 2003 as more companies turn to outside help to control network and desktop systems so they can focus on core competencies.
Stephen Ram, a consultant with TCA Consulting, believes that "Vendor costs are often much lower than internal costs, with, for example, staff costs, license costs and build times. Volume global purchasing deals can also result in major savings through economies of scale."
Build times are also reduced since build-to-order means a reduced time spent at the desktop and reduced impact on the user. The build-to-order process can also enforce application standardisation in line with strategic direction.
However, there are risks to be negotiated too. As with all outsourcing, it takes time to achieve the cost savings, which should be estimated over the period of the long-term desktop strategy, most commonly about five years. "The vendor needs to be carefully managed and the client needs to ensure that strong project management is in place, on both sides, to drive through both the outsourcing process and to achieve the predicted cost savings. Good visibility of the vendor's progress is paramount," Ram says.
Chris Edwards, infrastructure projects manager at Boots, which recently completed a new point-of-sale (pos) desktop project outsourced to Computacenter, makes an additional point. "The task goes a lot further than just supplying a pre-configured PC, because the real value to us lies in having somebody deal with all the peripheral issues that surround a new desktop," he says.
"We needed someone who could manage all the relationships both externally and internally and be capable of using their initiative to get the job completed quickly."
This points to the most recent, still emerging trend: application service providers (ASPs). These services began with rented ERP systems, but are now developing into wholly outsourced desktop services.
"This focuses entirely on the end-user requirement," says Butler Group's Cooper. "It is management at the applications level rather than treating the desktop as software or hardware. Clearly it is important that the ASP gets the right applications on the desktop for the user concerned, but the level of service is likely to be the key differentiator when this market gets going."
Case study: BG International
BG's diverse, and often remote, global gas and oil operations necessitate a common, stable and reliable desktop productivity platform that is easy to configure and deploy, simple to learn, and needs minimal helpdesk support to maintain.
The common desktop environment project (CoDE) was designed to provide it. Before CoDE the company was a long way from having such a well-managed desktop.
"We did not know what was out there, who had it, what software was on what PCs or servers, whether it was licensed or compliant, and what security concerns there were because of conflicts with rogue software," says operations and commercial manager Dipesh Patel.
"We knew that we must introduce standards," adds senior account manager Marion Lake. "On the one hand, the problems were as simple as opening an attached document on an e-mail. But executives wanted evidence of assets and we could say nothing for sure about the costs of desktops."
CoDE has standardised the platform on Microsoft Office 2000 and, with BG's solution provider ICL, has deployed a number of other tools. The GartnerGroup Total Cost of Ownership benchmark has demonstrated some startling results. For starters, deployment planning and configuration times are down by more than 80%. Also, the number of packages deployed is reduced by 50%. "The idea is that you don't give people what they don't need because if you do they will just use it," says Patel.
Helpdesk incidents are reduced by 64%. "We identified the 10 most frequently asked questions and plugged them into the customised helpdesk feature in Office 2000," says Lake. "The questions stopped being asked."
BG is looking at a number of developments for the future, which were not possible before CoDE. "With the spread of our offices and people going out to visit our assets, hotdesking is an option, with someone from Singapore downloading their files to Reading, for example," says Lake. Though technically possible, full mobility with laptops is not an option yet due to security concerns.
Windows 2000 and desktop manageability
Manageability is a watchword that has been repeated endlessly around Redmond as Microsoft has developed Windows 2000. Its predecessor, NT4, was increasingly burdened by accusations of unwieldiness, particularly when rolled out across large, complex organisations.
Many new features tackle the issue of manageability head-on. For example, Windows 2000 automatically detects that a new machine has connected to the network. It also knows the system particularities of the individual user at that desk and can reinstall all the appropriate software without further human intervention or waste of time. In short, given sufficient bandwidth across the network, the entire desktop environment can be rebuilt remotely.
Another area in which NT4 failed was its inability to treat each employee on the network uniquely. Windows 2000 will be very much more amenable to sophisticated permissions and authority structures, since the so-called granularity of administration has risen sharply. The biggest single change in Windows 2000 - the introduction of Active Directory - relates directly to this issue. Previously, information from across the enterprise detailing users, computers, printers and other system resources was stored in a flat database, a structure that cannot easily mirror the complicated reality of corporate networks. In Windows 2000, Active Directory replaces this flat database with a hierarchical directory.
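The difference between the two structures is easy to show in miniature. In the flat model every resource sits in one undifferentiated list, so every query scans everything; in a tree, resources live under organisational units, so queries and delegated administration can be scoped to a branch. This is a toy illustration of the structural idea only, not the Active Directory schema or API.

```python
# Toy contrast between a flat resource database and a hierarchical directory.
# Department names and resources are invented for the example.
flat = [
    {"name": "pc-042", "type": "computer", "dept": "finance"},
    {"name": "laser1", "type": "printer",  "dept": "finance"},
    {"name": "pc-101", "type": "computer", "dept": "sales"},
]

tree = {
    "finance": {"computers": ["pc-042"], "printers": ["laser1"]},
    "sales":   {"computers": ["pc-101"], "printers": []},
}

# Flat model: every query must filter the whole list.
finance_kit = [r["name"] for r in flat if r["dept"] == "finance"]

# Hierarchical model: a department is a subtree, so scoping is structural --
# delegating administration of "finance" means handing over that branch alone.
finance_branch = tree["finance"]
```

It is this branch-level scoping that underpins the finer "granularity of administration" described above: rights can be granted over a subtree rather than over the whole database.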
Desktop management and the audit
A comprehensive audit of the network lies at the heart of good desktop manageability, according to Computer Research Consultants. Here are nine questions to ask when undertaking a desktop asset management audit:
- What is the scope of the project?
- What does the environment look like, ie consider networked PCs, stand-alone PCs, Macintosh systems, network servers, operating systems and platforms?
- How can systems be identified? This is a very important task and should not be taken lightly: good identification makes follow-up easy, whereas poor identification can make later tasks very difficult, if not impossible.
- What kind of control is sought in the management project, ie software usage, hardware statistics, configuration data, data file location, compliance issues, spread and range of peripherals, network components?
- When should the audit be performed (in or out of working hours)?
- Who should do the audit?
- What about software licence information, and how should missing information and illegal software be handled?
- How often should the process be repeated, by whom and should it be scheduled?
- How can the audit data be maintained, and by whom? Should this task be outsourced?
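Several of these questions come down to collecting the same core record per machine. A minimal, cross-platform starting point might use only the Python standard library to capture hostname, operating system and architecture; installed-software and licence data need platform-specific tooling (WMI on Windows, for instance) and are deliberately left out of this sketch.

```python
# Minimal desktop-audit record using only the Python standard library.
# Software inventory and licence checks need platform-specific tools
# and are omitted; field names here are illustrative choices.
import json
import platform
import socket
from datetime import datetime, timezone

def audit_record() -> dict:
    """Collect a basic hardware/OS snapshot for one machine."""
    return {
        "hostname": socket.gethostname(),
        "os": platform.system(),
        "os_version": platform.release(),
        "machine": platform.machine(),
        "audited_at": datetime.now(timezone.utc).isoformat(),
    }

if __name__ == "__main__":
    # One JSON line per machine keeps the results easy to collect centrally.
    print(json.dumps(audit_record()))
```

Run on a schedule and gathered to a central store, even a record this small answers the "what is out there, and when was it last seen" questions that the audit list begins with.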