White Paper: Thin client/server computing lets you take control

Thin client/server computing and Windows, together, form one of the lowest total cost of ownership solutions for enterprise users wanting to utilise email and the Web

A closer look

Thin client/server computing is a server-based approach to delivering applications to end-users. In this model, IT organisations deploy, manage and support the company's applications on a server. This architecture provides robust, secure access to any application across virtually any network, and enables an organisation to simplify IT administration and gain tremendous cost savings in user support and in the deployment and management of its applications.

For decades, mainframe systems delivered applications to users through "dumb" text terminals. Performance was adequate for the functions available, security was tight, and the complexities of managing databases and other IT administrative tasks were almost invisible to the average user. Despite these successes, mainframes fell far short in cost-effectiveness and the ability to display Windows-based applications. These limitations became more glaring as the pace of business accelerated. Client/server computing was the answer. It allowed companies to introduce new and updated functionality and applications quickly, could distribute more information to individual users, gave users a graphic interface, and offered cost-effective scalability to keep up with business growth.

In addition, client/server computing introduced democracy to computing by putting on users' desktops a complete system that could store applications and megabytes of data locally. This freed users to do large amounts of local processing, and knowledge workers enthusiastically tapped the PC's power for word processing, spreadsheet analyses, design, data analysis, and much more. Although a solid success, this computing model did present some issues: companies began to lose control of their data; many users did not need the power of the PC and struggled with its complexity; software updates and upgrades were often messy and expensive; and the total cost of ownership was surprisingly high.

In response, companies began to simplify their server infrastructure by creating a form of centralised client/server computing that consolidated their servers into a central location. This centralisation enabled companies to pull application and database servers into a more secure, more manageable environment, but it did nothing to help the end-user, nor did it assure data security or resolve system administration issues.

The rise of thin client/server computing

Clearly, the next step is to simplify the model of computing. Thin client/server computing is the evolutionary model that retains the benefits of client/server computing, but lowers the cost of ownership through simplification for end-users and IT departments. It helps individuals who do not require the expandability of a PC, and it helps system administrators simplify the task of managing the total environment.

Thin client/server computing allows IT departments to establish direct control over the essential parts of their IT environment. For example, applications and data are supported and managed at the server in thin client/server computing. This allows IT departments to update and upgrade software, perform data backups, test and deploy applications, and carry out other data centre activities efficiently, uniformly and securely across the environment, all without disrupting end-users.

Who should use thin client/server computing?

Thin client/server computing is ideally suited for "task-oriented" end-users. This broad term cuts across the organisation to encompass people in manufacturing, clerical, customer service, and some data analysis, technical and professional positions. The common denominator is that "task-oriented" end-users spend most of their time using a limited number of applications to perform data entry and lookup. They may use word processing and similar applications, but they primarily focus on accessing line-of-business applications. One should not fall into the trap of assuming this definition only applies to "simple" applications. In fact, the applications involved encompass such complex environments as Enterprise Resource Planning (ERP) systems and applications that access numerous databases.

A partial list of "task-oriented" end-users would incorporate such positions as:

  • reservation clerks for hotels, airlines, car rental firms etc
  • help desk operators
  • data entry/data lookup points in hospitals and nursing homes
  • retail point-of-sale
  • manufacturing shop floor workers
  • bank tellers
  • stockroom and shipping clerks
  • order entry clerks
  • stock-brokers
  • estate agents
  • insurance agents

A thin client is a desktop device focussed on delivering the lowest-cost, most secure, and most manageable graphic client solution. Recent years have seen the emergence of three thin client alternatives: the NetPC, the network computer (NC), and the Windows-based terminal. They differ in the amount of local application processing each supports, the capabilities of the operating systems, and the disk drive configurations possible. Most importantly, they differ in their philosophical approach to simplification and cost containment.

With NetPCs, processing occurs on the desktop, not on the server, making it a PC with centralised management capabilities. NetPCs are sealed desktop PCs with no user-accessible input/output devices (no floppy drive and no CD-ROM). Administrative functionality varies, but usually includes extensive capabilities to monitor and update a user's software and system configuration. Because NetPCs are sealed, users cannot introduce viruses, corrupt data or add new circuit boards by actions at the desktop, so data security is high and system administration is much easier than with standard PCs.

In concept, Java-based Network Computers (NCs) rely heavily on downloading small components of the applications upon user demand for local processing. NCs have many of the virtues of thin client/server computing: they are simple, low-cost devices, and they are easy for system administrators to manage from a central point. However, they depend on a still-developing technology, and implementing the architecture requires large up-front investments in designing, testing, and deploying applications as Java-based components.

Windows-based terminals are the "thinnest" of the thin client devices and hence offer the lowest cost and best manageability of all of the solutions. They rely on a 100 per cent server-based model of computing, where all application processing and storage is conducted at the server. Windows-based terminals are essentially intelligent display devices that present a graphical user interface (GUI) to the user. They pass all user keystrokes and mouse clicks directly to the server for processing. The server, which runs a separate session for each user, receives and interprets the keystrokes, then performs all application processing, including retrieving and updating data. The results go back to the Windows-based terminal for display, resulting in an extremely simple and efficient process.
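
To make that interaction pattern concrete, the toy sketch below models the division of labour in plain Python: the terminal only forwards input and paints whatever comes back, while all application state and processing stay in a per-user session on the server. It is an illustration of the concept only, not the actual Terminal Server or Citrix wire protocol, and all class and variable names are invented for the example.

```python
# Conceptual illustration only: a toy, in-process model of the interaction
# pattern described above. Input goes to the server, the server runs one
# session per user, and only display output returns to the terminal.

class ServerSession:
    """One per connected user; all application state lives here, on the server."""
    def __init__(self, user):
        self.user = user
        self.buffer = ""   # stand-in for application data held server-side

    def handle_input(self, keystroke):
        # All processing happens on the server; the terminal never holds the data.
        self.buffer += keystroke
        return f"[{self.user}] screen shows: {self.buffer}"   # the "screen update"

class WindowsBasedTerminal:
    """A display-only device: forwards input, paints whatever comes back."""
    def __init__(self, session):
        self.session = session

    def key_pressed(self, keystroke):
        screen_update = self.session.handle_input(keystroke)  # round trip to server
        print(screen_update)                                  # terminal only displays

if __name__ == "__main__":
    session = ServerSession("clerk01")        # created and managed on the server
    terminal = WindowsBasedTerminal(session)  # thin device on the desk
    for key in "ORDER":
        terminal.key_pressed(key)
```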

Thin client/server computing with Windows-based terminals: the next step forward

Windows-based terminals are low-cost, graphical desktop devices that can display applications from a Windows NT server or can access UNIX and legacy applications through emulation software. They have no local operating system beyond a small one preloaded in internal firmware. This OS communicates with a server running multi-user software such as the recently released Windows NT Server, Terminal Server Edition or Citrix WinFrame (for Windows NT 3.51). They do not support a hard drive, but do allow users to attach a floppy drive or a local printer to serial and parallel ports on the back of the unit.

The server (or network of servers and disk arrays) holds 100 per cent of the application software and all data. The software accessible through Windows-based terminals and the multi-user server software encompasses essentially all Windows-based applications, including such standard applications as word processors, spreadsheets, email, and Internet/intranet browsers. The result approaches one of the computing industry's most important goals: low-cost universal access to applications located throughout an organisation.

Benefits of Windows-based terminals

International Data Corporation forecasts that Windows-based terminals will achieve widespread acceptance in the marketplace, growing at a compound annual growth rate of more than 100 per cent from 1998 through 2002.

Windows-based terminals are popular because of their ability to:

  • Simplify and improve IT administration through better system-wide control
  • Lower total cost of ownership per user
  • Increase data integrity and security
  • Lengthen desktop longevity
  • Centralise resources

We will now examine these five benefits in greater detail:

Simplifies and improves IT administration

Windows-based terminals require a server-centric architecture that centralises resources, applications and data. This centralisation leads directly to a wealth of productivity improvements and cost savings. For example, upgrades and backups are easier and more consistent with IT practices because IT has direct control of all applications and data. Company-wide practices for security, virus protection, disaster recovery and Internet access are all made easier to plan and implement. Another example is that deployment of Windows-based terminals is fast and straightforward, especially since most companies will implement a standardised desktop configuration. This same uniformity also greatly simplifies help desk support, making it both more efficient and more effective. Moves, adds and changes are also easier.

The list goes on and on. The key point is that Windows-based terminals simplify administration and help create a more controlled environment that supports excellent data integrity, data sharing, and business continuity.

Lowers the total cost of ownership

The total cost of ownership (TCO) for a networked PC has four components: end-user operations, administration, technical support and capital. Gartner Group further divides these four components into asset management, formal and informal end-user training, security administration, new software and updates, moves-adds-changes (MAC), help desk, configuration planning and review, and some 50 more subcategories.
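
As a simple illustration of how these components add up, the sketch below totals hypothetical annual per-seat figures for the four TCO components; the numbers are placeholders invented for the example, not Gartner Group data.

```python
# Hypothetical, illustrative figures only (not Gartner data): annual per-seat
# cost estimates for the four TCO components named above.
tco_components = {
    "end-user operations": 3500,   # placeholder annual cost per seat
    "administration":      1200,
    "technical support":   1500,
    "capital":             1000,
}

total = sum(tco_components.values())
print(f"Estimated annual TCO per seat: {total}")
for name, cost in tco_components.items():
    print(f"  {name:20s} {cost:6d}  ({cost / total:.0%} of total)")
```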

Projected savings depend on the mix of applications and other factors, but there is agreement that the largest savings come from dramatic reductions in end-user operations costs because Windows-based terminals are so simple to use and are centrally managed. With Windows-based terminals in place, end-users get the full benefits of a graphic user interface, but without some of the complexities normally found in a PC. There is no software to store or update, no local data to secure, fewer parts to configure and repair, fewer opportunities to tinker with the device and much less training to reach and sustain maximum productivity.

For administrators and IT departments, Windows-based terminals allow easier and more effective user support, simplified configuration and repair, secure data in a centralised facility, faster and more uniform software updates, and a way to achieve consistent standards and policies. Decreasing the number of supported platforms results in initial purchase and installation cost savings, and in many cases IT costs are reduced.

Increases data integrity and security

System administrators will tell you that users will do amazing things with desktops. Unfortunately, these things may include such dubious acts as introducing viruses from floppy disks, loading multiple versions of a standard company application, or establishing local data stores that become out-of-date. The restricted functionality of Windows-based terminals eliminates the opportunities that lead to such difficulties. Specifically, the terminals prevent users from storing data locally or from introducing new data or software except through keystrokes within a standard application. The result: lower risks and less time spent undoing problems.

Lengthens desktop longevity

Windows-based terminals offer superior investment protection from technical obsolescence when compared with PCs, NetPCs, and NCs. Why? Because there is so little in Windows-based terminals to upgrade. And when a change does occur, administrators can make the update at the server and deliver it over the network. Predictions are that the upgrade cycle is four years or more, longer than that of any other graphic desktop device available.

Centralises resources

This operates on two levels. The obvious one is hardware-related: the servers and disk arrays supporting Windows-based terminals are usually in a centralised location for ease of administration. This means users can share storage, memory, and processing power. On a deeper level, the company's data is also centralised and shared among users, thereby creating an environment that accelerates business processes and improves information sharing and co-ordination.

Issues to consider

Achieving optimal benefit from thin client/server computing encompasses everything from proper sizing of servers and disk storage to network bandwidth and IT procedures and policies. Developing and implementing such total solutions are familiar tasks to IT departments of companies coming from the mainframe and UNIX worlds. However, companies that approach thin client/server computing from a PC-oriented perspective may find that they need to incorporate more robustness and assure more consistency across these areas.

Server performance plays a crucial role

The server delivers applications to end-users in one of two ways:

  • The server performs application processing and sends the results to the user
  • The server downloads applications or components to temporary cache on the thin client device for short-term use by the end user

In either case, the server repeats this delivery process periodically as end-users perform their tasks. If the server is unavailable or unresponsive, the end-user cannot continue using the application until the server becomes available. Clearly, the potential for business disruption is high if the server does not perform to end-user requirements.

There are many ways to evaluate performance and set requirements, but two measures stand out: server availability and response time for end-users. Availability is a measure of per cent uptime for the system. In thin client/server computing, minimum availability requirements depend on the criticality and timeliness of the tasks being done by the company's end-users. Because thin client/server computing solutions often involve real-time data entry and data lookup, they generally fall into the category of business-critical systems. Response time for end-users is the length of time between sending a request to the server and receiving the appropriate information at the desktop.
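
To give a feel for what an availability figure implies, the short sketch below converts a per cent uptime target into the downtime it permits over a year; the targets shown are illustrative examples, not requirements stated in this paper.

```python
# Converts an availability target (per cent uptime) into the downtime it
# permits over a year. The targets below are illustrative only.
HOURS_PER_YEAR = 24 * 365

for availability in (99.0, 99.5, 99.9):
    downtime_hours = HOURS_PER_YEAR * (1 - availability / 100)
    print(f"{availability:5.1f}% uptime -> about {downtime_hours:6.1f} hours of downtime per year")
```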

An overloaded server may have excellent availability, yet not meet user requirements because it supports too many users, runs compute-intensive applications, or encounters bottlenecks accessing disk arrays. Further complicating an evaluation is the fact that response time is not a one-time measurement: it requires continual reassessment. For example, a growing business may find its high-performing environment changes into one with poor response time as the number of users and total demand increase. The best solution is to select and install scalable servers that provide an easy, cost-effective path for IT to add more of the overloaded component: processors, memory, or disk arrays.

Availability

Windows-based terminals require an environment of highly available servers. Providing the necessary service levels requires a comprehensive approach, beginning with the design of the server infrastructure and of data management strategies. It is essential that the servers and disk arrays be of the highest quality. These systems must have a solid track record for reliability, and they must contain both redundant and hot-swappable components within the server chassis as further insurance against server outages. Another key pillar in an availability strategy is implementing a service and support contract with a service vendor that is stable, knowledgeable, and able to deliver active services as well as to provide a guaranteed response time for repair.

Manageability

Windows-based terminals' support for network and system management tools can be one of the biggest differentiators between thin client offerings. Highly desirable features in a thin client allow system administrators to:

  • Identify attached clients and their configuration
  • Monitor status of attached clients
  • Control upgrades and changes remotely
  • Send on-screen alerts and messages to users

End-user performance

Studies show that end-user performance depends most heavily on sizing the server-to-user ratio appropriately, and that other factors such as network bandwidth normally play minor roles. Items to consider when sizing a server include the number and type of applications being supported, the number of users and their server-access rates, expansion plans, and more. From this analysis, one can determine the most cost-effective number of Intel-compatible processors and amount of memory to put into the server, while leaving the right amount of room for growth, either in users or in the number of applications supported. The application mix in particular is difficult to assess, but has a direct impact on server duty ratios and on tuning the operating environment parameters. Transaction-oriented applications, word processors, and email all place significantly different requirements upon the server. There are no set formulas to follow, and locating a partner with prior experience can pay big dividends.
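
As a rough illustration of the sizing exercise, the sketch below estimates how many concurrent sessions a server might support from its installed memory. Every figure in it (base OS footprint, per-session memory, growth headroom) is a hypothetical placeholder, since real sizing depends on the application mix, access rates and growth plans described above.

```python
# A rough, illustrative sizing estimate for a multi-user server. All figures
# are hypothetical placeholders, not recommendations from this paper.
base_os_memory_mb = 128       # memory reserved for the OS and services
memory_per_session_mb = 20    # assumed footprint of one user session
installed_memory_mb = 1024    # total RAM in the candidate server
growth_headroom = 0.25        # keep 25% of capacity free for growth

usable_mb = (installed_memory_mb - base_os_memory_mb) * (1 - growth_headroom)
supported_users = int(usable_mb // memory_per_session_mb)
print(f"Estimated concurrent users per server: {supported_users}")
```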

Conclusion

Thin client/server computing is a server-centric approach to deploying, managing and supporting applications. The server's reliability, manageability, and scalability play crucial roles in setting user satisfaction and establishing a solid return on investment. There are some infrastructure issues that must be carefully addressed, starting at design and implementation, but an experienced technology partner can help companies work through these issues and optimise the benefits of thin client/server computing. Windows-based terminals are attracting more and more attention, and fuelling the move to thin client/server computing. Their appeal is their simplicity and their suitability for so many situations. As a result, Windows-based terminals are a powerful tool to achieve solid reductions in a company's total cost of ownership for desktop computing without lowering the productivity of end-users.

Compiled by Mike Burkitt

(c) Hewlett-Packard Corporation 1998
