What goes around comes around, or so they say. And nothing comes around as regularly as trends in the computing industry. The emphasis on 64-bit computing has renewed the emphasis on centralised computing, which was the backbone of the industry during its early years.
Now 64-bit servers are powerful enough to rival mainframes and retain the openness of Unix and Windows-based architectures. IBM, Hewlett-Packard and Sun Microsystems are pushing 64-bit computing to customers heavily, and are enabling companies to consolidate their distributed middle-tier servers into small numbers of central servers, or even a single box.
Gary Barnett, analyst with market research and consultancy firm Ovum, says that even though he is a specialist in distributed computing, he sees distribution as something companies have to do rather than something they should want to do. Monoliths are much better when you can get away with them, he says, because "server consolidation takes away the complexity and the hassle of administering multiple boxes".
The key driver for many companies' decision to move to 64-bit computing is the larger in-memory address space on offer, according to Simon Robertson, IBM's product manager for P-Series systems and Unix. Applications needing fast access to lots of data at once can get it, rather than having to use virtual memory - a procedure involving swapping data to and from disk and memory.
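The scale of that difference is easy to quantify with some simple arithmetic (a rough sketch; real operating systems reserve part of the range for the kernel and hardware):

```python
# Maximum memory directly addressable with a 32-bit vs a 64-bit pointer.
addr_32 = 2 ** 32  # bytes addressable by a 32-bit pointer
addr_64 = 2 ** 64  # bytes addressable by a 64-bit pointer

print(f"32-bit: {addr_32 // 2 ** 30} GiB")  # 4 GiB
print(f"64-bit: {addr_64 // 2 ** 60} EiB")  # 16 EiB (exbibytes)
```

A 32-bit pointer tops out at 4Gbytes; anything larger must be paged to and from disk, which is exactly the virtual-memory overhead Robertson describes.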
IBM has been selling 64-bit systems for some years, and its P-Series Unix range consists mostly of 64-bit models. IBM's adherence to the PowerPC, the processor that it developed with Motorola in the early 1990s, has enabled it to control the development of the processor's architecture, much as Sun has done with the development of its 64-bit UltraSPARC platform. Both companies started by selling at the higher end of the market and have since moved downwards.
Meanwhile Intel, with its base in 32-bit consumer and lower-end server processors, is still bravely trying to muscle in with Itanium, the 64-bit processor that it developed with Hewlett-Packard and shipped last year. Things are not looking very bright for servers based on the processor, which IBM and HP are selling in low numbers, mainly to testers and developers. Marc Bothorel, EMEA Itanium program manager at HP, expects the market to kick off with McKinley, the successor to Itanium, which ships early this year.
HP still has its own 64-bit architecture in PA-RISC, its established processor platform which is its bread-and-butter product in spite of its efforts to sell Itanium-based kit too. And Sun is selling 64-bit computing as a matter of course by putting its UltraSPARC processor in even its entry-level servers.
Sun still concentrates on selling a 64-bit version of its own Solaris operating system, and likewise, in spite of its £1bn investment in Linux in December 2000, IBM does not resell the operating system, choosing instead to support it with applications and Linux-compliant servers.
As the other top-three server supplier, HP is reselling Red Hat's Linux, which is now available in a 64-bit version, and it plans to sell a version of Debian (which it considers to be the purest form of Linux, according to Bothorel) for McKinley when it ships.
In the meantime, Microsoft has its 64-bit Windows available in both Windows 2000 Advanced Server and Windows XP client versions (with a Windows .net Server version shipping later this year). The client version targets scientific and engineering types.
At the server level, Bothorel is finding support among existing NT users who want to take advantage of larger SQL Server databases in RAM without having to move to another company's operating system platform. However, in the banking, technical and education sectors, "most people want HP-UX or Linux", he says.
Clearly, many users are buying 64-bit servers without being aware of what is under the hood; only the large scientific and technical number-crunchers will take full advantage of the processing power on offer. Those that do will find that it is not without its headaches. For one thing, high-speed interconnects will be needed more than ever.
Large, 64-bit servers will want to talk to their storage devices, and to other 64-bit monster servers too if a cluster is used. The amount of data being exchanged and the speed at which the server pumps it out means that the pipe has to get fatter.
The most promising connection technology here is InfiniBand, which promises to solve the complex interoperability problems that customers currently suffer from with Fibre Channel. The fact that InfiniBand will achieve a theoretical 6Gbyte/sec, against the 2Gbit/sec of current Fibre Channel, is a clear advantage. For companies wanting to take a more conventional route, iSCSI adheres to TCP/IP communications protocols, which will make it easier to integrate into existing networks.
Until more 64-bit applications are produced, the number of companies using this type of processor will remain relatively small, limited to those that have huge in-memory databases. Large 64-bit machines may also offer more muscle for running multiple instances of 32-bit programs, but 32-bit SMP systems are equally capable of this use.
Indeed, Microsoft recommends that you stick with a 32-bit version of Windows unless you're specifically planning on running 64-bit software because it warns that the translation involved in running 32-bit applications on a 64-bit operating system will incur an overhead.
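As an aside, a process can check at run time whether it is itself running as a 32-bit or 64-bit binary. In Python, for instance, the pointer size reveals the answer (a minimal sketch; the interpreter's bitness, not the operating system's, is what is reported):

```python
import struct
import sys

# The size of a native pointer ("P") in bytes, times 8, gives the
# bitness of the running process: 32 or 64.
bits = struct.calcsize("P") * 8
print(f"This process is {bits}-bit")

# sys.maxsize reflects the same thing: 2**31 - 1 on a 32-bit build,
# 2**63 - 1 on a 64-bit build.
print(f"sys.maxsize = {sys.maxsize}")
```

A 32-bit application reporting 32 here would be the kind of program that runs under translation on a 64-bit Windows, with the overhead Microsoft warns about.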
Barnett sums up the situation currently facing users: "64-bit won't radically transform the economics or the way that we develop applications anything like the change from the 286 to the 386 did." Which raises the question: why do customers buy 64-bit processors at all? The answer is simply that the larger vendors aren't giving them much choice.