Is P2P ready to do business?

Peer-to-peer is the latest IT buzzword. Ken Young gets behind the hype to see whether it is ready to deliver

Peer-to-peer technology offers the promise of faster, cheaper and more flexible computing, but pressing issues such as security and access need to be addressed before it can really deliver.

Spurred on by the technological success of the Napster music file exchange, IT suppliers are developing products that seek to harness peer-to-peer (P2P) architectures for enterprise networks. This latest holy grail of computing is becoming a key focus for Sun, Microsoft, IBM, EMC and Intel, as well as industry hotshots such as Netscape founders Jim Barksdale and Marc Andreessen, and Ray Ozzie, the founder of Lotus Notes.

Most IT managers are wisely shoring up their defences against the expected tidal wave of supplier hype. To date, very little is known about the management or cost implications of using P2P computing, whether across internal or public networks.

Mike Sayers, chief technology officer at Reuters, has identified two advantages that a P2P architecture provides over and above traditional client-server models: improved information-sharing capabilities and the opportunity to employ spare or dormant processor capacity. "Both of these deliver increased cost-effectiveness to the business," he said. However, Sayers is equally aware of the downside of migrating to P2P or a hybrid architecture. "The future success of commercial P2P depends on solving the trust and control issues," he explained.

"There are clear concerns about security and manageability," said Tim Jennings, senior research analyst with Butler Group. "If you look at the P2P offerings at the moment, the issue is what degree of control we are going to have to apply. Clearly, if the P2P is working behind a firewall, that simplifies the issue somewhat, but extending beyond the organisation has significant management issues."

But the commercial incentive to overcome such technical obstacles is strong. Stephen Dall, partner with management consultancy Accenture, said, "For corporates, the savings created by tapping PCs to share power are enormous and the promise of organic growth is lucrative. It can also allow firms to create new offerings. Citibank has introduced an online P2P payment service that allows transfer of funds between individuals for a range of personal business transactions, such as returning borrowed money."

However, this lies some way in the future for most IT directors, who will first have to evaluate the suitability of the peer-to-peer model for their enterprise. IT will need to clarify how suppliers are using the terms, as several definitions exist.

Data-centred is the "purist" school of P2P, where users search and access data and content held on other users' systems. The chief advantage is enhanced access to content and collaboration across workgroups. Compute-centred, by contrast, is a method of distributed processing where the main appeal lies in its potential to use spare processing capacity on the network.

The key difference is that the data-centred approach does not have a central directory. To avoid the chaotic distribution that would otherwise ensue, a dynamic index server is a must. This polices a set of rules, based on security levels or job functions, that govern access to content. Types of file and content can thus be blocked depending on corporate policy, as the sketch below illustrates. This model of P2P does not increase capacity but rather focuses on better access to content and collaboration. The non-stop availability of the index server is the prime resource issue.
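To make the index server's policing role concrete, here is a minimal Python sketch. All the names, roles and policy rules are hypothetical assumptions for illustration: the index holds no content itself, only a registry of what peers offer, and it filters search results against corporate policy before pointing one peer at another.

```python
# Hypothetical corporate policy: block certain file types outright and
# restrict content categories by job function.
BLOCKED_EXTENSIONS = {".mp3", ".avi"}
ROLE_ACCESS = {"finance": {"reports"}, "sales": {"reports", "leads"}}

# The index server's registry: (peer, category, filename) entries that
# peers announce. The files themselves stay on the peers' machines.
registry = []

def register(peer, category, filename):
    registry.append((peer, category, filename))

def search(role, term):
    # Answer a search with only the entries this role may see,
    # dropping anything the file-type policy blocks.
    allowed = ROLE_ACCESS.get(role, set())
    return [
        (peer, name) for peer, category, name in registry
        if category in allowed
        and term in name
        and not any(name.endswith(ext) for ext in BLOCKED_EXTENSIONS)
    ]

register("pc-042", "reports", "q1-sales.xls")
register("pc-107", "leads", "prospects.doc")
print(search("finance", "sales"))  # finance sees the report, not the leads
```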
In compute-centred P2P, where idle CPU clock cycles are allocated to processing applications, the main limitation is that it is only appropriate for calculations that can be broken down into smaller tasks. In the past this has meant that it has been consigned to heavy-duty number-crunching in the scientific arena, although corporate applications might include mining credit card or shipping data. Another example is the rendering of individual frames of 3-D movies.

One niche application of data-centred P2P used by retailers - including McDonalds, Burger King and River Island - is Lantastic P2P networking from Artisoft. The Lantastic operating system allows cash tills to be networked in a manner that ensures that if any fall out of the system, others can take over. The main savings are that the clients are thinner and there is no need for the Windows operating system.

However, Hilda Breakspear, UK infrastructure manager with book publisher HarperCollins, pointed out, "If you only need to share the odd printer or Internet connection between a handful of users, a full client-server network may be overkill. However, the management problems of administration, back-ups, reliability, training and security in a medium-sized company such as ours, with nearly 1,000 end nodes, mean that P2P is not an option."

Jennings believes that P2P will chiefly interest large enterprises that want to distribute shared content in new ways, particularly to mobiles and personal digital assistants. P2P holds the potential to make bandwidth usage more efficient by storing relevant files and content on users' hard discs.

Reuters is very excited about P2P's potential for capacity improvements. "We believe that it does have possibilities for better use of resources and we are closely watching Sun's Jxta development and Microsoft's parallel processing project," said Dave Parrot, senior analyst designer at Reuters.

In theory, P2P's efficiency means that demands on bandwidth should decrease. Yet equally, overall usage could increase as the availability of content increases across the enterprise. Tim Waterton, vice-president of marketing with performance management supplier Simulus, noted that integrating content from multiple formats is a big challenge and a drain on resources. "It is essential to ensure that data loading and request functions - feeding from and into the central repositories - perform efficiently. In addition, the corporation will need to have the resource capacity to support usage surges, as well as the exponential growth in the number of business transactions," he said.

Without agreed standards or a clear market leader, it is still too early for network management suppliers to have a clear view of P2P's implications. The IT department will probably want to regulate user access to P2P services, and bandwidth control devices may have to be implemented to allocate and monitor usage, particularly with respect to Napster-like services. "We were seeing an alarming increase in the use of P2P applications on our Internet links, and knew that it could become a significant problem for us," said Jim Bourn, manager of data communications at the University of Connecticut. The university installed the Packeteer Packetshaper to set policies to limit the use of P2P applications. "It is a much better use of resources," said Bourn. In the US a market for such devices is starting to emerge.

While Reuters remains optimistic about what P2P can do for its bandwidth, the information services provider is worried about how to secure digital copyright. "Our biggest concern is the implication for intellectual property," said Parrot. This loophole was exposed in the Napster case, and firms will certainly be looking for clear guidelines from standards bodies, as well as for encryption to ensure their content is not pirated or misused.
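As a hedged illustration of the kind of protection involved, the Python sketch below encrypts content symmetrically before it is stored or shared, using the third-party cryptography package (pip install cryptography). It is an assumption for illustration only, not how Reuters, Groove or any supplier actually protects content.

```python
from cryptography.fernet import Fernet

# In practice the key would be held per user or workgroup and managed
# separately; it must never travel with the content it protects.
key = Fernet.generate_key()
cipher = Fernet(key)

document = b"Confidential market analysis"
# The token is safe to write to a peer's disc or send over the wire:
# without the key it is unreadable.
token = cipher.encrypt(document)

# Only a holder of the key can recover the plaintext.
assert cipher.decrypt(token) == document
```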
Security is probably the issue that most exercises the critics of P2P. By its very nature, P2P encourages a more open, distributed approach to content and file sharing, and this threatens to compromise corporate policies on security.

Groove Networks, set up by Ray Ozzie, the man credited with creating Lotus Notes, has a P2P offering that promises to harness the peering model for a more collaborative form of working and knowledge sharing. With Groove, all data moving between clients is not only encrypted in transit, but also encrypted to disc, so that if a laptop is stolen the data cannot be used. Groove also stresses that it is not possible for other users to modify information held anywhere else on the network, even if they can gain access to it.

But the model that Groove espouses may concern some IT managers. Groove talks of "innovation at the edge of the network", referring to the fact that personal control increases with such a network. In fact, it increases to the extent that where previously collaborative working and file sharing were controlled centrally, they can now be initiated by individual users on an ad hoc basis. This is clearly more efficient, but the security implications are many and various.

Groove believes progress on this front may depend on moves towards a general-purpose platform for P2P. This platform could include a variety of services such as application development capability, user identity and awareness, information security, information transfer and synchronisation, and task co-ordination. It remains to be seen whether such a platform will be delivered by one supplier, by the emergence of de facto standards, or by a combination of both.

Virus threat

Views in the anti-virus industry on the threats posed by peer-to-peer are mixed. McAfee is working on a P2P product, although it does not believe that risks are high unless the P2P architecture is linked to operating systems in a similar way to Microsoft Outlook. Recently a proof-of-concept computer worm was released for the Napster alternative Gnutella, demonstrating the potential for a successful virus. In contrast, anti-virus firm Sophos said that no special code is required and that existing anti-virus protection is adequate.

While a growing number of firms purport to offer infrastructure tools for building P2P services, Jennings said, "P2P is not a protocol but a style of working. I think we have yet to see a supplier come up with the first real application that offers us something new. I think a mobile collaboration application is the most likely."

Nonetheless, the inevitability of suppliers spinning P2P as product and the diminishing in-house resources of most IT departments point to the emergence of a raft of outsourced services in this field. Groove admitted that, "Peer computing tools and skills are scarce in comparison to the technology and talent available for Web applications." In the UK that is probably an understatement, with few people aware of peer technologies, let alone training to upgrade their skills.

The dearth of skills may serve to sharpen the cynicism of IT directors who have seen it all before. "New ways of utilising spare capacity have to be a good idea, but my concern is that there are more tools out there than there are people to use them," said John Handby, interim CIO with CIO Connect.
"It makes me wonder if the world needs another new technology. We could be making better use of existing Internet technology. "It only takes one lateral thinker to come up with something that could create all kinds of trouble. We have seen that happen enough times before." The outsourced model is more likely to prevail where the goal is better access to content, which is largely held externally. Netscape founders Barksdale and Andreessen recently teamed up again to promote a firm called Zodiac Networks that is likely to operate in this arena. Zodiac aims to create services that will enable corporate use of the Internet to be optimised through more effective storage and caching of required content using the P2P model. In a typical arrangement a user selects a range of required information, which is then stored locally and is available to others. This saves the time it takes for multiple users to download from the Internet each time. One of the key issues with outsourcing is that most corporate users of P2P will require some central registration and authentication. It is likely that firms will want this to be an in-house activity rather than something held on external servers operated by third parties. End-user companies also need to be wary of implementing architectures that require them to sign contracts that bind them to one supplier for long periods of time. Last year, Gartner lambasted Oracle for its power unit pricing scheme that would probably increase costs for users with little possibility of renegotiation. It also seems logical that ISPs and ASPs will begin to offer forms of P2P infrastructure as they continue to seek to deliver higher value services. This will once again raise the issue of service level agreements in relation to such offerings. Recent research by Klegal, the legal arm of KPMG, found that service level agreements with ASPs lacked the rigour required to ensure the kind of service that leading firms expect. Few contracts examined had properly taken into account issues relating to liability and movement of servers when contracts change, for example. Parrot reflected a widespread view when he said, "You have to be looking at P2P. You have to consider how you might use it internally as well as externally to deliver new or existing products to your customers. It is clear that at the moment there are lots of issues of concern relating to standards, trust and interoperability but we are confident that it is a technology that has an application or many applications." With giants such as Microsoft working on projects to link millions of computers to create a global P2P network and Intel developing a Virtual Private Web that will harness P2P in a bid to remove problems of using the existing Internet, there is the possibility that P2P will deliver a new architecture, and new ways of using the Web. In the meantime, IT directors must consider whether to develop a strategy now or wait a few years while such high-profile projects take shape. How does it work? File sharing This brings Napster-like capabilities to the corporate space, enabling different users to share files directly from their machines without needing to go through a server Process sharing This involves brokering spare CPU time on client machines across the network to complete tasks more quickly and make more efficient use of computing resources. What can P2P do for the business? Peer-to-peer is being touted as a revolution for companies and the way business is conducted. 
What can P2P do for the business?

Peer-to-peer is being touted as a revolution for companies and the way business is conducted. It is hailed as a panacea that will relieve network bottlenecks, enable collaboration within ad hoc workgroups, and unleash untold computing power from underused processors throughout an enterprise.

The technology is no newcomer to the IT department. In the 1980s Artisoft led what has now become known as the P2P market with a product called Lantastic, which even included extensions to allow a handset to be connected to the PC for making internal phone calls to similarly enabled colleagues. Lantastic's leadership was challenged when Microsoft released Windows for Workgroups 3.11, which had P2P capabilities, in the form of file sharing, built in. Users could determine which directory files they wanted to share with others.

The renewed interest in P2P stems from the ability to stretch peer interactions beyond the confines of the corporate network and out over the Internet. The possibilities are intriguing because, in theory, any device will have access to any data and could recruit idle systems on a remote network to give a boost to underpowered processors, such as those found in handheld PDAs or Wap-enabled phones, possibly turning the attached devices into thin clients.

From the processor manufacturers' perspective, they are well aware that chips long ago outstripped most users' needs and that, if the power spiral is to continue, new benefits have to be offered to prevent shrinkage of potential PC sales. Through P2P, any extra power can be "pooled" to allow compute-intensive tasks to be shared across the network, potentially turning the network into a device with the power of a supercomputer.
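The pooling idea rests on the constraint noted earlier: the job must decompose into independent chunks. In this Python sketch a local process pool stands in for idle peers; a real broker would dispatch the chunks over the network, and the matching rule here is purely illustrative.

```python
from multiprocessing import Pool

def count_matches(chunk):
    # Stand-in task: scan one slice of transaction records (think
    # mining credit card data) independently of every other slice.
    return sum(1 for record in chunk if record % 97 == 0)

if __name__ == "__main__":
    records = list(range(1_000_000))
    # Break the work into smaller tasks, one per "peer".
    chunks = [records[i::8] for i in range(8)]
    with Pool(processes=8) as pool:
        partials = pool.map(count_matches, chunks)
    # Recombine the partial results, as a P2P broker would.
    print(sum(partials))
```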

This was last published in May 2001
