Return of the bright idea

Old IT concepts never die, they just get rebranded. Danny Bradbury looks at seven technologies that were hyped as the next big thing but fell by the wayside and asks if there is a possibility of a comeback

There is nothing quite as cyclical as the IT industry. New concepts are overhyped and sink into what industry analyst Gartner Group calls the "trough of disillusionment". From there they either gain gradual acceptance, are repackaged as new, improved versions, or are simply abandoned. Here are seven also-ran concepts from the past decade that some hope may yet make it.

Push technology

What was it?
Push technology was designed to deliver content to networked devices from a central server. Initially targeted at consumer-focused content such as news and ticker symbols, push systems would deliver content based on end-user subscriptions, which was then displayed in various forms on a PC (including multimedia screen savers).

As the technology developed, companies hoped to use it for systems management, pushing software updates and patches to the client.

What happened to it?
Companies realised it was easier to bring Mohammed to the mountain. Why force updates on end-users who may not need them? And why subscribe to a system that sends you information using special software when you can sign up for an e-mail newsletter instead? Instead, users would instruct their software to go and grab updates when necessary.

Chances of making a comeback?
Limited. Push technology is still around in some system management applications, but for human-readable content it is all but dead. Instead, with the evolution of Really Simple Syndication (RSS) as the foundation for web logging and content syndication online, pull technology is gaining ground and promises to be the next big revolution on the web.

RSS is XML-based and is read using newsreader software that, some argue, will make the web as we know it less relevant. RSS users can simply point software such as RSS Bandit at a feed's URL and have the syndicated content dropped into the program for later consumption.
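The mechanics are simple enough to sketch. The Python fragment below parses a minimal RSS 2.0 feed of the kind a newsreader fetches from a feed's URL; the feed content and addresses are invented for illustration:

```python
import xml.etree.ElementTree as ET

# A minimal, hypothetical RSS 2.0 feed of the kind a newsreader
# such as RSS Bandit would fetch from a subscribed URL.
FEED = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example Weblog</title>
    <link>http://example.com/</link>
    <item>
      <title>First post</title>
      <link>http://example.com/1</link>
    </item>
    <item>
      <title>Second post</title>
      <link>http://example.com/2</link>
    </item>
  </channel>
</rss>"""

def parse_feed(xml_text):
    """Return (channel title, [(item title, item link), ...])."""
    root = ET.fromstring(xml_text)
    channel = root.find("channel")
    items = [(i.findtext("title"), i.findtext("link"))
             for i in channel.findall("item")]
    return channel.findtext("title"), items

title, items = parse_feed(FEED)
print(title)                      # Example Weblog
for item_title, item_link in items:
    print("-", item_title, item_link)
```

A real newsreader does little more than this on a timer: fetch each subscribed URL, parse the items, and show the user anything new - pull, not push.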


Application service providers

What was it?
Like push technology the application service provider model was going to revolutionise the web, especially for small businesses that could not afford expensive hardware and software purchases. Companies would be able to rent software applications online and their data would be held on the same remotely managed servers, leaving them with an easily manageable monthly fixed fee.

What happened to it?
The dotcom industry. Inexperienced companies tried to cash in on the ASP revolution, leading to poor service levels, patchy support and fly-by-night operations.

Small businesses were also worried about the security implications of hosting their data off-site. The limited customisation options in some of the software also made it less relevant to potential customers. Interest dropped off and projects were canned. When the IT recession hit, the game was up.

Chances of making a comeback?
Customer relationship management suppliers are trumpeting the ASP model again, with companies such as Siebel and Microsoft offering CRM solutions for online rental to small and medium-sized firms. BT offers a range of hosted solutions (including Siebel's) and many large suppliers are relying on their industry reputations to rebuild customer confidence in the model.

But many of the old problems still exist. We asked a UK spokesperson for Netsuite, an Oracle-backed CRM ASP, about service level agreements, and he could not tell us anything because everything for UK customers is handled by the US office.

Unified communications

What was it?
Would it not be useful to get your voicemail, e-mail and faxes on your PC so you could leaf through all your communications wherever you were, via the internet? That was the idea behind the unified communications concept. Using soft PBX technology, phone calls would be routed to a server and recorded before being made available to the end-user. Communications would be more intuitive and no-one would ever lose a fax again.

What happened to it?
The convergence of telecoms and IP was vital to this concept's success. It did not happen as quickly as it was meant to and too many corporate telephony and computer networks remained separate, says Mark Blowers, senior research analyst at Butler Group. Without the underlying platform in place, unified communications was a non-starter.

Chances of making a comeback?
This one looks promising. Voice over IP is finally catching on now that the carriers themselves have moved from circuit-based systems to packet-based connections. With companies such as Cisco selling IP phones, convergence is finally beginning to happen.

Technologies such as the Session Initiation Protocol are making it easier to merge different communication streams together on packet-based networks and, over the next five years, companies will find it easier to make the leap.
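As a rough illustration of why SIP helps: SIP sets up sessions with plain-text requests that look much like HTTP headers, which makes voice, video and messaging easier to handle on the same packet network. The sketch below assembles a minimal, illustrative INVITE request; the addresses are invented, and a real SIP stack would add further headers (Via, CSeq, Contact) plus an SDP body describing the media streams:

```python
def build_invite(caller, callee, call_id):
    """Assemble a minimal, illustrative SIP INVITE request.

    This is a simplified sketch: real user agents include more
    mandatory headers and a message body negotiating the media.
    """
    lines = [
        f"INVITE sip:{callee} SIP/2.0",
        f"From: <sip:{caller}>",
        f"To: <sip:{callee}>",
        f"Call-ID: {call_id}",
        "Content-Length: 0",
    ]
    # SIP, like HTTP, terminates lines with CRLF and ends the
    # header section with a blank line.
    return "\r\n".join(lines) + "\r\n\r\n"

msg = build_invite("alice@example.com", "bob@example.com", "abc123")
print(msg)
```

Because the whole exchange is readable text over IP, bolting a new stream type onto an existing session is a protocol negotiation rather than a rewiring job.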

In the meantime, instant messaging has appeared and any unified communications product worth its salt will need to take advantage of this.

Public key infrastructures

What was it?
PKIs are frameworks for the distribution and use of digital certificates. Users retain a private cryptographic key and distribute a public one. Like a digital wax seal, using one with the other enables users to prove that they created a document, and to verify that it has not been tampered with. Certification authorities such as Verisign distribute certificates and ensure lost or expired ones are revoked.
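The wax-seal idea can be partly sketched with Python's standard library. Note this shows only the tamper-detection half of a signature: a real PKI signature encrypts the document digest with the signer's private key, and verifiers decrypt it with the public key from the certificate - an asymmetric step omitted here because it needs a cryptography library:

```python
import hashlib

def digest(document: bytes) -> str:
    """Hash the document. In a real PKI this digest would then be
    encrypted with the signer's private key to form the signature."""
    return hashlib.sha256(document).hexdigest()

original = b"Pay GBP 100 to Alice"
signature_digest = digest(original)   # what the signer commits to

# Verification: recompute the digest and compare.
# A single changed byte in the document breaks the match.
tampered = b"Pay GBP 900 to Alice"
print(digest(original) == signature_digest)   # True
print(digest(tampered) == signature_digest)   # False
```

The private-key step is what lets anyone verify authorship without being able to forge it - and it is exactly that key-handling layer whose interfaces proved too unfriendly for ordinary users.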

What happened to it?
Although security geeks understood it, the interface for PKI was not intuitive enough for the average user, says Jay Heiser, chief analyst at security services firm TruSecure. Putting certificates onto smartcards is an alternative, but smartcards and readers are expensive, making them a niche medium and not suitable for the average e-commerce consumer.

Chances of making a comeback?
PKI is unlikely to make a huge comeback overnight for three reasons. First, the complex interface problems have not been resolved. Second, the problem of verification remains - commercial services do not want to make it too difficult to verify a customer's identity when signing them up because customers will stop using them, says Heiser.

Finally, commercial services such as online book and clothing retailers mostly consider passwords and credit card security codes to be secure enough for their purposes, he says.

Network computing

What was it?
The network computer was already an old concept when it was introduced in the mid-1990s. Holding data centrally on a monolithic server and manipulating it centrally was what computing had been based on since the beginning. But Sun introduced a new twist with Java, which it hoped would provide downloadable software components that could be run on the client.

In this way, the network computer would provide an alternative to fat Windows applications, in which 95% of the software's functionality sat on the client but was never used. In the new model, a network computer would only download the wordcount or picture drawing components of the word processor when they were accessed. The result? Cheaper desktop machines with fewer hardware requirements, lower power drain and a reduced cost of ownership.
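The download-on-first-use idea can be sketched as lazy loading. The component names below are hypothetical and the network fetch is simulated with a local constructor, but the shape is the same: nothing is loaded until the user actually reaches for it, and once loaded it is cached:

```python
class LazyComponent:
    """Stand-in for a network-computer component that is fetched
    on first use rather than shipped with the client."""

    def __init__(self, name, loader):
        self.name = name
        self._loader = loader   # would download code over the network
        self._impl = None       # not loaded yet

    def __call__(self, *args):
        if self._impl is None:  # first use: "download" the component
            print(f"fetching {self.name} ...")
            self._impl = self._loader()
        return self._impl(*args)

# A hypothetical word-processor component, loaded only when accessed.
wordcount = LazyComponent(
    "wordcount", lambda: (lambda text: len(text.split())))

print(wordcount("the quick brown fox"))   # triggers the fetch, returns 4
print(wordcount("hello world"))           # cached: no second fetch
```

The appeal for the network computer was that the client needed only this dispatch shell; everything else stayed on the server until asked for.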

What happened to it?
Customers liked their PCs too much. Unless employees were working in a very structured way (data entry clerks, for example), they needed the additional functionality. So companies took the best part of the network computing model - the browser - and ran it on a fat client.

Sun and IBM tried to make a Java-based thin-client operating system work in the mid- to late-1990s but it flopped. Instead, companies such as Microsoft and Citrix created Windows-based thin client systems that accessed central server-based applications.

Chances of making a comeback?
Over the next few years thin clients will grow to 4% of desktops, says Butler Group analyst John Holden. Although that is a lot of clients, it is still a small chunk of the market. There is hope for the thin-client market in mobile computing, as smartphones become more prevalent and their small memory footprints require software components to be downloaded piecemeal. This is one area that Sun Microsystems is relying on to drive the thin-client computing model forward.


Third-generation mobile networks

What was it?
Third generation mobile networks were going to revolutionise mobile communications. Smartphones and high-bandwidth devices would enable PC-quality web surfing, videoconferencing and ubiquitous computing where people were always connected, wherever they were.

What happened to it?
A round of expensive licence auctions put mobile network operators on the spot. To miss the 3G goldrush would have been commercial suicide, but if they wanted to license the bandwidth they had to pay. Technical challenges also made 3G roll-outs difficult.

Julie Ramage, senior analyst at telecoms watcher Analysys Research, used to monitor estimated 3G roll-out dates. "Suppliers kept putting them back. Then most operators stopped reporting any advertised launch dates," she says. The services that have been launched have been plagued by disappointing handsets with poor battery life, she adds.

Chances of making a comeback?
It will happen. Orange plans its UK 3G roll-out later in the year and Vodafone is providing some services now. But operators face a long wait. Analysts expect 3G revenue to lag behind 2G (GSM) until 2007, and it will not pull ahead of 2.5G (GPRS)-based revenue until 2008. Increasing the average revenue per user relies on value-added digital services, rather than commodity airtime, so the time lag is going to hit the mobile operators hard.


Videoconferencing

What was it?
Videoconferencing was meant to make business travel less necessary, enabling people to conduct meetings from the office and reduce both commute and travel times. Originally restricted to roll-around units using several ISDN lines strung together for broadcast-quality signals, videoconferencing was a boardroom phenomenon. As PC power and internet connection speeds increased, the idea of desktop videoconferencing emerged.

What happened to it?
A mixture of high prices, poor interfaces, inadequate quality and technological glitches conspired against videoconferencing. Martin Langham, practice leader for content management and collaboration at Bloor Research, says companies found it difficult to keep all the ISDN lines operational when using expensive boardroom systems. Meanwhile, on the desktop, image quality rarely reached the necessary 24-30 frames per second. Audio/videoconferencing remains difficult to set up, even today, thanks to corporate firewalls and unintuitive interfaces.

Chances of making a comeback?
Things are improving. Desktop cameras from companies such as Apple and Polycom use better lenses with autofocus and handle image processing in hardware. Instant messenger software is making it easier to get past the corporate firewall and the prospects for inexpensive videoconferencing of reasonable quality are improving.
