It is amazing how off-track science fiction can be when predicting the future. Films made in the early 1960s and set in the late 1980s showed people whizzing around in flying saucers and wearing silver jumpsuits, while any 1950s book trying to depict life in the year 2000 generally described citizens using levitating cars and living in houses in the sky.
While our feet - and our cars - have remained firmly on the ground, technological progress has had a huge impact on society in other ways. However, these changes took a little longer to get going than people first predicted, according to John White, CEO of the Association for Computing Machinery (ACM), an industry body promoting scientific and educational computing, founded in 1947. White joined the ACM in 1999 after spending 18 years at Xerox PARC, the research group set up by Xerox, now famous for its technological innovation but infamous for its inability to commercialise it.
White argues that computing had little direct impact on daily life until the advent of the microcomputer, a hobbyist market in the late 1970s that exploded into the business world with the arrival of the PC in 1981. At this point, computers were finally put in front of employees, and presented as tools to minimise their workload and maximise their output.
At least, that was the idea. Paul Ceruzzi, a curator at the Smithsonian Institution's National Air and Space Museum, and author of several books on the history of computing, says that the PC revolution went largely unrealised until networking really took off. "I said in 1991 that computers were wonderful but they still could not communicate with each other," he says. Since then, adds Ceruzzi, the revolution in communications has exceeded his expectations.
What is interesting is the way in which the computing industry has fallen short of certain expectations, says Ceruzzi. "2001: A Space Odyssey will come of age this year," he says, recalling the film of Arthur C Clarke's book, in which an artificially intelligent computer [HAL] begins to make decisions on behalf of its masters. "Everyone is thinking of voice recognition but we do not have the same type of human understanding that HAL had. We are not very near it, and no-one seems very optimistic." Yet clearly, it was imagined that this would happen in the real world - replace each of the letters in HAL with the one immediately following it in the alphabet, and what do you get?
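Clarke's parlour trick is easy to check in code. Here is a minimal sketch in Python (the function name and the wrap-around behaviour are our own choices, not anything from the article):

```python
def shift_letters(word, offset=1):
    # Replace each capital letter with the one `offset` places
    # later in the alphabet, wrapping around from Z back to A.
    return "".join(
        chr((ord(c) - ord("A") + offset) % 26 + ord("A")) for c in word
    )

print(shift_letters("HAL"))  # -> IBM
```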
Indeed, as Ceruzzi says, technology has progressed in an unpredictable fashion so far. For example, with notable exceptions (the miniature computers that doubled as communicators in Star Trek spring to mind), many depictions of future computer technology in the science fiction of yesteryear involved huge boxes with flashing lights and whirling tapes. These were replicas of computers that already existed. In fact, computer formats have changed dramatically. "You can have tiny things that can be implanted in people's skin. Each thing will have its own Web address. That is not science fiction, that is happening now," says Ceruzzi. Although, to be fair, Kevin Warwick, the UK science professor who has made a name for himself by embedding chips into his body so that it can communicate with computers in his office and house, has attracted as much ridicule as admiration.
While science fiction such as that by William Gibson explores the possibilities of merging the body with the machine, real-world organisations are trying to move technology forward as fast as they can. The signs also indicate that development cycles are getting shorter. Technology generally develops in waves: at the peak of a wave a new innovation appears, still under-utilised and generally expensive. As the technology becomes a commodity, everyone concentrates on commercialising it, creating a trough in which the industry's energy is focused on refining an existing technology. As the commercial and technical possibilities of the technology mature, many start looking for new technologies with higher margins, and a new wave begins.
In recent years, this seems to have been happening more quickly. Technology developments in the microprocessor industry, for example, are moving ahead at an amazing pace, in line with Moore's Law, which states that the number of transistors on a processor doubles roughly every 18 months. As they approach the physical limits of fabrication technology, companies are considering moving to new processes such as X-ray-based fabrication. "You can even anticipate a research breakthrough, which sounds impossible when you describe it, but in fact computer manufacturers account for increases in processor speeds," says Ceruzzi.
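The doubling rule compounds quickly, which is why the pace feels so relentless. A back-of-the-envelope sketch of the arithmetic (the starting figure below is purely illustrative, not taken from any real processor):

```python
def transistors(initial, months, doubling_period=18):
    # Moore's Law as popularly stated: transistor counts double
    # roughly every 18 months.
    return initial * 2 ** (months / doubling_period)

# An illustrative part with 1,000 transistors quadruples in three years.
print(transistors(1000, 36))  # -> 4000.0
```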
Part of the reason for this is that researchers have to be more accountable in the commercial environment. White believes that traditional pure research is under threat because research operations these days are under pressure to deliver commercial goods. "Internet start-ups and others are having to deliver innovation in reduced time cycles, so what is the role of long-term research in a competitive environment?" he asks. "Many organisations that had established research centres are wondering what to do with them. It really redefines the game."
Nevertheless, some believe that commercial research will be responsible for the next quantum leaps in computing technology. One such company is Hewlett-Packard, which has been conducting research into molecular storage that, HP scientists believe, will deliver consumer products within the next 10 years. Both commercial and academic developments are contributing to an unprecedented acceleration in technology development within a number of key areas.
Over the coming months, Computer Weekly will examine different areas of technology in detail to find out what you can expect in the coming years - and when.
Analysts continue to run riot predicting ridiculously steep growth curves for new technologies, and in some cases they are accurate, but such growth curves have become a cliché in the IT industry. The truth is that it is difficult to predict with any accuracy how accepted a certain technology will become, especially as new and explosive industry developments can catch vendors off guard. Nevertheless, it is possible at least to look at what is going on in the labs now, and to make predictions about if and when these developments will become commercially viable. It is going to be an exciting series.
Look out for The Future of IT: Part II in two weeks' time
Fifty years of ICT - the highlights
According to the ACM's John White, the following are the three most significant developments in computing technology in the last 50 years:
The introduction of the PC
Prior to the PC, computing was available through time-sharing systems, but it was not available to employees on an ad hoc basis, and certainly was not universally available in the home. Consequently, it was used in large-scale data processing, but the social impact was indirect, resulting in, for example, more efficient processing of customer records or social security information. The creation of personal computers and related software enabled individuals to process information and carry out tasks more rapidly. The spreadsheet and the word processor are two of the most important software developments in the history of computing.
Distributed client-server computing
The problem with early personal computers was that, at best, they only provided access to information on the mainframe. What was needed was a more localised server that could distribute information to PCs and also share the workload with them. Then the PC could finally be used to process data provided by the back-end servers. Distributed client-server computing was rapidly adopted by the industry. But White still does not think the true productivity potential of the PC client was realised until the next major landmark:
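The division of labour behind client-server computing (the back-end server supplies raw data, the PC client does the processing) can be sketched with a toy example; the record values and the comma-separated wire format here are invented purely for illustration:

```python
import socket
import threading

# Back-end "server": holds the raw records and serves them on request.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))   # port 0: let the OS pick a free port
server.listen(1)
port = server.getsockname()[1]

def serve_records():
    conn, _ = server.accept()
    conn.sendall(b"42,17,99")   # invented records, for illustration only
    conn.close()

t = threading.Thread(target=serve_records)
t.start()

# "PC client": fetches the raw data, then does the processing locally.
client = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
client.connect(("127.0.0.1", port))
raw = client.recv(1024).decode()
client.close()
t.join()
server.close()

total = sum(int(x) for x in raw.split(","))  # client-side processing
print(total)  # -> 158
```

In a real deployment the server would, of course, be a separate machine on the network; running both ends in one process simply keeps the sketch self-contained.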
The widespread adoption of the Internet
The Internet, which has been around since the late 1960s, finally became commercially significant in the mid-1990s and quickly grew into a standard means of communication. The industry-wide acceptance of the TCP/IP communications stack, which standardised communications at most levels of the old Open Systems Interconnection (OSI) model, facilitated this and was therefore one of the most important developments of the last 20 years.
The Future of IT series will explore the different areas of technology in which significant progress is being made, outlining what developments are expected, and when. We will also examine the impact that they are likely to make on our working lives.
Networking technology is already undergoing a revolution with the introduction of quality-of-service technology such as DiffServ and Multiprotocol Label Switching (MPLS). But some pundits are anticipating further developments in the area of self-configuring networks that create and destroy their own logical subnets based on traffic patterns. And there are other significant areas of progress, including the slow development of 2.5G and 3G cellular networks, and the related issue of pervasive computing, in which networked services are accessible from a variety of devices over different communications media.
Companies have been pushing the envelope in the microprocessor market by using lithography techniques to create chips with ever-smaller feature sizes. As they reach the limits of physics, companies are exploring techniques such as X-rays and other wavelengths of light to create even smaller electrical components on silicon. Moving even further ahead, technologies such as molecular and quantum computing are appearing on the horizon.
Display technology is moving beyond the simple cathode ray tube as LCD screens become increasingly common. But other display technologies are nearing commercial availability, including bendable displays on light-emitting plastic, digital paper that acts like conventional paper but can be wiped clean and rewritten, and even retinal projection glasses.
With security becoming an increasingly important issue in corporate networks and on the Internet, vendors are trying to find different ways to authenticate individuals. Fingerprinting was one of the first biometric techniques, but others such as retinal scans are making an appearance, even being trialled in bank cash machines.
Things have come a long way since the early days of DOS. Simple GUIs gave way to more complex interfaces in which files could be dragged and dropped onto each other and their properties controlled directly. Now, multimedia is finding its way into desktop user interfaces, with 3D and even virtual reality components appearing. Other advances include natural language interfaces, while PDAs and telephone access create the need for handwriting and voice recognition.
With robotic dogs now on retailers' shelves, robotics is capturing the public imagination. Such products are toys, but in the labs work is being conducted on robots that change their facial expressions in response to people's own smiles and frowns, while walking robots are taking their first tottering steps.
The general trend in the storage industry has been to reduce the price of storage per Mbyte while increasing the density of tape or disc. Companies are taking it a step further by working on innovative new media, including blue-laser optical discs for ultra-dense storage. Further down the line, molecular storage and holographic media are in the labs. In the short term, storage service providers are already offering virtual storage services across the Internet.