When J Presper Eckert and John William Mauchly of the University of Pennsylvania conceived the idea of a fully programmable electronic computer in the early 1940s, they were proposing a vision of computing that many thought was impossible.
Even media giant RCA passed up the opportunity to become a contractor for the project, thereby losing the chance to get in on the ground floor of the computing industry. Many laughed at the two young academics but, with a small team of engineers, they built Eniac, heralded as the first all-electronic computer designed to be reprogrammed to solve different problems.
There have been other visionaries in the history of computing, as our Unsung Heroes feature (Computer Weekly, 28 June) clearly demonstrated. And then there are Robert Noyce and Jack Kilby, who independently invented the microchip. Vinton Cerf has been heralded as the father of the internet for his work on packet network technology, including TCP/IP. Tim Berners-Lee gave us the World Wide Web. These are industry-changing technologies that have caused us to move in new directions, and the people who introduced them deservedly stand as icons in computing history.
But where have these visionaries gone? Tim Berners-Lee gave us the web 16 years ago. Which icons of inventiveness have emerged since then? And if there are none, why?
"It is hard to recognise the visionaries when they are there," says David Patterson, president of the Association for Computing Machinery. Many people lauded as visionaries by the computing industry in hindsight were dismissed as crackpots when they originally presented their ideas, he says.
Anyway, says Nigel Shadbolt, vice-president of the British Computer Society and professor of artificial intelligence at the University of Southampton, the nature of the computing industry has changed. "There is an interesting thesis that visionaries are as rare as they have always been, but what we have had is more collaborative groups working together," he says, adding that this trend can be found in everything from gene sequencing through developments in IT and even nanotechnology.
"The idea of group-based computing is there to support the notion of distributed 'co-laboratories' of scientists working on a global scale confronting big problems," says Shadbolt. "That is the only way a lot of modern science can get done. Nobody owns the big physics anymore."
Small groups of individuals are often best suited to making breakthrough discoveries, but larger groups supported by collaboration technologies can be particularly good at refining and enhancing those discoveries. After all, small groups of individuals do not have access to the laboratory equipment and other resources needed to drive a challenging research proposition such as quantum computing.
But even where small groups of individuals are involved, things are changing. Martin Illsley, director of the European research and development team at Accenture, says researchers are becoming more specialised. "Many years ago, PhDs were broad in nature because it was an open area," he says. When the computer was in its infancy, it was easy to talk about computing technology as a discrete PhD subject. "These days we are down to very small specialist pockets of PhDs, so if you want to do something, you have to pull together teams of specialist knowledge."
This has its advantages, in that we understand each subset of a subject more precisely. On the other hand, it makes the emergence of individual IT visionaries much more difficult, because the work among these small research groups is more collaborative.
Refinement and the discovery of discrete new technologies sometimes go hand in hand. For example, we marvel at the fact that we can now fit the power of a whole Eniac onto a silicon chip, but this happened both because of a groundbreaking discovery (the integrated circuit) and successive rounds of miniaturisation. Generally, one follows the other to create a step curve in innovation.
An initial groundbreaking discovery prompts a frenzied cycle of development - what Jonathan Smart, developmental systems theorist and president of the Acceleration Studies Foundation, calls the "diffusion curve" - leading to the refinement of the technology into stable, commercially acceptable products.
The technology is then refined still further, as products are enhanced and prices fall. Eventually, the technology and associated products mature into a commodity, and then a new technology emerges to begin the cycle once again.
Many industry commentators attribute more value to the diffusion curve where a technology is commercially applied than to its initial invention. "The real innovation comes when it is widespread enough that a whole range of people start to use it for things that were never conceived of," says Steve Prentice, chief of research, hardware and systems at analyst firm Gartner. "I would separate the initial technological development from the innovation that arrives from a better understanding of it."
One of the best examples of a technology that has become a foundation for unanticipated innovation is the internet. The visionary aspect of the internet lies in its ubiquity, says Prentice, who identifies peer-to-peer and podcasting as examples of this innovation.
Prentice cites the mapping of the human genome as another area where the real innovation comes later. We now understand the pattern of hereditary information encoded in our own DNA, but we do not understand what it all means or what we can do with it. That is where innovation will emerge, he says.
No wonder, then, that Brian Levy, group technology officer at BT, singles out Steve Jobs as one of the more recent visionaries in the IT world. Jobs did not develop any groundbreaking technologies himself (he came up with the idea of the graphical user interface for the Apple Lisa after visiting Xerox Parc). He bolted together existing technologies to produce the Macintosh, and more recently, the iPod.
"You probably cannot find one thing that Apple has got that was not already there, but it is in the integration of that, and the vision that Jobs has to bring them together in that way," says Levy. "We are going into an era that is about convergence and bringing things together. It is about making things accessible for people in new ways."
Part of the diffusion curve that Smart discusses involves the recombination of existing technologies in easy-to-use ways. When we think about the time between major computing hardware developments such as the integrated circuit, which appeared in the early 1960s, and quantum computing, which is still 20 years from commercial implementation, it is no wonder that the miniaturisation afforded by Moore's Law - the observation that computing power doubles roughly every 18 months - becomes so important, and that convergence plays such a large part in innovation today.
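The power of that 18-month doubling is easy to underestimate. A minimal sketch of the arithmetic, using the article's own figures (the 18-month period and the roughly four decades between the integrated circuit and the time of writing are taken from the text; the function name is illustrative):

```python
# Sketch of the compounding implied by the article's paraphrase of
# Moore's Law: computing power doubles every 18 months (1.5 years).
DOUBLING_PERIOD_YEARS = 1.5

def growth_factor(years: float) -> float:
    """Relative increase in computing power after `years`,
    assuming one doubling every 18 months."""
    return 2 ** (years / DOUBLING_PERIOD_YEARS)

# Roughly 40 years separate the early-1960s integrated circuit from 2005:
# about 27 doublings, or a hundred-million-fold increase.
print(f"Doublings over 40 years: {40 / DOUBLING_PERIOD_YEARS:.0f}")
print(f"Growth factor: {growth_factor(40):.2e}")
```

Compounding of this kind is why Eniac's capabilities now fit on a single chip, and why successive refinement can matter as much as the original invention.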
Shadbolt's experiences suggest that the UK is a hothouse of invention. He says his research budget has increased substantially over the years.
In contrast, the American Association for the Advancement of Science has condemned the Bush administration's budget for 2006, which it says will see research funding - especially defence department research - fall below the rate of inflation for the first time in a decade. This is significant because the Defense Department has often driven technological development in the US, producing technologies ranging from the computer and the atom bomb to the internet.
Patterson also lambasts the US administration for cutting back on pure academic research, but adds that the changes are even more insidious. The Defense Advanced Research Projects Agency (Darpa), which carries out much defence research in the US, has shifted its focus, he says. "In the past, the focus was on dual use - things that would help the military and help industry," he says, arguing that the focus is now solely on military applications.
Even in industry, we should take reports of huge R&D budgets with a pinch of salt, warns Patterson. It is important to separate the research from the development. When Microsoft says it will spend $6bn on R&D this year, how much of that involves long-term research, and how much is simply short-term product development?
While industry watchers mull over this problem, Smart says something bigger is afoot. In the past, most innovation has been a top-down affair. The computer industry was predicated on presenting solutions to problems that people did not know they had. It took some years for the creators of the Eniac to develop a market for their machines, because people did not understand what they were for. Now, says Smart, that top-down innovation is changing as developments such as convergence make the diffusion curve more complex.
"IT suppliers used to be the drivers behind technology innovation, and supplier relationships used to be the drivers behind diffusion," he says. "The idea of collaborative networks of consumers is changing that. Today, it is more driven by the customer. It is more of a pull than it is a push, and that fits in with this idea of a network society that we are moving to."
This move from a top-down, supplier-driven innovation model within the diffusion curve to a bottom-up, customer-driven one is attractive. Smart's view may be a little utopian - IT suppliers are still far from passive - but customers are nevertheless learning how to use and apply technologies in more innovative ways. If he is right, and the trend continues, it heralds a switch.
In the past, we have lived within a modernist industry in which one voice dictates to a passive audience. Smart is describing a new, postmodern ideal in which multiple voices participate, and in which customers take part in a dialogue that drives research forward. Perhaps that is the most innovative development of all.
This was first published in August 2005