Progress in computer technology over the past four decades has been spectacular, driven by Moore's Law which, though initially an observation, has become a self-fulfilling prophecy and a boardroom planning tool.
Although Intel co-founder Gordon Moore expressed his vision of progress simply in terms of the number of transistors that could be manufactured economically on an integrated circuit, the means of achieving this progress was based principally on shrinking transistor dimensions, and with that came collateral gains in performance, power-efficiency and cost.
The semiconductor industry appears to be confident in its ability to continue to shrink transistors, at least for another decade or so, but the game is already changing. We can no longer assume that smaller circuits will go faster, or be more power-efficient.
As we approach atomic limits, device variability is beginning to hurt, and design costs are going through the roof. This is impacting the economics of design in ways that will affect the entire computing and communications industries.
For example, on the desktop there is a trend away from high-speed uniprocessors towards multicore processors, even though general-purpose parallel programming remains one of the great unsolved problems of computer science.
If computers are to benefit from future advances in technology, major challenges lie ahead: understanding how to build reliable systems on increasingly unreliable technology, and how to exploit parallelism more effectively, not only to improve performance but also to mask the consequences of component failure.
Biological systems demonstrate many of the properties the industry would like to see in its own engineered technology, suggesting a possible source of ideas for future computing systems.
With this in mind, the Computer Journal is presenting a lecture entitled "The Future of Computer Technology and its Implications for the Computer Industry" on 12 February at the BCS offices in London.
The lecture will be given by Steve Furber, ICL professor of computer engineering at the University of Manchester.