Forty years ago Gordon Moore predicted with uncanny accuracy how the computer industry would evolve, revolutionising businesses and homes with greater levels of automation based on ever more complex computer chips.
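Moore's observation was quantitative: the number of transistors on a chip doubles roughly every two years. A minimal sketch of the compounding that implies (the function name is illustrative; the two-year doubling period is the commonly cited figure):

```python
def transistors(start_count, years, doubling_period=2):
    """Projected transistor count after `years` of steady doubling."""
    return start_count * 2 ** (years / doubling_period)

# Forty years of two-year doublings compound to a roughly
# million-fold increase (2**20):
print(transistors(1, 40))  # → 1048576.0
```

The striking part is not any single doubling but the compounding: twenty doublings turn one transistor's worth of capacity into a million.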
As researchers strive for ever greater sophistication at ever lower cost, one wonders how long it can go on. What uses will they find next for these tiny, sophisticated devices, and will there ever be enough computational power?
But even as chip makers tackle these questions, users are growing reluctant to keep up with the pace of change the industry generates. PCs used to be replaced every 18 months; now four years is the norm. Even the relentless churn in mobile devices will eventually slow.
Businesses need stability and predictability: if they standardise on a technology today, they expect it to remain in use for several years. It is hard to imagine how the industry can keep advancing at such a pace while users remain several generations behind. And as computers become more complex, the search for that elusive killer application gets even harder.
Forty years after Moore's Law was posited, it is time to take stock: cheaper, faster processing should no longer be the only driving force in IT.