Moore predicted that chip power – measured by the number of transistors on a chip – would double every 18 months, driving down the cost of computing.
Said the godfather of the microprocessor, speaking on the fortieth anniversary of the publication of his visionary law this morning: “Nothing like this can continue forever. The dimensions are small enough now that we are approaching the size of atoms. I can see that [the law] will progress for the next two to three generations. We have 10 to 20 years before we reach a fundamental limit.”
Even so, developments in chip design will allow engineers to put 20 to 30 billion transistors on a chip, he predicted. This will overcome many of the limitations of working at the atomic level in chip design.
Moore admitted he was sceptical about whether nanochips would replace traditional integrated circuits. He said the semiconductor industry had invested $100bn in the research and development of integrated circuits.
“We are seeing technology developed in integrated circuits being applied in other fields like gene chips to do biological analysis much more quickly,” he said. Another example he pointed to was microfluidics – a small chemical laboratory built on the same technology as integrated circuits.
Moore’s article was published forty years ago this week and predicted how chips would develop between 1965 and 1975. It even predicted the growth of home computers. He said 1965 was “early days” for integrated circuits, which were utilised principally in military systems, and he recalled that such systems were too expensive to permeate the industry.
“I wanted to get across [the message] that integrated circuits would be the route to cheap electronics,” he said.
At the time the article was published the most complex production integrated circuit had about 30 components. “In labs we had 60 component integrated circuits,” he said. “We doubled complexity every year and extrapolated every year from 60 to 60,000 components – by making complex circuits.”
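Moore's extrapolation is simple compound doubling: starting from the 60-component lab circuits of 1965 and doubling the component count every year for a decade reproduces the 60,000 figure he quoted. A minimal sketch of that arithmetic (the start year and component counts are taken from his account above):

```python
# Moore's 1965 extrapolation: component count doubles every year.
# Start from the 60-component lab circuits of 1965 and project ten years out.
start_components = 60
for years_ahead in range(11):
    projected = start_components * 2 ** years_ahead
    print(1965 + years_ahead, projected)
# Ten doublings give 60 * 2**10 = 61,440, roughly the 60,000
# components Moore extrapolated for 1975.
```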
Moore was surprised by the accuracy of his predictions. He said, “I frankly didn’t expect [Moore’s Law] to be at all precise – but it turned out [to be] more precise.”
In 1975 Moore looked at why his prediction had proved accurate and identified three key drivers: greater chip density, bigger chips and less unused space between transistors on chips.
By making more complex circuits with smaller dimensions, the industry has continued to drive costs down and performance up, Moore said.
In order to compete, “[Moore's Law] has been a self-fulfilling prophecy as industry realises it needs to keep on this path”, he said. He conceded, however, that he was concerned that software has not kept up with chip development. “Software does seem to lag behind hardware development. It’s a real challenge but I think the industry has done well.”
But, Moore said, the challenge remained: as user interfaces take on new functionality, their complexity seems to grow. “We seem to be losing ground on usability,” he concluded.
This was first published in April 2005