The IT industry relies on breakthroughs in hardware to innovate in software.
These breakthroughs have long been governed by a far-reaching insight made in 1965 by Gordon Moore, the co-founder of Intel. But Bernie Meyerson, IBM's vice-president for Innovation, believes Moore's Law no longer holds.
Meyerson led the development of silicon germanium and other high-performance technologies over a period of 10 years, and in 2003 he assumed operational responsibility for IBM's global semiconductor R&D effort.
He says: "Gordon is a genius. Not many people will come up with something in their lives that holds true for many decades." But the chip technology underpinning Moore's Law has changed and Meyerson feels that the law has been taken to the limit.
Under Moore's Law, chip designers double the density of transistors on a chip every 18 months. They can manufacture chips with twice the capability because they have twice as much material to work with.
But if the density of transistors on the chip keeps doubling, eventually there will be a million times as many transistors on the chip. If nothing is done to tackle power and heat, a chip that started off consuming 10 watts would end up consuming 10 megawatts, says Meyerson.
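Meyerson's arithmetic can be sanity-checked with a quick calculation: doubling every 18 months for 30 years is 20 doublings, which is roughly a million-fold increase, and if power grew in proportion a 10-watt chip really would end up near 10 megawatts. A minimal sketch of that reasoning:

```python
# Naive Moore's Law arithmetic: transistor density doubles every 18 months.
years = 30
doublings = years * 12 // 18             # 20 doublings in 30 years
growth = 2 ** doublings                  # 2^20 = 1,048,576 -- about a million-fold

start_power_w = 10                       # a chip that starts at 10 W...
naive_power_w = start_power_w * growth   # ...would need ~10 MW if power
                                         # scaled with transistor count
print(doublings, growth, round(naive_power_w / 1e6, 1))  # → 20 1048576 10.5
```

This is of course the scenario that never happens in practice, which is the point of the Dennard scaling discussion that follows.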
"If you turned on your laptop it would provide a brief but unbelievably exciting experience by catching fire. But if you set fire to the user you probably don’t have a lot of repeat purchases." Clearly the chip industry does not work this way.
IBM researcher Robert (Bob) Dennard devised the theory of chip scaling, which has allowed chip designers to continue doubling the number of transistors on a piece of silicon without the risk of frying the user. So following Moore's Law, every 18 months there is a new generation of chip with twice the number of transistors, but the power remains the same as the previous generation, Meyerson says. Dennard's recipe works up to a point, and that point has now been reached, according to Meyerson.
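Dennard's recipe, in its classic constant-field form (this numeric illustration is mine, not the article's), shrinks every linear dimension and the supply voltage by the same factor each generation; capacitance falls with the dimensions, switching speed rises, and dynamic power per transistor falls exactly fast enough that power density stays flat even as transistor density doubles:

```python
# Illustrative constant-field (Dennard) scaling for one process generation.
# All quantities are relative to the previous generation.
k = 1 / 2 ** 0.5          # ~0.71x linear shrink halves transistor area

voltage = k               # supply voltage scales with dimensions
capacitance = k           # C scales with dimensions
frequency = 1 / k         # smaller circuits switch faster
density = 1 / k ** 2      # transistors per unit area: 2x

# Dynamic power per transistor: P ~ C * V^2 * f
power_per_transistor = capacitance * voltage ** 2 * frequency  # k^2 ≈ 0.5
power_density = power_per_transistor * density                 # ≈ 1.0: flat

print(round(density, 2), round(power_density, 2))  # → 2.0 1.0
```

Twice the transistors, same total power: that is why the laptop never catches fire. The recipe breaks once the voltage and the oxide thickness can no longer shrink, which is the limit Meyerson describes next.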
Miniaturisation goes nuclear
Increased miniaturisation cannot go on forever. "The problem is rather like folding a piece of paper. You may be able to fold a £50 note eight times but short of bringing along a small nuclear device, you will not be able to fold it any further," says Meyerson.
"You do realise that if you keep halving the size of the transistor, eventually some of the layers in the transistor are one atom thick. When you cut them in half again to double transistor density that is nuclear fission. Let me know when you’re going to do that so I can be far away.
"Moore's Law, in terms of how you scale the transistor, unfortunately died about a decade ago. Just because you can make twice as many transistors on a chip does not mean you can manufacture the same way you did 30 years ago, when chip designers simply made chips half as big."
But according to Meyerson, by 2005 there were already parts of the chip that measured just a handful of atoms in thickness. When the constituent parts of a transistor are that thin, they behave differently, he adds.
"When the insulating layer (silicon dioxide, or silica) becomes only a couple of atoms thick, it becomes quantum mechanical." Given that it is an insulator, making it half as thick does not make it twice as conductive, but 10,000 times more conductive. The phenomenon is called Fowler–Nordheim tunnelling.
"Silicon dioxide, which has been used for the last 30 years, can no longer be used as the insulator, so now if you take apart a transistor you discover it is not there." Instead, the semiconductor industry has moved to new insulator materials: hafnium oxide and hafnium silicate. These are very different materials and cannot be scaled in the way Dennard had specified for silicon chips.
So now with each new generation of semiconductor, the chip gets worse. Meyerson says: "They actually run slower, get hotter and burn a tonne of power."
The industry is having to invent new techniques, processes and structures. Chip fabrication technology is now at 14nm. "Two or three generations of chip from now and silicon itself goes quantum mechanical and stops behaving as it would in a normal transistor and you are done."
Meyerson describes this as the post-silicon era of IT. However, he does not expect technology to advance to a stage where a viable alternative to silicon can be used in mass production. "Silicon is not going away, but in the post-silicon era there will be no benefit, in that it won't necessarily be any faster, cheaper or better at the transistor level," he says.
He predicts that new types of computing will emerge, such as cognitive computing. For instance, the architecture inside IBM's Watson machine, which won the US game show Jeopardy!, is synaptic, like the human brain.
Limitations of comms will also need to be tackled. The speed of light is completely inadequate, says Meyerson.
According to Meyerson, in the time an IBM machine completes one machine cycle (ie executes a single machine code instruction), light travels around 10cm. In a datacentre the size of a football pitch, if one computer queries another on the other side of the field, the computer asking the question would idle for over 18,000 machine cycles before it got the answer (assuming the second server is about 100 yards, or 91 metres, away).
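Those figures can be cross-checked. If light covers 10cm per cycle, the cycle time is about 0.33ns, roughly a 3GHz clock. A 91-metre round trip at full light speed is an absolute floor of about 1,820 cycles; the ~18,000 cycles Meyerson cites implies roughly a tenfold overhead on top of that floor, consistent with signals travelling well below light speed in fibre or copper and passing through switches (the interpretation of that gap is my inference, not the article's):

```python
C = 3.0e8                        # speed of light in vacuum, m/s
cycle_time = 0.10 / C            # one cycle = time light covers 10 cm (~0.33 ns, ~3 GHz)

distance_m = 91.0                # server on the far side of the datacentre
round_trip_m = 2 * distance_m

# Absolute floor: signal at full light speed, zero switching overhead.
floor_cycles = (round_trip_m / C) / cycle_time

# Meyerson's ~18,000-cycle stall implies roughly a 10x overhead on top
# of the floor (sub-light propagation in fibre, switch hops, serialisation).
implied_overhead = 18_000 / floor_cycles

print(round(floor_cycles), round(implied_overhead, 1))  # → 1820 9.9
```

Even the idealised floor is nearly two thousand wasted cycles per query, which is why shrinking the physical span of the datacentre starts to matter.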
He says: "You have to address problems you never had before. So after years of making datacentres bigger, you now find organisations making them smaller simply to reduce the time it takes to communicate. We’ll eventually take a rack of computers and put them in a chip that is just 2.5mm high. There will be progress, but not through Moore’s Law."
Meyerson believes there is a huge amount of IT innovation that can be achieved through integration. "In the past, 20% of the gains in IT each year were technological, but the other 80% came from software and integration."
In the post-silicon era, the 20% boost that Moore's Law delivered by doubling the number of transistors on a chip every 18 months will no longer be possible, and will have to be made up in some other way.
He expects complex software systems and specialised processors, such as field programmable gate arrays (FPGAs) and graphics processing units (GPUs), to see more mainstream usage, allowing the IT industry to continue to yield step improvements in performance.