Processor fabrication technology will change dramatically in 15 to 20 years. But until then, companies will have to squeeze more and more from their silicon. Danny Bradbury reports.

Processing power is everything in today's computing environment. The likes of Intel and Sun Microsystems invest millions of dollars in research and development to make their chips faster than their rivals'.

And thanks to their efforts, Moore's Law, the prediction made by Intel co-founder Gordon Moore, has been proved correct. In 1965, Moore observed that the number of components that could economically be fitted onto a chip doubled every 18 to 24 months. The pattern has held in the processor world, where the number of transistors on a chip has grown exponentially with each generation.
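To get a feel for what that doubling implies, here is an illustrative Python sketch that compounds it forward from the Intel 4004's roughly 2,300 transistors in 1971; the steady 24-month doubling rate is a simplifying assumption, not data from the article.

```python
# Illustrative only: compounding a doubling every 24 months from the Intel
# 4004's roughly 2,300 transistors in 1971. The steady doubling rate is a
# simplifying assumption, not data from the article.

transistors = 2300          # Intel 4004, released 1971
year = 1971
DOUBLING_PERIOD_YEARS = 2   # assume the slower, 24-month doubling rate

while year < 2001:
    transistors *= 2
    year += DOUBLING_PERIOD_YEARS

print(f"By {year}: roughly {transistors:,} transistors per chip")
```

Fifteen doublings later, the model lands at roughly 75 million transistors per chip by 2001, which is the right order of magnitude for processors of the period.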

One of the biggest issues for processor manufacturers is the difficulty of fabricating ever-smaller components on a chip. Demand for smaller computing devices, which is greatest in the consumer-computing sector, limits how large a processor can be. Consequently, to fit more transistors onto a single processor, the transistors themselves must shrink, and so must the connections between them. This is easier said than done.

Processors are produced by a technique similar to photography, called lithography. Manufacturers produce an image of the electronic circuit they want to fit onto the silicon chip, then shine light through that image onto photosensitive material on the surface of the silicon. The resulting pattern is etched into the surface of the silicon. This is repeated several times until a multilayer grid of electronic circuitry is built up on the surface of the chip.

Producing smaller components involves expensive refinements to the process, and scientists must also contend with the laws of physics, which make it difficult to produce components smaller than the wavelength of the light used to expose the image in the first place.
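The wavelength limit can be made concrete with the Rayleigh resolution criterion, which estimates the smallest printable feature as roughly k1 x wavelength / NA. In the Python sketch below, the k1 factor and the numerical aperture are assumed, illustrative values, not figures from the article.

```python
# Back-of-the-envelope sketch of the Rayleigh resolution criterion:
# minimum feature size ~ k1 * wavelength / NA. The k1 factor and the
# numerical aperture (NA) below are assumed, illustrative values.

K1 = 0.6   # process-dependent factor (assumed)
NA = 0.6   # numerical aperture of the projection optics (assumed)

wavelengths_nm = {
    "visible (g-line)": 436.0,
    "deep ultraviolet": 248.0,
    "extreme ultraviolet": 13.5,
}

for source, wavelength in wavelengths_nm.items():
    feature = K1 * wavelength / NA
    print(f"{source:>20}: ~{feature:.0f} nm minimum feature size")
```

Halving the wavelength roughly halves the printable feature size, which is why manufacturers keep pushing towards shorter-wavelength sources.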

Consequently, many manufacturers will tell you that the cost of fabrication is increasing dramatically, leading to a consolidation in the market. This is certainly the view of the California-based PricewaterhouseCoopers (PWC) Technology Forecast. The report, which includes a section dedicated to processor technology, says more companies will find it uneconomical to operate their own facilities and will outsource fabrication to specialists.

Not everyone agrees, however. Malcolm Penn, managing director of specialist semiconductor market research company Future Horizons, says, "In absolute terms, fabrication is more expensive, but relative to company revenues we spend the same. It's no more expensive to build a fabrication plant now than it was 30 years ago."

Furthermore, we are unlikely to see a quantum leap in processor fabrication methods within the next two decades. Many new methods of processing that depart drastically from conventional techniques have been proposed, but firms are doing everything they can to squeeze more functionality out of existing processes, and they will be able to do so for some time. The PWC report says lithographic fabrication will continue until at least 2014, while Penn predicts we won't see the more advanced methods used on a commercial basis until 2050.

Meanwhile, companies will have to be content with enhancing current methods of fabrication and making changes to the logic control software hard-wired into their processors. Firms have already moved from visible to ultraviolet light, which has a shorter wavelength, as a means of producing more detailed electronic circuitry on a processor. Other approaches, such as extreme ultraviolet, X-ray lithography and electron beam lithography, are also being explored.

Extreme ultraviolet could produce line widths much smaller than those available today, making chips up to 100 times faster, according to the PWC report. X-ray lithography has been successfully demonstrated by IBM, while electron beam projection replaces photons with electrons in the lithography process.

Firms are also using new materials to create faster processors. There are three well-known technologies that have already produced commercial products:

  • Silicon on insulator - processors are slowed by the capacitance (the ability to store electrical charge) of the transistor. The greater the capacitance, the longer it takes to alter the charge within a transistor and so open or close the electrical gate that the transistor forms. Insulating the transistor from the silicon substrate reduces this capacitance and enables it to switch faster.

  • Copper - this has a much greater conductivity than aluminum, which is traditionally used to hook together transistors on a silicon chip. Because copper resists the flow of electricity less, signals cross the chip with less delay, leading to a faster processor (the sketch after this list illustrates the effect). IBM has been making great strides in this area.

  • Silicon germanium - this uses conventional silicon as the substrate for a processor, but employs silicon mixed with germanium as the basis for the transistor material. The alloy has a lower semiconductor bandgap - the energy barrier electrons must cross for a device to switch between "off" and "on". Consequently, silicon germanium chips can be faster and/or more power-efficient. It is becoming a mature technology in the commercial sector, following IBM's commercial introduction of devices based on it in 1998.
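As a rough illustration of why both capacitance and wiring material matter, the Python sketch below models a signal's delay along an on-chip wire as the product of the wire's resistance and capacitance. The wire dimensions and capacitance are invented for illustration; only the two resistivities are standard room-temperature figures.

```python
# Rough illustration: a signal's delay along an on-chip wire scales with
# R * C, the product of the wire's resistance and capacitance. The wire
# dimensions and capacitance are invented; only the resistivities are
# standard room-temperature figures.

RESISTIVITY = {          # ohm-metres
    "aluminum": 2.7e-8,
    "copper":   1.7e-8,
}

LENGTH = 1e-3            # 1 mm wire (assumed)
CROSS_SECTION = 2.5e-13  # 0.5 um x 0.5 um cross-section (assumed)
CAPACITANCE = 2e-13      # 0.2 pF of wire capacitance (assumed)

for metal, rho in RESISTIVITY.items():
    resistance = rho * LENGTH / CROSS_SECTION    # R = rho * L / A
    delay_ps = resistance * CAPACITANCE * 1e12   # RC delay in picoseconds
    print(f"{metal:>8}: R = {resistance:5.0f} ohms, RC delay ~ {delay_ps:.1f} ps")
```

Copper cuts the resistance term, while insulating the transistor from the substrate, as silicon on insulator does, cuts the capacitance term; either way the RC product, and with it the switching delay, shrinks.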

The other way of creating faster processors is to use more intelligent hard-wired software on the processor, enabling it to handle instructions more effectively. The most recent development in this area is explicitly parallel instruction computing (Epic). This extends an existing feature of modern processor design, in which the processor carries out multiple instructions at once by sending them down different pipelines.

Unlike previous parallel instruction processing architectures, an Epic-based architecture can add more pipelines in subsequent generations without becoming incompatible with existing software, according to the PWC report.
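A toy Python sketch may make the idea concrete. Here a compiler has already grouped independent instructions into explicitly parallel bundles, and the same bundle stream runs unchanged on machines with different numbers of pipelines; the bundle format and instruction names are invented and bear no relation to the real IA-64 encoding.

```python
# Toy illustration of the Epic idea: the compiler groups independent
# instructions into explicitly parallel bundles, and the hardware issues
# as many of them per cycle as it has pipelines. The bundle format and
# instruction names are invented and unrelated to the real IA-64 encoding.

bundles = [
    ("add r1", "mul r2", "load r3"),  # three independent operations
    ("sub r4",),                      # depends on the results above
    ("store r1", "add r5"),
]

def cycles_needed(bundles, pipelines):
    total = 0
    for bundle in bundles:
        # A bundle wider than the machine is simply split across cycles,
        # so the same code runs on narrower or wider hardware.
        total += -(-len(bundle) // pipelines)  # ceiling division
    return total

for width in (1, 2, 3):
    print(f"{width} pipeline(s): {cycles_needed(bundles, width)} cycles")
```

Running it shows the same code taking six cycles on one pipeline, four on two and three on three - the compatibility property the PWC report describes.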

The first Epic-based processor family is the IA-64, a result of joint work between Intel and HP. The first processor in this series, called Itanium, was supposed to have shipped last year but it has been delayed. It will be followed by McKinley at the end of this year. Like Itanium, McKinley will be targeted at the high-end server market.

In the short to mid-term, the future for processing technology will focus on producing more densely packed silicon surface areas. Companies will use enhancements to existing fabrication technologies to create more complex chips with greater numbers of transistors, and technologies such as Epic will result in more intelligent instruction processing without losing software compatibility.

The next quantum leap in processing technology is unlikely to happen for at least 10 to 15 years, at which point we will have exhausted the potential for improving lithographic technology. Only then will the transition from painstaking enhancements to radically new approaches begin.


Fabrication processes: how small could you go?

While conventional fabrication processes are likely to be around for the next decade at least, the technology to support future processors is already in the labs. Molecular computing involves the manipulation of logical components created from atoms. Hewlett-Packard has been heavily involved in this research and, in July 1999, scientists at the company's laboratory managed to engineer a molecular-level logic gate. This would do away with lithography altogether.

With some imagination, it is easy to think of numerous innovative applications for molecular computing. Because it is a form of nanotechnology, it is theoretically possible to build a supercomputer into your nail polish or tattoo a mainframe onto your body. Injecting tiny computers into your body to locate, analyse and neutralise cancers or viruses could also become an option.

But let's not get ahead of ourselves. Philip Kuekes, a research scientist at HP Labs, explains that there are many obstacles to overcome before molecular computing becomes a commercial reality. For one thing, getting information to flow through the matrix of molecules is no mean feat.

    "The limiting aspect of almost all these technologies is not how fast you can sense or flip a bit," Kuekes says. "It's how fast you can get the information to the outside world if the bit has one million neighbours. You have to think about signaling down those wires."

Quantum computing is even more complex than molecular computing because it makes use of the superposition principle, which applies only at the quantum level. The principle states that an unobserved quantum particle can exist in multiple states at once, collapsing into a single definite state only when it is observed. Because the state of a bit represents a number in computing, a register of quantum bits can hold many numbers simultaneously, making it possible to perform calculations on different numbers at the same time.
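A minimal Python sketch, assuming an idealised noiseless model: an n-qubit register holds 2^n complex amplitudes, so an equal superposition represents every n-bit number at once until a measurement collapses it.

```python
import math

# A minimal sketch, assuming an idealised noiseless model: an n-qubit
# register holds 2**n amplitudes, so an equal superposition represents
# every n-bit number at once until a measurement collapses it.

n = 3                                  # register size (assumed)
amplitude = 1 / math.sqrt(2 ** n)      # equal weight for each basis state
register = {value: amplitude for value in range(2 ** n)}

for value, amp in register.items():
    probability = abs(amp) ** 2        # Born rule: probability = |amplitude|^2
    print(f"|{value:03b}>  amplitude {amp:.3f}  probability {probability:.3f}")
```

An operation applied to the register acts on all eight values simultaneously, which is where the potential speed-up comes from; the catch is that observing the register yields only one of them.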

Quantum computing has huge potential and could change life as we know it. But it will take decades before it becomes commercially available, if indeed it ever does.

The last of these Star Trek-like processor technologies is DNA computing. Because DNA already stores information in nature, scientists have proposed that it could also store and process information put there by programs.

How fast chips could help in daily life

Rodney is unhappy. He is on a business trip in a foreign country where he cannot speak the language, and his multilingual assistant has been taken ill. He also has to process an immense amount of marketing information before he can make a presentation to a potential client. Luckily, Rodney has a personal digital assistant with a low-power, high-speed processor capable of processing language information. Trying to negotiate with a hotel concierge, Rodney holds up the palmtop, which listens to the concierge's words, translates them and relays them to Rodney in his own language.

But he still has to process those sales figures, which involves analysing the telephone call statistics of hundreds of thousands of telecommunications customers and cross-referencing them against customer salary information. There are just five minutes before his presentation is due to start. He calls the office and asks a colleague to run the numbers. Until a couple of years ago, it would have taken hours to produce the results. Now it takes a few seconds, because the quantum computer sitting on his colleague's desk exploits quantum superposition. This puts computer bits in two states at once - both on and off - so calculations can be made on multiple numbers at the same time. Strange, but true. And very useful for Rodney.



This was first published in February 2001
