Hardware: It's how you use it that matters

Social factors rather than technical advances are likely to dictate what technologies will take prominence in years to come. Martin Banks examines how future hardware developments will intertwine with what users actually want

History has a habit of repeating itself, and most of what is seen as coming down the line has been here before, at least once. The second comings will undoubtedly be done better, and certainly more appropriately, than before.

These are often ideas that were ahead of their time or pitched at the wrong markets. Some were sound then and can be reborn as something sound for the future. In fact, much of the future will revolve around how technology is used, rather than what new technologies appear.

As Steve Prentice, vice-president at analyst Gartner, put it, “There is an intersection between technology and society now taking place, and all the technologies that need to be deployed are already available today, so the basis is in place. The most important aspect now is acceptance by users.”

This implies that hardware technology innovation can no longer be considered an end in itself. The ways it is used, and the software available for it, are the most important guides to the future.

When it comes to semiconductors – and microprocessors in particular – we have seen Intel, and latterly AMD, establish an almost unbreakable hegemony in the business server market that not even the more specialised reduced instruction set computer (Risc) processors could dent significantly. But that hegemony was built on IT systems, the PC included, and new, non-IT markets are now opening up.

As IBM distinguished engineer Mark Cathcart says, “The emerging markets are different. In areas such as smart devices, high-definition TVs, medical imaging equipment and voice over IP, are AMD/Intel likely to be as dominant? I do not think so.”

There are, of course, new semiconductor technologies coming down the track and it is in these new, different application areas that they are likely to bring significant benefits.

“As far as raw technology is concerned, I expect to see molecular and photonic technologies coming along to take us beyond current semiconductor technologies, which will be good for another 10 years or so,” says Prentice. He expects the overlap to occur somewhere around 2030.

Prentice also expects chips to carry a much broader range of functions on board. “We will see more analogue circuitry on chips, particularly wireless capabilities. Analogue capabilities used to be expensive in semiconductor technologies, but now it is possible to do much more of it.”

He also expects to see chip voltages drop under 1V, heralding a move towards quantum-level activity in processors.

While the underlying technology continues to progress dramatically, the benefits to the user so far are questionable. According to Iann Barron, a big name from the early semiconductor industry and co-founder of Inmos in the late 1970s – with a then staggering £50m of UK government funding – the story of the processor is a sorry one. Why? Because it has delivered less than a 10-fold increase in useful capability over the past 15 years, in spite of greatly increased clock rates and silicon area.

“The problem remains the von Neumann bottleneck, which puts a narrow channel between memory and processing. With developments like pipelining and cache memory, a processor chip is now mostly memory,” Barron says.
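
The bottleneck can be seen from software. In the Go sketch below (purely illustrative; the exact ratios depend on the machine), both runs perform exactly the same additions, but the strided walk defeats the cache, so run time is set by traffic across the memory interface rather than by the arithmetic itself.

```go
// Illustration of the von Neumann bottleneck: identical arithmetic,
// very different run times, because the strided walk turns almost
// every access into a cache miss that must cross the memory interface.
package main

import (
	"fmt"
	"time"
)

const n = 1 << 22 // 4M int64s (32MB), larger than typical caches

// sum adds every element of data, visiting them with the given stride.
// The set of elements visited -- and hence the result -- is the same
// for any stride; only the memory access pattern changes.
func sum(data []int64, stride int) int64 {
	var total int64
	for start := 0; start < stride; start++ {
		for i := start; i < len(data); i += stride {
			total += data[i]
		}
	}
	return total
}

func main() {
	data := make([]int64, n)
	for i := range data {
		data[i] = int64(i % 7)
	}
	for _, stride := range []int{1, 16, 1024} {
		t0 := time.Now()
		total := sum(data, stride)
		fmt.Printf("stride %4d: sum=%d, time=%v\n", stride, total, time.Since(t0))
	}
}
```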

He feels that, even with the arrival of multicore chips, the effective gain in throughput is little more than the underlying improvement in transistor speed.

Contrast this with graphics chips and you get a clue as to where Barron thinks the next real step forward will come from. “These consistently deliver increased performance by migrating specific algorithms to hardware and increasing the level of parallelism by array processing instructions.”

Barron points out that the latest generation of graphics processors is now being used for other compute-intensive problems such as Fourier transforms. “The lesson is immediate. If we can raise the level of parallelism in programs, we can build faster computers.”
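
Goroutines are not shader cores, but the shape of the workload Barron highlights can be sketched in a few lines of Go (the function names are invented for illustration): one operation applied independently to every element of an array, so the work divides cleanly across however many parallel units are available.

```go
// Data parallelism in miniature: the same operation is applied
// independently to every array element, so the work splits cleanly
// across parallel units (goroutines standing in for shader cores).
package main

import (
	"fmt"
	"runtime"
	"sync"
)

// parallelMap applies f in place to every element of data,
// giving one contiguous chunk of the array to each worker.
func parallelMap(data []float64, f func(float64) float64) {
	workers := runtime.NumCPU()
	chunk := (len(data) + workers - 1) / workers
	var wg sync.WaitGroup
	for w := 0; w < workers; w++ {
		lo := w * chunk
		hi := lo + chunk
		if hi > len(data) {
			hi = len(data)
		}
		if lo >= hi {
			continue
		}
		wg.Add(1)
		go func(part []float64) {
			defer wg.Done()
			for i, v := range part {
				part[i] = f(v)
			}
		}(data[lo:hi])
	}
	wg.Wait()
}

func main() {
	data := make([]float64, 8)
	for i := range data {
		data[i] = float64(i)
	}
	parallelMap(data, func(x float64) float64 { return x * x })
	fmt.Println(data) // [0 1 4 9 16 25 36 49]
}
```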

Barron’s view is supported by Prentice, who expects to see much more redundancy built into processors. “Parallelism is likely to be the next big thing in processor technology. It is the way to increase performance. The beginnings are already well established and the latest is IBM Cell technology,” says Prentice.

It is worth noting that IBM recently announced that its next supercomputer, intended to be the world’s most powerful, will be built using Cell technology.

Prentice also sees a future with run-time configurable devices that become what the application requires at the time it is required.

What is interesting here is that the technologies being predicted have all been tried before – many years ago. UK inventor Ivor Catt demonstrated wafer-scale on-chip redundancy in the 1970s, and the Inmos Transputer proved that parallelism works. The 1970s AMD 2900 chip set gave configurable devices, but not the run-time element.

The links between hardware and software are now too intricate to be disentangled. One growing area is the incorporation of application code directly into the processor. Both Prentice and Barron identify security as an area with real scope here.

To achieve the level of security necessary for future systems, Barron suggests it will be necessary to carry encryption down to the chip level, with all chip interfaces encrypted. This means information would be encrypted at the word level, and the processor would have to encrypt and decrypt information across the memory interface.
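
A minimal sketch of that idea in Go, using an invented encryptedMemory type: the keyed XOR stands in for real hardware cryptography purely to show the shape, in which the core only ever sees plaintext words while the backing store only ever holds ciphertext.

```go
// Toy model of an encrypted memory interface. The keyed XOR is NOT
// secure cryptography; it is only a placeholder showing that every
// word is transformed as it crosses the chip boundary.
package main

import "fmt"

type encryptedMemory struct {
	key   uint64
	words []uint64 // ciphertext, as it would sit in DRAM or on the bus
}

func newEncryptedMemory(size int, key uint64) *encryptedMemory {
	return &encryptedMemory{key: key, words: make([]uint64, size)}
}

// Store encrypts a word as it leaves the processor for memory.
func (m *encryptedMemory) Store(addr int, value uint64) {
	m.words[addr] = value ^ m.key
}

// Load decrypts the word on its way back into the processor.
func (m *encryptedMemory) Load(addr int) uint64 {
	return m.words[addr] ^ m.key
}

func main() {
	mem := newEncryptedMemory(16, 0xDEADBEEFCAFEF00D)
	mem.Store(3, 42)
	fmt.Printf("in memory: %#x, after load: %d\n", mem.words[3], mem.Load(3))
}
```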

Barron also sees copyright as an increasingly important problem. “If information is to have value, access and copying must be controlled,” he says. “Information needs to be tagged with its ownership, and processors need to get explicit permission to access information and for the act of copying.”
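
What such tagging might look like is sketched below; the type, the permission model and all names are invented for illustration. Each datum carries its owner, a caller needs an explicit grant before it may read the value or make a copy, and any copy still carries the original owner’s tag.

```go
// Toy model of ownership-tagged information: access and copying both
// require an explicit grant from the owner, and copies keep the tag.
package main

import (
	"errors"
	"fmt"
)

type permission int

const (
	permRead permission = iota
	permCopy
)

type taggedDatum struct {
	owner  string
	grants map[string][]permission // principal -> granted permissions
	value  string
}

func (d *taggedDatum) allowed(principal string, p permission) bool {
	if principal == d.owner {
		return true
	}
	for _, g := range d.grants[principal] {
		if g == p {
			return true
		}
	}
	return false
}

// Read returns the value only if the principal holds an explicit grant.
func (d *taggedDatum) Read(principal string) (string, error) {
	if !d.allowed(principal, permRead) {
		return "", errors.New("read denied: no grant from owner")
	}
	return d.value, nil
}

// Copy produces a new datum that is still tagged with the original owner.
func (d *taggedDatum) Copy(principal string) (*taggedDatum, error) {
	if !d.allowed(principal, permCopy) {
		return nil, errors.New("copy denied: no grant from owner")
	}
	return &taggedDatum{owner: d.owner, grants: d.grants, value: d.value}, nil
}

func main() {
	doc := &taggedDatum{
		owner:  "alice",
		grants: map[string][]permission{"bob": {permRead}},
		value:  "confidential report",
	}
	v, err := doc.Read("bob")
	fmt.Println(v, err) // allowed: bob holds a read grant
	_, err = doc.Copy("bob")
	fmt.Println(err) // denied: bob may read but not copy
}
```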

As for new applications, Barron’s favourite suggestions are electronic money and a 21st-century transport system in which fully automated cars drive themselves around the present road system, offering greater safety, improved travel times and traffic densities, and the removal of all the information and lighting clutter.

He thinks new applications will also require significantly different approaches to software, particularly approaches that map onto the predicted rise of hardware parallelism.

“Extracting parallelism from existing sequential programs is not effective, and it is not until the programmer is able to explicitly use and control parallelism that real performance gains will be made,” Barron says.

“Contrary to popular prejudice, explicit parallel processing is not more difficult than sequential processing, and I still look forward to the day when parallel constructs are as common a feature of programming as conditionals.”
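
One way to picture what Barron is asking for is the CSP (communicating sequential processes) model that underpinned occam on the Inmos Transputer he helped create. The Go sketch below uses the descendants of that model, goroutines and channels, and spawning a parallel task is syntactically no heavier than a conditional.

```go
// Explicit parallelism as a first-class language construct: the "go"
// keyword forks a parallel process, and channels synchronise the
// results -- the CSP style occam pioneered on the Transputer.
package main

import "fmt"

func main() {
	halves := [2][]int{{1, 2, 3, 4}, {5, 6, 7, 8}}
	results := make(chan int)

	for _, h := range halves {
		go func(part []int) { // explicit parallel construct
			sum := 0
			for _, v := range part {
				sum += v
			}
			results <- sum
		}(h)
	}

	total := <-results + <-results // wait for both partial sums
	fmt.Println("total:", total)   // 36
}
```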

One area where parallelism is likely to be used is in the post-PC-centric world predicted by Microsoft’s chief software architect Ray Ozzie. This is where the execution of services takes place within “the cloud”.

There is an underlying cultural shift creeping up as part of this change, says Prentice. “Consumers are willing to use remote services such as storing pictures on Flickr, and that is the way to go.

“The trouble is that in the West we are still into the mindset of owning data, so we need huge disc drives of our own. In the West the PC will not go away, despite there being better solutions.” This may mean that other cultures will seek to grab software leadership here, particularly in the enterprise world.

The “cloud” is also an increasingly important component in the future, when communications will be a core capability. Allan Russell, senior vice-president for strategy at software supplier SAS Institute’s international division, says, “We have moved from DP to IT to ICT and the ‘C’ implies that devices now do more ‘connecting’ than ‘calculating’. The ability to digitise and reconstitute artefacts such as sounds, pictures and objects becomes much more useful if you can then time- or location-shift the items concerned.

“This has changed SAS. We used to spend our time thinking about how to create content and now we are concerned about how to disseminate and embed the content created.”

One of the more easily predicted developments for the future is the long-term drift towards utility-based delivery of services. As Prentice says, “The problem in the enterprise is no longer the cost of buying, but the cost of management.” The natural consequence he sees coming from this is a move to thin clients and a utility-based future. The arguments against this will, however, again be cultural rather than technical.

“The concept of hotdesking is the way to go, but this already hits the problem of, ‘If I do not have a desk, then I am not important’,” says Prentice. “We also need to get people trusting external storage rather than their own. Arguably, external storage is safer – who regularly backs up their own drives? And the technology for seamless background back-ups already exists.”

Cathcart agrees. “I buy into a lot of the service functions for things we traditionally used the desktop for. Increasingly, I do things over the net now that I would never have considered possible before. My Treo 650, for example, is backed up over the internet and my bookmarks are on del.icio.us.

“I think it will be the ability to integrate and absorb Web 2.0 services for business use that will drive the utility aspect. After all, how many companies could afford a googleplex, even a mini-one, to do internal indexing? Why not farm all those functions out via a service?”

Although the technology needed for the future is already largely in place, IT is now so embedded in everyday life that it will be external events that drive which technologies come to the fore, particularly in the enterprise.

As Prentice says, “The application of technology will be driven by events such as terrorism, Sars or an equivalent, or a healthy tax-break. These will now be the triggers for change.”

This was first published in November 2006