Software trends for the 21st century

Changes in processor architecture will lead to a transformation in software design, according to Andrew Herbert, managing director of Microsoft Research, Cambridge.

In a lecture organised by the Royal Signals Institution and BCS, he said software practices would become obsolete this century.

“The 20th century was all about getting as much as you could out of the processor – but we now have riches of processing power, and software is no longer constrained by hardware limitations.”

Herbert said uni-processors were being replaced by processors with multiple CPUs, which would become the norm because pushing clock speeds higher demanded too much power.

“A large cache on a single chip is the only way to improve performance when Moore’s Law runs out.”

He said there was no choice but to move towards parallelism. But attempts to tease the parallelism out of a sequential program automatically had not worked, so the problem of concurrent programming had to be addressed – and better languages were needed.
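The contrast Herbert draws can be illustrated with a small sketch (not from the lecture): in Java, a sequential stream computation can be turned into a parallel one explicitly, rather than relying on a compiler to tease parallelism out of sequential code automatically.

```java
import java.util.stream.LongStream;

public class ParallelSum {
    // Sequential sum of 1..n on a single core.
    static long sequentialSum(long n) {
        return LongStream.rangeClosed(1, n).sum();
    }

    // The same computation, explicitly marked parallel: the runtime
    // splits the range across the available cores. The programmer,
    // not the compiler, decides where the parallelism lives.
    static long parallelSum(long n) {
        return LongStream.rangeClosed(1, n).parallel().sum();
    }

    public static void main(String[] args) {
        long n = 1_000_000L;
        System.out.println(sequentialSum(n) == parallelSum(n)); // both give the same total
    }
}
```

The sketch works here because summation is associative; for operations with shared mutable state, the concurrent-programming problems Herbert describes reappear, which is why he argues better languages are needed.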

Languages such as C and C++ were often unreliable and caused memory leaks. Buffers could be overrun, so worms could exploit them. Low-level languages had errors in handling array and heap storage.

But Java and C# did not have many such errors, said Herbert. Until recently, Java and C# were considered too expensive in terms of space and time. He advocated moving towards languages with automatic storage management.
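A minimal sketch of the safety property Herbert credits to these languages: every array access in Java is bounds-checked at runtime, so an out-of-range index raises an exception instead of silently reading or overwriting adjacent memory, as unchecked low-level code can.

```java
public class SafeAccess {
    // Returns the element as a string, or a rejection message if the
    // index is out of range. The runtime's bounds check, not the
    // programmer's discipline, prevents the stray memory access.
    static String read(int[] data, int index) {
        try {
            return "value: " + data[index];
        } catch (ArrayIndexOutOfBoundsException e) {
            return "rejected: index " + index + " out of bounds";
        }
    }

    public static void main(String[] args) {
        int[] data = {10, 20, 30};
        System.out.println(read(data, 1)); // value: 20
        System.out.println(read(data, 7)); // rejected: index 7 out of bounds
    }
}
```

The check costs a comparison per access, which is part of the space-and-time expense that, as Herbert notes, only recently became affordable.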

At one time, virtual memory was used with paging to compensate for small and very expensive mainframe memory, he said. But now paging slowed down the system and memory was cheaper.

Herbert said the testing needed to verify software had its limits. As projects got larger, less time was spent on coding and more on debugging, documentation, and support and management.

“More formal methods are needed to improve the quality of new code and help maintain old code. This is now possible thanks to increased processor power and memory capacity, along with developments in model checking and theorem proving.”

Storage potential had grown hugely, including personal storage devices such as the iPod. Herbert reckoned that by the time he retired, everyone would be carrying round a terabyte. “Everyone will carry their own personal archive and that data will also be stored on the web.”

He added: “We will need a new way of organising things. The historical structural way of organising folders is already breaking down.” Research into digital tapestries could help with object recognition and be used for videos and images.

Microelectronics-based screen and projection technologies might be combined with wireless networking, increasing processing speeds and reducing costs, he said.

“We can envisage a future in which we combine new display formats with machine perception techniques that allow input via handwriting, gesture, touch, speech, or placement of physical objects to create interactive surfaces.”

Herbert said he expected ‘intelligent’ applications to gradually overtake artificial intelligence.

