
CW@50: The Great British chip invention

There is a link between the world’s first working computer and the world’s most successful chip: they are both British


On 21 June 1948, the world’s first stored-program computer ran its first program. The Manchester Baby eventually evolved into the Ferranti Mark I.

Acorn used semi-custom chips from Ferranti in the design of the BBC microcomputer, which gave the home computer company the skills and tooling it would later need to design its own microprocessor, a device vastly smaller than the Baby.

One of the hallmarks of the British chip industry was the relatively small size of the UK players compared with the global semiconductor giants. An example is Inmos, a semiconductor company based in Bristol, which developed an innovative microcomputer called the Transputer.

The first system on a chip

According to Transputer chief designer David May, Inmos was a full-scale semiconductor-manufacturing organisation.

“[It was] part of industrial policy of the then Labour government – through the National Enterprise Board – to revitalise industry,” he says, referring to the £50m grant Inmos was given to start up a UK semiconductor industry.

The original idea stems from some of the work Inmos founder Iann Barron had done on a minicomputer called the Modular One. Barron, who originally designed the Elliott 803, had the idea of a modular minicomputer with a separate processor, memory and peripherals that could be configured flexibly.

“We wanted to put an entire computer on a chip,” recalls May. “It was a fairly big task, but we used an extremely simple design which we could then integrate entirely on to one chip. We even tried to push it on to a quarter of a chip.”

The Transputer was the first device to integrate an entire computer on a single chip. It was also the first microcomputer designed to support parallel computing.

But former Computer Weekly microchip columnist Martin Banks argues that UK chip companies lacked the funding to serve the global semiconductor market.

“The Transputer could have been a good idea. The biggest problem was that there wasn’t any appreciation for the semiconductor market,” he says.

According to Banks, the UK government grant was the bare minimum to establish a wafer fabrication (fab) plant. The Inmos fab was based in Newport.

“You could spend that amount of money just on the design,” he says. In fact, adds Banks, wafer fabrication was the challenge that let most companies down. “Scaling chip production to a mass market is expensive, so the Transputer was very specialised.”

What makes the Transputer noteworthy is that it was engineered from the bottom up as a scalable parallel computer architecture.

Thirty years ago parallel computing was amazingly hot stuff, Banks tells us, but in the wider industry “nobody was really thinking about it”.

The likes of Intel, Motorola and National Semiconductor – giants of the global chip industry – did not make parallel computing architectures. And without the major players raising awareness, Banks says the technology struggled to gain wide acceptance.

For May, parallel computing remains a problem for programmers today. He says procedural programming is a series of instructions, so it is relatively easy to understand.

“The problem is when you do parallel programming, you have to think of a collection of components passing information between one another,” explains May.
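The Transputer’s native language, occam, was built around exactly this model, and its ideas of communicating sequential processes live on in languages such as Go. The sketch below is illustrative only – not Transputer or occam code – but it shows the style May describes: components that share no state and exchange information purely by passing messages.

```go
package main

import "fmt"

// square is one self-contained component: it owns no shared state and
// talks to the outside world only through its channels, much as an
// occam process talked over Transputer links.
func square(in <-chan int, out chan<- int) {
	for v := range in {
		out <- v * v // receive a value, reply with its square
	}
	close(out)
}

func main() {
	in := make(chan int)
	out := make(chan int)

	go square(in, out) // run the component in parallel

	go func() {
		for i := 1; i <= 5; i++ {
			in <- i // hand work to the component as messages
		}
		close(in)
	}()

	for v := range out {
		fmt.Println(v) // prints 1 4 9 16 25
	}
}
```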

Besides microcomputers such as the Transputer, UK firms dabbled in various chip designs in the 1970s and 80s.

According to The National Museum of Computing (TNMOC), companies including ICL, Plessey, Elliott and Ferranti did early microcircuit development, working on logic gates rather than full-blown microprocessors.

Among the world firsts for the UK was Ferranti’s uncommitted logic array (ULA), a kind of application-specific integrated circuit. TNMOC said this was used by Professor Andy Hopper at Cambridge University in the design of the famous Cambridge Fast Ring – a network supported by Acorn Computers and Olivetti Research, which offered 15Mbps point-to-point LAN connectivity.

“Early ULAs were still laid out by draughtsmen on tracing paper, then photographically reduced to a mask,” recalls TNMOC’s Andrew Herbert. “I remember Andy Hopper using etched silver wires and a pair of manipulators borrowed from the biology labs to test his Cambridge Fast Ring ULAs in around 1980.”

Banks says the logic circuits in the ULA were left unconnected at manufacture, rather like today’s field programmable gate arrays, in which the logic can be wired up after the chip has been made to perform a specific task.

Ferranti worked with manufacturing companies to customise the ULA by connecting its gates in different ways. To do this, Banks explains, Ferranti had a small manufacturing plant in Manchester.
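As a rough illustration of the principle (a toy model, not Ferranti’s actual process), the gates on a ULA can be thought of as fixed at manufacture, with only a final wiring table, supplied per customer, deciding what the part computes:

```go
package main

import "fmt"

// Toy model of an uncommitted logic array: a fixed pool of NAND gates
// whose inputs are only connected by a customer-specific wiring table.
type gate struct{ a, b int } // indices of the signals feeding this gate

// evaluate runs the array: signals 0..len(inputs)-1 are the chip's pins,
// and each gate appends one more signal to the list.
func evaluate(inputs []bool, wiring []gate) []bool {
	signals := append([]bool{}, inputs...)
	for _, g := range wiring {
		nand := !(signals[g.a] && signals[g.b])
		signals = append(signals, nand)
	}
	return signals
}

func main() {
	// Same "silicon", two customisations: only the wiring table changes,
	// just as Ferranti committed a ULA to one customer's design with a
	// final interconnect layer.
	notGate := []gate{{0, 0}}         // NAND(a,a) = NOT a
	andGate := []gate{{0, 1}, {2, 2}} // NOT(NAND(a,b)) = a AND b

	fmt.Println(evaluate([]bool{true}, notGate))       // [true false]
	fmt.Println(evaluate([]bool{true, true}, andGate)) // [true true false true]
}
```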

“It took over a few cotton mills to make these devices. But there was a lack of good investment to make a ‘stonkingly’ good manufacturing facility for the ULA,” he says.

In spite of this, the ULA did find its way into products and, some experts believe, set the foundation for Britain’s most successful semiconductor company.

“The ULA was our first toe in the water for chip design,” says Steve Furber, who at the time was the principal designer for the BBC Micro at Acorn. “We used two on the BBC Micro, one for video handling and the second one for handling serial processing.”

For Furber, the use of the ULA influenced Acorn to invest in chip design technology and tools from VLSI Technology, laying the foundation for an audacious project: the ARM microprocessor.

How did ARM start?

Entrepreneur Alex van Someren, who co-founded nCipher and now works as a managing partner at Amadeus Capital, says he was lucky enough to work at Acorn as a youngster. After he applied for a summer job at 13, the company’s co-founder, Hermann Hauser, gave him a new home computer Acorn was developing to try to program.

“I used the Acorn Atom, the predecessor of the BBC Micro, in 1980, and I worked at Acorn until 1984,” says van Someren.

For him, this period of the early 1980s was the boom time for computing around the world. “Apple brought out the Apple 1 and 2, the BBC Micro was very successful, and Acorn built a line of business that ultimately became ARM.”

Acorn was doing remarkably well with the BBC B microcomputer in the education market, and following this success it went on to release the Electron. But while the first generation of home computers ran on 8-bit processors, a new generation of higher-performance 16-bit machines based on the 68000 was coming to market.

According to van Someren, Acorn looked at the new Intel and Motorola processors, but “none were as efficient as the 6502 chip Acorn was already using”.

Meanwhile in the US, Stanford University and the University of California, Berkeley, began developing alternative microprocessor architectures – known as reduced instruction set computers, or Risc.

In 1983 Furber, who became lead designer for the ARM microprocessor, and Sophie Wilson, who designed the ARM instruction set, were in the US looking for a replacement for the Mos 6502 chip that powered the BBC B.

“We went to see how the successor to the 6502 from Mos was being designed,” says Furber. “We expected to find a typical American company, but the design work was being done in a suburban bungalow in Phoenix. It was a cottage-industry feeling and we felt if they could design a microprocessor, we could too.”

Acorn decided to leap a generation of microprocessor architectures from 8-bit right up to 32-bit with the Acorn Risc Machine, later to be renamed Advanced Risc Machine, or ARM.

According to van Someren, “it was a calculated risk to develop its own Risc chip, but Acorn knew it needed another product”.


The new chip powered the Acorn Archimedes, regarded by many as a state-of-the-art computer of its time, which competed with popular Atari and Commodore machines. And Acorn would very much have remained a home and education computer maker had it not been for a happy coincidence.

“While there was quite a lot of choice for desktop microprocessor chips, there were not many low-powered devices, apart from ARM,” explains Centre for Computing History curator Jason Fitzpatrick.

The Risc architecture effectively meant the ARM was a simpler microprocessor than the so-called Cisc (complex instruction set) chips, so it needed fewer transistors and drew less power. “This low power has a huge impact on battery life,” adds Fitzpatrick.

Now Apple desperately needed a low-powered processor for the heart of its latest and greatest invention of the time: the Apple Newton personal digital assistant.

“It needed a partner it could work with, so we ended up with the ARM core that has Apple tech around it,” says Fitzpatrick.

The only problem for Acorn, according to industry watchers, was that it was a big rival to Apple. This was solved by spinning off ARM as a 12-person company headed by Robin Saxby. ARM was then able to form a joint venture with Apple to license its processor technology – now called the ARM core.

“ARM decided to license its technology,” says Banks. “It knew it couldn’t produce chips for a worldwide market.”

But the 12-person operation could still meet Apple’s needs. The ARM model was to do the design and architecture of the Risc microprocessor, then license the design to third-party manufacturers to build and customise.

“No one did IP licensing in those days,” says ARM co-founder and chief technology officer (CTO) Mike Muller. “There was also a degree of luck and timing.”

Specifically, the embedded systems market was starting to blossom.

“Moore’s Law was allowing people to combine multiple chips into a single chip, but if you didn’t have a microprocessor, you wouldn’t win,” says Muller.

And today, he says, ARM has more processors in devices than any other chipmaker, with embedded technology in everything from anti-lock brakes to smartphones and servers.

The idea of licensing IP, which ARM pioneered, now underpins what is known as the fabless semiconductor industry, in which the actual manufacture, or fabrication, of chips is outsourced to semiconductor foundries.

Echoes of the Transputer

Transputer chief designer May says the UK’s expertise in embedded system-on-chip architectures, which began at Inmos, is combining with parallel computing strengths from academia in new and innovative ways.

May has taken the original Transputer ideas and moved them into the 21st century, creating the Bristol microcontroller startup Xmos. Its embedded system on a chip has found a niche in digital audio processing.

“If you have USB 2 audio to connect in headphones or a digital-to-analogue converter, then there is a good probability it will use an Xmos device for real-time digital signal processing,” he says.

According to van Someren, multi-threaded computing in Xmos devices shares much in common with the approach the original Transputer took.

“Its biggest market opportunity is in high-quality microphones for voice control,” he says.

The parallel processing on the Xmos chip is able to handle multiple audio feeds simultaneously, cleaning up background noise before the audio is fed to a cloud-based voice recognition system. The Amazon Echo virtual digital assistant uses this technology to provide voice control and querying.
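A rough sketch of that pattern in Go – the function names and the trivial “clean-up” arithmetic here are invented for illustration, not Xmos’s API – shows one concurrent worker per microphone feed, with the results mixed into a single stream for a cloud recogniser:

```go
package main

import (
	"fmt"
	"sync"
)

// denoise stands in for real noise suppression; the scaling here is a
// placeholder, not actual signal processing.
func denoise(feed []float64) []float64 {
	out := make([]float64, len(feed))
	for i, s := range feed {
		out[i] = s * 0.9
	}
	return out
}

func main() {
	feeds := [][]float64{ // samples from, say, a multi-mic array
		{0.10, 0.20, 0.30},
		{0.11, 0.19, 0.31},
	}

	cleaned := make([][]float64, len(feeds))
	var wg sync.WaitGroup
	for i, f := range feeds {
		wg.Add(1)
		go func(i int, f []float64) { // one worker per feed
			defer wg.Done()
			cleaned[i] = denoise(f)
		}(i, f)
	}
	wg.Wait()

	// Mix the cleaned feeds into one stream for the recogniser.
	mixed := make([]float64, len(cleaned[0]))
	for _, f := range cleaned {
		for i, s := range f {
			mixed[i] += s / float64(len(cleaned))
		}
	}
	fmt.Println(mixed)
}
```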


Today, companies such as Xmos are among the cluster of UK semiconductor startups operating around Bristol and Cambridge. While UK manufacturing has been in decline for the past few decades, the UK chip industry seems to be doing remarkably well.

Not only is the world’s most widely deployed microprocessor designed here in Britain, but embedded-system-on-a-chip design is becoming a repeatable UK success story.
