No Sign of Yesterday - Part I

As we move ever forward into the future, are we hemorrhaging important information from the past? Robert Dowell, a volunteer at The National Museum of Computing at Bletchley Park, looks at the challenges of keeping information alive long enough to save it for future generations, before it is lost to the past forever.

A constant theme at The National Museum of Computing is how to conserve both the hardware and the software. We also have many requests from people asking if we are able to retrieve their precious data from some old media that is gathering dust. In most cases we have been successful, but in some rare cases the data cannot be extracted, mainly because the originating hardware is either hard to find or long since dead.

Take, for instance, a recent request to extract some data from an old 8-inch floppy disk. Finding a variety of disks was not the problem, and neither was creating a hardware setup that could theoretically read them. No, the essence of the problem was making sure we had all the information - and then meeting the expectations of the original author. It seems that a person's memory of what was on some of this stuff is as flaky as the media that stores it.

Another recent challenge has been to locate a drive that was common in 1996 but is all but gone now. We have something similar, but because of the type of heads in the device and its loading mechanism, we are unwilling to risk destroying vital parts of the data, and we have yet to locate the correct drive. I am sure that with care and time we will find the drive and successfully extract the data, but this problem is only set to get harder.

As if to illustrate the point, the other day I was looking through some old computer magazines of mine when it struck me that all the information stored on old computers is set to vanish if nothing is done soon. Back in the 1970s and early 80s, hardware able to read such data was easily available. Newer systems, however, mean that the old data is invariably left in 'storage', and the number of people who know how to operate the old equipment grows smaller as time moves on. So even if we do manage to locate the hardware, there is no guarantee that we will be able to understand how it works.

The chance of retrieving old data decreases with each passing year. Despite our best efforts to save the past, figures are far from encouraging as each day more data is added to the mountain that is already out there.

To put this in context, Eric Schmidt, Google's CEO, said at a technology conference in 2010 that every two days we now create as much information as we did from the dawn of civilisation up to 2003. That alone should make us stop and think about the problems we face in keeping the past alive, and make us a little more judicious about what we try to keep.

Robert Dowell is a volunteer at TNMOC and a photographer by profession. You can see some of his photographs in TNMOC publications and in the media.


Turing's SatNav... is it more intelligent than a chicken?

With the recent news of Bletchley Park securing Alan Turing's papers, TNMOC volunteer Pete Chilvers recalls how he gave his new SatNav an unusual test, and wondered how intelligent it really was...

With meetings now under way to plan the celebration of Turing's centenary in 2012, and a new SatNav boasting Intelligent Routes (iQR) in my hands, my mind wandered to how it would fare in a Turing Test! That is: how does the SatNav compare to a navigator (mate / wife / self / chicken)?

And anyway, what is intelligence in the context of navigation?

Well, I'm reminded of someone's observation on how to show that a dog is more intelligent than a chicken. I was told to imagine a three-sided cage. Put the chicken 'inside' and its food outside: the chicken will go hungry as it tries and fails to break out. Do the same with a dog and it will, after a while, realise the futility of the direct route and go out the back and round.

On my way to TNMOC at Bletchley Park I naturally turn off the A5 dual carriageway, going right and south-ish, and on to the double roundabouts on the old A5 Watling Street. But when I tried out my new SatNav, it sent me off to the left, north, and the wrong way. What?! But then at the next roundabout came a sharp right and on to that double roundabout, but without the three sets of traffic lights. Wow! Much faster, and now my regular route. One up for the SatNav - and more intelligent than a chicken?

Later I visited the Cold War exhibition at the RAF Museum Cosford, travelling to it via motorway. A nice feature of the SatNav is the ability to ask for an alternative route, and this I requested for a pretty way back to the south. Travelling into a town I was told 'Turn Right', which I did obediently at the traffic lights, and a moment later 'Turn Right' again (now going north!). Hmm! Perhaps OK, but a look at the map display showed I was about to 'Turn Left' and be taken round a green in the centre of a little estate and back out to the traffic lights, where I would turn right to continue along the original road southward! I found myself crossing a river bridge, and on the other side I was again told to turn right. Ahh! The original instruction meant turn at the second set of lights south of the river. Oh well! I reckon a person would have told me more clearly, but at least the SatNav found a safe way to turn round rather than a three-point turn in the middle of a busy main road! Score here - one each?

But the clincher came driving in a familiar area of a city, with traffic getting congested and personal knowledge of the danger spots. As the SatNav gave instructions clearly taking me towards suicidal right turns, I repeatedly ignored it. At each moment of disobedience it announced 'Recalculating Route...', impressively quickly, over and over to keep up. And then I realised: in spite of all this, it had failed the Turing Test! Unlike a human navigator getting exasperated at being ignored, it had just got on with it - no huffing, no puffing, no comment!

Pete Chilvers is a long-standing volunteer at The National Museum of Computing and is often found wandering the corridors, passing on his wealth of historical knowledge to visitors and volunteers alike. He also gives guided tours of Bletchley Park. TNMOC is open Thursdays and Saturdays and most Bank Holidays; guided tours are available on other days subject to availability. We are also now offering Company Away Days.

Bootnote: Technology does not always work in your favour... The museum's postcode actually takes you to an old Bletchley Park entrance, which is closed. To find the new entrance, use the postcode of the local railway station, MK3 6DS.

Connecting the dots ... by Robert Dowell


As the future of computing moves ever further into the realms of science fiction, the human-machine interface takes on new and unexplored roles. Robert Dowell, a new TNMOC volunteer, explores the early days of software interfaces and how they affected him.

For me, the summer term of 1979 was largely forgettable, with the minor exception of my first experience with a computer.  It was a hot dusty time of year and I remember walking into what was a largely abandoned language lab, where a small corner had been given over to a mysterious new object, bathed in a warm yellow glow from the summer sun which shone through the windows.

All those who were interested had been told that this was, in time, going to be the new computer room, but beyond that very little was said.  We, as students, were free to use this machine, as long as we treated the squat orange box in front of an old black and white television with great respect.  Lying next to it was a portable tape deck, and a few sheets of badly photocopied paper lay scattered around the desk in small, disparate piles.

With the exception of the careful instructions that told me how to turn it on, little else made any sense.  Having performed the easy bit, my feeling of accomplishment vanished rapidly, replaced with despair.  On the television screen were six words on two lines - '(M)onitor (C)old Start (W)arm Start' followed by 'Select'.  I remember thinking, 'Okay, what is that supposed to mean and what do I do next?'.  Like every other inexperienced child, I feared that if I pressed the wrong key, it would all end in tears.

Being more than a little shy and only 11 years old, I was too embarrassed to ask, so I looked for anything that would help identify a solution to this problem.  It was a case of understanding all the words in the sentence, but having no idea what any of it meant.

Within a few hours I had it partially sorted and, like a misty morning that clears as the sun burns it off, my mind became clearer as I typed in my first program: a game called 'Four in a row'.  With renewed confidence and a sense that I could finally excel at something, I became a regular user of the Compukit UK101.

At that time, it all seemed such innocent fun as I explored the depths to which the machine would allow my then fledgling imagination to travel. 

As a growing adolescent I never stopped long enough to consider the implications of the technology on the society around me.  How this would not only change the environment I lived in, but also how it would shape my use of that technology. 

As I grew up, I watched the plethora of new computers appear and, like many others, I wondered how much more we could have in the same sized box, or how much faster it could go.  Like any technological undercurrent that excites those in its vicinity, we all got swept up by the wonderful vision that was changing, improving, moulding and pushing the hardware harder and faster in directions we never expected.

As our understanding of the human-machine interface improved, certain technologies - ones that a decade earlier would have seemed ideal - either failed to perform socially or were found to be lacking due to limitations in the hardware.  It was not the dream that failed us, but our expectations versus reality that brought many ideas crashing down with a resounding thud.

In many ways, the user interface in the early days of computing was a rudimentary link to the cold hard logic circuits within the machine.  As time progressed, the distance between the user and the hardware expanded, separated by human-machine interfaces driven by software.  This has helped facilitate a greater connection to the abstract, allowing us to perform ever more complex tasks.

During the expansive years of the 60s, many ideas of the future of computing were formed by visionaries such as Doug Engelbart, who, towards the end of that decade, believed that the tight links between hardware and software would eventually allow us to work in highly creative ways, using abstracts taken from real life. 

Throughout the early years of computing, much was governed by hardware and software limitations, which makes it all the more amazing that both the mouse as a pointing device and the idea of workgroup computing were invented then.  Inevitably this led to other directions that built on earlier ideas of how we could interact with a computer.

Earlier still, Alan Turing described what he believed to be the 'as yet unheard of' computer, and argued that it could be more than the sum of its parts - based on his belief and understanding at the time.  His basic premise provided an intellectual framework for scientists to work within, thus giving birth to the first digital programmable electronic computer - Colossus.

Early developments in the technology required changes to the way the human-machine interface operated and, as is so often the way, necessity became the mother of invention.  During those early years, the pioneers of the technology, who had been touched by its digital circuitry, had an intimate understanding of the panels, switches and lights that told them everything they needed to know.  To anyone else who looked upon this marvel of the modern age, it made little sense.  If the computer was ever going to come out of the scientific-curiosity closet, it was going to require an interface that was a little more friendly.  As increasingly novel ways of manipulating data arrived, along with new designs and technologies, the human-machine interface was simplified and streamlined.

All computers have basic similarities, but as we move further away from those early years and into the new millennium, the similarities end there.  As software methodology and understanding advance, so will the changes that drive our dreams.  At the forefront of this change will be the human-machine interface, as it meanders into an unpredictable, socially driven future.

The past is littered with attempts to change the way we input data, but unless they offer a tangible improvement on what we are comfortable with, they will inevitably fail to spark the imagination of our society.  Just because something is technologically advanced doesn't mean we want it.  Technology in today's society is a strange animal, more akin to the age-old conundrum of the chicken and the egg: is it the technology that changes society, or the society that drives the technological change?

Whatever the answer, the need for simple yet abstract constructs within the software interface will surely mould the society more than the technology.  As we gain a greater understanding of the way we work, and how we can use the tools at our disposal, the creative nature of the human spirit will surely have a greater impact on the future of the human-machine interface than the technology that drives it.

If you don't believe me, try to remember, if you can, what it was like before email, the internet, the mouse, high-resolution displays, laptops, mobile phones, hard disks, USB and, finally, the graphical user interface.  It's just a case of connecting the dots...

Robert Dowell

 

Back to BASIC


Delwyn Holroyd, a volunteer at TNMOC, asks if there is anything new under the sun.

This week's news from TNMOC about schoolchildren learning to program on BBC Micros from 1981 has caused quite a stir. Although the machine itself must seem hopelessly archaic, they learn that the skills required to program it are the same as for any modern machine. The simplicity of the BBC allows them to really understand what it's doing at a low level, something which is lost behind layers of complexity on a modern PC.

We are all familiar with the saying 'there is nothing new under the sun', but it may seem strange to apply it to computing, when every week there is a new must-have system or gadget. We have certainly witnessed incredible progress in the technology of implementation, resulting in storage densities and clock speeds that were almost inconceivable when the BBC Micro was first introduced into schools nearly three decades ago. However, if we look instead at the design principles used in modern systems, a different picture emerges.

Here are two examples, but there are many to choose from.

First, virtualisation, the ability to run several logical systems on one physical server, is a hot topic nowadays. The phrase 'bare-metal hypervisor' may have been coined relatively recently, but the technique was implemented on mainframes from IBM and ICL as far back as the 1960s.

The ICL operating system CME (or Concurrent Machine Environment) from the 1970s allowed customers to run two mainframe workloads on one system, and the designers had to grapple with all the same issues of shared access to peripherals and resources that challenge the modern hypervisor designer.
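To make the idea concrete, here is a toy sketch in Python (my illustration only - not how CME or any real hypervisor was written) of that arbitration problem: two logical workloads sharing one physical peripheral, with a lock standing in for the hypervisor's scheduling of access.

    # Toy illustration: two logical systems multiplexed onto one physical
    # machine, with a lock standing in for the hypervisor's arbitration of
    # a shared peripheral. Real hypervisors do this in privileged code with
    # hardware assistance, not with Python threads.
    import threading

    printer = threading.Lock()    # one shared physical peripheral

    def workload(name, jobs):
        for job in jobs:
            with printer:         # only one logical system prints at a time
                print(name, "printing", job)

    old = threading.Thread(target=workload, args=("old-system", ["payroll", "ledger"]))
    new = threading.Thread(target=workload, args=("new-system", ["invoices"]))
    old.start(); new.start()
    old.join(); new.join()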

The rationale for CME was different of course: the idea was to allow customers to migrate gradually from their old system onto the (then) 'New Range' of machines. The benefits touted for virtualisation today are ease of management, greater flexibility, reduced cost of ownership compared with multiple servers, and so on. This pitch could have been lifted straight from the pages of a mainframe salesman's training manual!

Second, the mainframe systems of old were inherently unreliable. The sheer number of discrete circuit boards, cabinets, ribbon cables and connectors, coupled with minimal protection against electromagnetic interference, ensured that bit flips happened with depressing regularity.

In the early days this would generally lead to a system crash, but gradually designers evolved techniques to cope with the problem. Redundant functional units, parity bits, error detection and correction, and the retrying of failed operations all became common.
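To give a flavour of the simplest of these techniques, here is a minimal sketch of even parity in Python (an illustration of the principle only, not any particular machine's circuitry): one extra bit per word lets the hardware detect, though not correct, a single flipped bit.

    # Even parity: store one extra bit so that the count of 1 bits is always
    # even. Any single bit flip breaks that property and can be detected
    # (though not corrected), prompting a retry of the failed operation.

    def parity(word):
        return bin(word).count("1") % 2

    def store(word):
        return word, parity(word)       # what gets written alongside the word

    def check(word, p):
        return parity(word) == p        # recompute parity on read-back

    word, p = store(0b101101)
    print(check(word, p))               # True  - data intact
    print(check(word ^ 0b000100, p))    # False - a single bit has flipped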

Advances in technology led to much higher inherent reliability, and many of these techniques were no longer necessary.

However, IC densities are now reaching the point where a transistor, the basic switching element, can no longer be relied upon to switch reliably. Future IC designs will need to cope with this new unreliability of the basic implementation technology, and designers are turning once again to the old techniques. I hope they remember to consult their old textbooks.

Delwyn Holroyd is the restoration team leader for the ICL 2966 at TNMOC. You can follow his progress on the TNMOC restoration project page.

Popular 1960s British computer still putting on a show

TNMOC volunteer Peter Onion reports on the success of one of Britain's iconic machines, both in industry and as one of the larger, working exhibits at the National Museum of Computing.

This coming weekend at the Vintage Computer Festival at TNMOC, I will as usual be found tending the Museum's oldest original operational computer, an Elliott 803. In fact this weekend, I'll be giving formal talks on it too.

The Elliott 803 was a very popular machine in its day, with over 200 sold between 1960 and 1965 (in 2011 we are hoping to arrange a celebration of the 50th anniversary of the first shipment of an 803B). The 803 was used in a wide variety of business, industrial, scientific and military applications, and luckily a list of Elliott's 803 customers has survived. It included:

  • The GPO, which used an 803 at its Goonhilly Downs satellite earth station to calculate the path of the first communications satellite, Telstar, enabling reception of the first transatlantic satellite TV pictures in 1962.
  • Corah Knitwear in Leicester, which managed its orders and production on a pair of 803s.
  • A poultry farm in Yorkshire, which used its 803 to analyse each hen's egg production to aid its chicken-breeding programme.
  • Several 803s were shipped overseas, with some eventually finding their way to Russia and other Eastern Bloc countries. Some even went to the USA as part of industrial process control systems!

The TNMOC Elliott 803B was manufactured in 1962 and, apart from a period of about 12 years in a barn, has been in regular use ever since. It was the first "large" machine to be installed at TNMOC.

Almost every week, I have the pleasure of hearing TNMOC visitors say: "I used one of these when I worked at .......". Some of them were employees of Elliott Automation who installed or maintained 803s and some were students who used an 803 at university to analyse their research.

Of course most of our visitors have never even heard of the Elliott 803, so it is always a pleasure to give them a potted history of the machine. For many of them it is the largest computer they have ever seen close-up - that is until they move on past the 803 and are confronted by TNMOC's huge ICL 2966 installation.

More technically minded visitors get treated to an in-depth discussion of the finer points of the 803's design. Once explained, its unfamiliar serial architecture and seemingly bizarre 39-bit word length are often described as "clever" or "ingenious". But those were the days before the byte ruled over machine word sizes, when the lower gate count and smaller physical size of a serial computer made a machine affordable to many more potential customers. The 803 was the last of Elliott's serial machines, all its successors using parallel architectures.
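For readers wondering what 'serial' means here: the machine passed each word through a single one-bit adder, one bit per clock tick, instead of providing 39 adders working side by side. Here is a rough illustration of the principle in Python - not 803 code, just a sketch of the idea:

    # Illustration only: adding two 39-bit words one bit at a time, the way
    # a serial machine does, using a single one-bit adder and a carry store.
    # A parallel machine needs an adder stage for every bit of the word.

    WORD_LENGTH = 39

    def serial_add(a, b):
        result, carry = 0, 0
        for bit in range(WORD_LENGTH):       # one bit per clock tick
            bit_a = (a >> bit) & 1
            bit_b = (b >> bit) & 1
            total = bit_a + bit_b + carry
            result |= (total & 1) << bit     # the sum bit for this position
            carry = total >> 1               # carry held over to the next tick
        return result & ((1 << WORD_LENGTH) - 1)

    print(serial_add(123456789, 987654321) == 123456789 + 987654321)   # True

Far fewer gates, but 39 ticks per addition: the trade-off that made the machine affordable.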

The ultimate demonstration of our 803 is to see and hear it running its Algol 60 compiler. The compiler was produced by a small team of only three programmers led by Tony Hoare (now Professor Sir Tony Hoare), and the availability of such an advanced high level language made a major contribution to the commercial success of the Elliott 803.

Peter Onion leads the Elliott 803 restoration team at the National Museum of Computing. You can find him at the VCF this weekend in the Large Systems Gallery where he will also be giving talks on the 803 at 12.15 and 15.15 each day.







Restoration: Impossible


With Britain's first-ever Vintage Computer Festival at TNMOC, Bletchley Park on 19-20 June, TNMOC volunteer Delwyn Holroyd speculates that over the next 30 years a computing museum may be forced to become mainly a computer museum.

Whilst hammering some new bearings onto a drive motor shaft the other week, a couple of thoughts crossed my mind. Firstly, how computer restoration can often be more akin to car repair than to electronics. Secondly, how fortunate it is that you can still buy new bearings off the shelf, for less than a tenner, to repair a 30-year-old disk drive - one of the 14-inch removable disk pack types made by Control Data Corporation and used widely throughout the mainframe and minicomputer industries of the 1970s and 80s. Visitors to The National Museum of Computing often comment on how much like top-loading washing machines these drives look, and indeed the technology is possibly closer to the washing machine than to the modern hard drive.

It isn't just the industrial nature of the technology that aids the restorer tackling computer hardware of this era - manufacturers used to be a lot more open in the information they provided about their products. The technical manual for this particular CDC drive runs to three volumes, including full schematics, wiring lists, and a reference section written in an agreeable university tutorial style: very useful for repairing the drive, but equally valuable for learning about analogue electronic design techniques.

Fast-forward to 2040, and the restorer attempting to get a 30-year-old Blu-ray drive working will face some significant obstacles. It will be next to impossible to find any schematics, none of the specialised integrated circuits will be available and, to cap it all, trying to extract the firmware for analysis will probably be illegal on the grounds of reverse engineering.

We are facing today the culmination of a process that started sometime in the 1980s - a steady shift from 'open source' hardware to 'no user serviceable parts inside'. Should we care? After all, what's special about the PC on your desk that might make it worthy of restoration in the distant future? In many cases it won't matter, but it's notoriously difficult to recognize in the present just what will be interesting or valuable to future historians.

The open source community has been very successful in raising awareness of the benefits of openness in a software context, but exactly the same considerations apply in the hardware world. Increased availability of hardware technical information would ensure more of it survives into the future. It's hard to imagine a scenario where the source code to Linux is ever lost, but you may well have just thrown out the last copy of the technical manual for what will be viewed as a landmark machine in 30 years' time.

If the current trend continues, the computer museum of the future will have far fewer working exhibits. Ironically, the machines from the 60s and 70s will still be going strong (chunky bearings will probably still be available) but that shiny hex-core PC will just be another non-functional box gathering dust in a display cabinet.

Delwyn Holroyd is the restoration team leader for the ICL 2966 at TNMOC. You can follow his progress on the TNMOC restoration project page.

Saint George's Operating System

One of the UK's great workhorses of computing, the ICL 1900 series is now just a distant hum in the background, but its operating system, GEORGE, lives on in many people's memories and should be celebrated today, St George's Day!

The dragon that GEORGE was protecting us from was the hardware - a developing series of machines that grew and grew, taking advantage of improvements in technology.

The design of the 1900 itself dates back to a Ferranti paper exercise developed by Harry Johnson, and hence often known as the HARRIAC. The design wasn't used at the time and was passed to Ferranti-Packard in Canada, who developed it into their FP6000 series. The design of the FP6000 later came back to the UK as the basis for the 1900 machines. ICT demonstrated their first two 1900 computers at the Business Efficiency show at Olympia in 1965, and deliveries started in earnest in May 1965.

Rising above all the hardware was the operating system known as GEORGE, the GEneral ORGanisational Environment. Developed by a team led by George Felton in Stevenage, the system's name was actually a tribute to George himself. GEORGE would provide a consistent and reliable front-end to the developing machines.

Government support for ICL meant that nearly all UK nationalised companies and very many universities used 1900 machines throughout the 1960s and 70s.

I well remember undergraduate computing in the early 1980s, when a deck of punched cards would be passed to the computer centre reception ladies. These guardians, not to say dragons, of the university's 1904S would take our meagre offerings and add them to the job queue. Throughout the rest of the day the job list would be printed and pinned up next to reception - a list more than 2m long by the end of the day.

Of course, the low-priority jobs from us undergrads would get pushed further and further down the queue, until just before midnight the job would be run. Typically GEORGE would take one look at the control cards, run the deck past the compiler, and then spit out two or three pages of errors!

Calling in at the computer centre, typically on the way back from the pub, I could grab a quick look at the error report and attempt to punch a few replacement cards. Submitting the job at that time of night meant it might get another run before morning!

ICL sold 1900-based systems until almost the 1980s, but GEORGE lived on in the ICL New Range machines - the 2900 series. You can follow TNMOC's restoration of an ICL 2966 on the TNMOC restoration project page.

Finally, readers of this blog might like to know that they can get even closer to TNMOC by joining as a member. There is more information on the TNMOC website.

DEC's Legacy in Hardware and Software


Last weekend, I attended the DEC Legacy event in Windermere with Peter and Ben, two volunteers from TNMOC. It was a chance for enthusiasts to show off the fantastic range of equipment from one of the great computing companies.

There was something very special about Ken Olsen's Digital Equipment Corporation, and it continues to attract a lot of interest in its history and products. Its machines were innovative and high quality, and its software, particularly OpenVMS, has almost fanatical advocates.

From a museum conservation point of view, the wealth of printed documentation that DEC produced is a godsend. Not only do we have all those handbooks, but also sales literature, operating manuals and complete technical descriptions of the systems themselves.

In our attempts at TNMOC to restore as many computers as we can, DEC's solid technical backup is priceless.

Although TNMOC has DEC systems from the early 1960s, we still rely on OpenVMS running on nice new HP hardware (HP merged with Compaq, which had previously acquired DEC) for the internal services of the museum - our intranet, for example.

We plan to reinstate our retro operating systems machine soon. This is a publicly accessible web server allowing different systems to be used in virtual machines - everything from CP/M to RSTS/E and VMS. If anyone has a spare DS10, we would love to hear from you!

TNMOC provided prizes for the best displays and presentations, and it was a pleasure to be able to attend the event. Congratulations to the winners and to the event organiser!

Many of the people exhibiting at this event will also be coming to the first Vintage Computer Festival in June, so more people will have a chance to see these well-cared-for systems running again.

Finally, readers of this blog might like to know that they can get even closer to TNMOC by joining as a member and receiving extra benefits and information.

The one and only man who kept pace with a computer for almost 30 minutes


One of the joys of researching the history of computing is tracking down people who worked in the industry in the early days and who are happy to share their memories. This week I was lucky to talk to such a gentleman, who led a team of "human-computers" in 1952 in the run-up to the development of the Harwell-WITCH computer now being restored at TNMOC.

The early 1950s was a pioneering time in the UK and the computing frontier towns of the time were Manchester, London, Cambridge and a village near Oxford. Of course, the Oxfordshire village of Harwell did happen to have the UK's atomic energy research establishment on its doorstep!

Computing was an altogether more personal experience then: just a small team of young people equipped with voluminous mathematical tables and hand-cranked calculators.

At Harwell they were working week in and week out on calculations to support the design of Britain's first atomic power stations - calculations that could take days, if not weeks, to complete.

That is until the electronic division completed its first electronic computer and presented it to the human-computer team. It wasn't the fastest machine, but it was relentless and regularly worked well into the night.

The team of mathematicians weren't convinced, of course, and took a great deal of persuading to adopt the new machine. My contact set himself a race against the computer. He managed to keep pace for nearly 30 minutes - until his arm ached from turning the calculator's crank handle!

Given a few prompts, our pioneer also remembered just what the first program he wrote for the machine did, and the fact that it worked first time.

Perhaps we can all remember our first time, or at least when it worked!

Finally, while some people are preparing for the London Marathon, at TNMOC we are going through a rigorous selection procedure for volunteers who will attempt another race when the Harwell-WITCH computer is restored and running. Those with strong arms are eagerly sought.

On Digital Archaeology


One of the key speakers at the Vintage Computer Festival we are holding in June is Christine Finn, author of "Artifacts: An Archaeologist's Year in Silicon Valley".

At times I do feel like a traditional archaeologist, with all the attendant challenges, delights and disappointments. We don't often get to select our digging sites; instead we get called in as emergency archaeologists, to rescue computers and systems from an imminent wrecking ball or the local authority tip.

We often hear tales from visitors about machines that are no longer used and that will be lost if we don't do something about it soon. Sadly, that machine is often beyond salvaging, and frequently it is a system that we would have liked to have had. The lost machine then magically acquires an almost mythical status - it would have been rare, complete, fully operational and with all the original manuals and software!

This past weekend saw the museum rescue gang heading off to a small office in Aylesbury to collect an ICL System 25, 'together with disks, printer and all the manuals'. We didn't know quite what to expect. When we arrived, the reason for this machine's miraculous survival was apparent: it was at the far end of an office block, up a steep and complicated staircase! It was to require all our best (and improving) excavation techniques.

The System 25 was a very popular machine in the UK for small companies, and in particular for point-of-sale systems. It was an unusual computer in that it operated in decimal rather than binary. It was based on the US-designed Singer System 10, which had originally been designed by Friden. Singer acquired Friden, and then in 1976 ICL acquired Singer. The machine we were rescuing was a later-model System 25+ consisting of the processor cabinet, a disk cabinet containing two EDS80 80MB removable disk drives, and a 132-column line printer.
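For those unfamiliar with decimal machines: rather than holding a number as one long string of binary digits, each decimal digit is stored and processed separately, and arithmetic carries work in tens - a natural fit for pounds-and-pence point-of-sale work. A rough sketch of the idea in Python (the System 10/25's actual internal representation will differ in detail):

    # Sketch of decimal arithmetic: each digit is held separately and added
    # digit by digit with a decimal carry, rather than as one binary value.
    # Illustration only - the real System 10/25 representation differs.

    def to_digits(n, width=8):
        return [(n // 10**i) % 10 for i in range(width)]   # least significant first

    def decimal_add(a_digits, b_digits):
        result, carry = [], 0
        for da, db in zip(a_digits, b_digits):
            total = da + db + carry
            result.append(total % 10)    # one decimal digit of the sum
            carry = total // 10          # decimal carry into the next column
        return result

    print(decimal_add(to_digits(1995), to_digits(4250)))   # digits of 6245, least significant first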

The pair of 80MB disk drives were dismantled and carefully manhandled down the stairs; the removable covers of the (actually quite small) processor cabinet were taken off and, despite its lack of wheels, it was carefully got down to the ground floor.

That left the line printer to be moved! In the 1980s line printers were built to last: no flimsy plastic or aluminium covers, but heavy sheet steel, which also enclosed a large transformer and a heavy motor. It was impossible to take it apart, so five of us struggled with it down the stairs - one step at a time, with multiple stops to catch our breath! One hour later it was outside, as good as new.

The huge collection of manuals is particularly interesting. It allows us to reconstruct the use of computers in this company since the purchase of its first System 10 in September 1980. The site log records the engineers installing that first system, subsequent upgrades to an ICL System 25, and finally the installation of the System 25+. The company's last ICL machine seems to have been the Unix-based DRS3000.

I can only imagine the look of dismay when ICL's engineers delivered the machine and were told it was to be installed upstairs, but this simple fact seems to have saved the system for TNMOC! I suppose archaeologists find exactly the same thing: the more accessible, moveable artifacts have long since disappeared.

Our next steps are to examine and document the hardware we rescued, plan the operation to make safe copies of the disks, and of course continue the research into the back story of this very popular British computer. Once we have secure copies of the system and application software, we plan to construct a System 25 emulator that anyone can run, and the system itself will go on display at the museum in our Large Systems gallery.
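For the curious, an emulator is, at heart, a loop that fetches, decodes and executes instructions against a software model of the machine's memory and registers. A deliberately toy sketch in Python - the three opcodes below are invented for illustration and bear no relation to the System 25's real instruction set:

    # A toy fetch-decode-execute loop, the skeleton every emulator is built
    # around. The opcodes are invented for illustration; they are not the
    # System 25's actual instruction set.

    def run(memory):
        acc, pc = 0, 0                  # accumulator and program counter
        while True:
            op, operand = memory[pc]    # fetch the next instruction
            pc += 1
            if op == "LOAD":            # decode and execute
                acc = operand
            elif op == "ADD":
                acc += operand
            elif op == "HALT":
                return acc

    program = [("LOAD", 40), ("ADD", 2), ("HALT", 0)]
    print(run(program))   # 42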







A Golden Age of Computing?


I was recently asked to consider what might have been the Golden Age of Computing, and I think I have two answers.

With my serious computer history hat on, I opted for the 1960s.

We started the decade with a chaotic scene of computer manufacturers offering multiple, mutually incompatible products. Machines were supplied as 'bare metal', and the end user began the job of writing his (and it was almost always "his") own application software in the full knowledge that it would all need to be rewritten when the machine was upgraded. Programming computers was still very technical and difficult, and relied heavily on an in-depth understanding of the particular machine's design.

But then computing came of age. Hardware was transistorised, reliability increased, and standard programming languages were created. Many, like FORTRAN, COBOL and BASIC, are still with us today. Computers stopped being just boys' toys!

Women like 'Steve' Shirley took a lead - although the fact that she used the name "Steve" reminds us that she realised life was easier if potential customers initially thought you were another chap!

Our industry, Electronic Data Processing, was all conceived during the decade of flower power and free love.

By the end of the decade, computing had been transformed: applications could be bought off the shelf, mass-produced mini-computers were beginning to appear and computer purchasers could expect an immediate benefit.

A more personal answer is that I think we might each have our own golden age of computing, one with a common theme. It was probably when we were in our mid-twenties: no longer a novice in the industry, but not so far up the corporate ladder that serious customers shouted at us, and not yet having to break bad news to the board! We had new trainees to boss around and we knew everything about the system - we were masters of the universe... well, of the project!

So, taking my computer history hat off, long live the early 1980s! Just booting CP/M is still a thrill - so the Personal Computing Gallery at TNMOC is therefore one of my favourite haunts and a secret pleasure!

Coming soon: Random Access Memories


Computer Weekly is very pleased to be working with The National Museum of Computing.

The new blog will be launching soon.






