Connecting the dots ... by Robert Dowell


As the future of computing moves ever further into the realms of science fiction, so the human-machine interface takes on new and unexplored roles. Robert Dowell, a new TNMOC volunteer, explores the early days of software interfaces and how they affected him.

For me, the summer term of 1979 was largely forgettable, with the minor exception of my first experience with a computer.  It was a hot, dusty time of year and I remember walking into a largely abandoned language lab, where a small corner had been given over to a mysterious new object, bathed in the warm yellow glow of the summer sun shining through the windows.

All those who were interested had been told that this would, in time, become the new computer room, but very little else was said.  We, as students, were free to use this machine, as long as we treated the squat orange box in front of an old black-and-white television with great respect.  Lying next to it were a portable tape deck and a few sheets of badly photocopied paper, scattered across the desk in small, disparate piles.

With the exception of the careful instructions that told me how to turn it on, little else made any sense.  Having performed the easy bit, I felt a sense of accomplishment that vanished rapidly, replaced by despair.  On the television screen were six words on two lines: '(M)onitor (C)old Start (W)arm Start', followed by 'Select'.  I remember thinking, 'Okay, what is that supposed to mean and what do I do next?'.  Like every other inexperienced child, I feared that if I pressed the wrong key, it would all end in tears.

Being more than a little shy and only 11 years old, I was too embarrassed to ask, so I looked for anything that might help me solve the problem.  It was a case of understanding all the words in the sentence, but having no idea what any of it meant.

Within a few hours I had it partially sorted and, like a misty morning clearing as the sun burns through, my mind grew clearer as I typed in my first program: a game called 'Four in a row'.  With renewed confidence and a sense that I could finally excel at something, I became a regular user of the Compukit UK101.

At that time, it all seemed such innocent fun as I explored the depths to which the machine would allow my then-fledgling imagination to travel.

As a growing adolescent I never stopped long enough to consider the implications of the technology for the society around me: how it would change not only the environment I lived in, but also the way I used that technology.

As I grew up, I watched a plethora of new computers appear and, like many others, I wondered how much more we could have in the same-sized box, or how much faster it could go.  As with any technological undercurrent that excites those in its vicinity, we all got swept up in the wonderful vision that was changing, improving, moulding and pushing the hardware harder and faster in directions we never expected.

As our understanding of the human-machine interface improved, certain technologies - ones that in the previous decade would have seemed ideal - either failed to perform socially or were found lacking due to limitations in the hardware.  It was not the dream that failed us, but our expectations meeting reality, that brought many ideas crashing down with a resounding thud.

In many ways, the user interface in the early days of computing was a rudimentary link to the cold, hard logic circuits within the machine.  As time progressed, the distance between the user and the hardware expanded, separated by human-machine interfaces driven by software.  This has helped facilitate a greater connection to the abstract, allowing us to perform ever more complex tasks.

During the expansive years of the 1960s, many ideas about the future of computing were formed by visionaries such as Doug Engelbart, who, towards the end of that decade, believed that tight links between hardware and software would eventually allow us to work in highly creative ways, using abstractions taken from real life.

Throughout the early years of computing, much was governed by hardware and software limitations, which makes it all the more amazing that both the mouse as a pointing device and the idea of workgroup computing were invented then.  Inevitably this led to other directions that built on those early ideas of how we could interact with a computer.

Earlier still, Alan Turing had described what he believed the 'as yet unheard of' computer could be - something more than the sum of its parts - based on his belief and understanding at the time.  His basic premise provided an intellectual framework for scientists to work within, helping to give birth to the first programmable electronic digital computer: Colossus.

Early developments in the technology required changes to the way the human-machine interface operated and, as is often the way, necessity became the mother of invention.  In those early years, the pioneers of the technology, who had been touched by its digital circuitry, had an intimate understanding of the panels, switches and lights that told them everything they needed to know.  To anyone else who looked upon this marvel of the modern age, it made little sense.  If the computer was ever going to come out of the scientific-curiosity closet, it was going to need an interface that was a little more friendly.  As increasingly novel ways of manipulating data arrived, along with new designs and technologies, the human-machine interface was gradually simplified and streamlined.

All computers share basic similarities, but as we move further away from those early years and into the new millennium, that is where the similarities end.  As software methodology and understanding advance, so will the changes that drive our dreams.  At the forefront of this change will be the human-machine interface, as it meanders through an unpredictable, socially driven future.

The past is littered with attempts to change the way we input data, but unless they offer a tangible improvement over what we are already comfortable with, they inevitably fail to spark the imagination of our society.  Just because something is technologically advanced doesn't mean we want it.  Technology in today's society is a strange animal, more akin to the age-old conundrum of the chicken and the egg: is it the technology that changes society, or the society that drives the technological change?

Whatever the answer, the need for simple yet abstract constructs within the software interface will surely mould society more than the technology itself.  As we gain a greater understanding of the way we work, and of how we can use the tools at our disposal, the creative nature of the human spirit will have a greater impact on the future of the human-machine interface than the technology that drives it.

If you don't believe me, try to remember, if you can, what it was like before email, the internet, the mouse, high-resolution displays, laptops, mobile phones, hard disks, USB and, finally, the graphical user interface.  It's just a case of connecting the dots...

Robert Dowell

 
