Journey to the future

Over the past 40 years computers have transformed social and business life, despite remaining pretty unintelligent. Now the challenge is to get them thinking for themselves, writes Cliff Saran

When we began looking back on the 40 years of IT since Computer Weekly was launched, one thing became apparent. No matter how much technology has improved, no matter how far microelectronics has revolutionised the computer industry, computers are basically doing the same type of job.

In the 60s and 70s, science fiction writers presented a bleak future as artificial intelligence battled with human beings. Experts spoke of mass redundancies as automation took over. And the latest buzz was the paperless office.

Yet, by and large, computers still only do what they are told to do, and they are pretty useless without a program that instructs them, step by step, how to perform a task. Luckily for the human race, intelligence within these machines is still rudimentary. There is little chance that microchips will rule the world.

Instead, the past four decades have seen the use of IT explode in ways few people could have predicted. Computers do not have much intelligence on their own, but they have supported billions of business decisions by providing users with business intelligence data.

IT has enabled businesses to connect with their customers, suppliers and business partners. Call centres have given the public a way to keep in contact with a business or government department 24 hours a day, seven days a week.

Sales orders, financial transactions and communications are sent in microseconds around the world. Customers can order products directly over the internet and the supplier's warehouse automatically dispatches the goods.

PC maker Dell came from nowhere to become the largest PC manufacturer in the world by going direct to the customer and letting buyers customise the PCs they purchase. Amazon.com changed the way we buy books, CDs and DVDs.

The electronics industry has been revolutionised as researchers push the boundaries of integrated circuit design, allowing the creation of faster, smaller, more complex chips. And this has driven the adoption of IT into everyday lives. Even a humble toaster has a microprocessor with a program for the defrost, toast and cancel controls.

Who could have imagined we would all be carrying mobile phones with built-in video cameras, capable of sending and receiving e-mail and accessing the internet? Not too long ago this was real James Bond stuff.

When Sony introduced the Walkman in 1979 it changed the way people listened to music - 60 minutes of music in a portable tape player. Today, thanks to breakthroughs in computer memory and hard discs, MP3 players can hold thousands of hours of music on a single memory stick. A computer that used to be the size of a room now comfortably fits on a wristwatch.

Today there are two main challenges. The first is finding a use for the massive amount of computer power at our fingertips. After all, computers are still pretty dumb on the evolutionary scale of artificial intelligence.

Even if an application can be found, the second problem is the management and technical difficulties that must be overcome in rolling out any IT project.

It is not that hard to demonstrate a proof of concept. But can the application scale, and will end-users be prepared to run it? Will the technical staff and the business consultants be sufficiently skilled to implement and deploy the system?

Can the software and hardware be rolled out cost-effectively? Is licensing prohibitive? Are the IT architecture and coding resilient enough to enable all end-users to run the application, without risking crashes, hacking attacks or the need for constant patching?

One of the themes of Computer Weekly during our first 40 years has been the need for strong project management. It can be the difference between success and failure. As the IT industry continues to mature, project management will move further up the agenda.

If we can get this right, maybe one day intelligent IT systems will improve the quality of our home and work lives, computers will be self-managing, and downtime will be a thing of the past.

Read article: IT Greats poll winners

Read article: IT's trip from the sixties

Read article: Government IT - Where did our £25bn go?

Read article: Past and future trends


This was last published in October 2006
