In the early 20th century IBM's punched card machines became early examples of automated information processing in many large companies. Then the major technology leap of the thermionic valve enabled the code-breaking machines of the Bletchley Park establishment. LEO, the Lyons Electronic Office, became the first example of commercial use of this valve technology shortly after the Second World War.
By now the term computer was becoming popular. Then the transistor was invented and computers became smaller, cooler and more reliable. As the pace of change accelerated, the focus switched to enhancing input/output delivery mechanisms, while the cost of processing continued to fall and its reliability to rise with the advent of mass-produced silicon chips.
Excuse the history lesson, but it shows that however far computers have come, the exponential growth of computing power has not been matched by our ability to utilise it in a cost-effective manner.
Historically, technology vendors have always grossly oversold the advantages of evolving technology. For example, as packaged software became more widely available in the late 1970s and early 1980s there were many claims that costs could be reduced radically by businesses changing their underlying processes to conform to the package's constraints. The label business process re-engineering (BPR) became very popular as a result.
Of course BPR rapidly became synonymous with redundancy, as the underlying reality was that cost cutting was essential during the Thatcher years of increased competition and BPR was a suitably anodyne label. Did those software packages actually result in cost savings through the standardisation of business processes? In my view they did not. I well remember the costly bespoke code that was required to integrate packages into the existing infrastructure.
History teaches us to beware of inflated claims that IT is effective in reducing costs. How many times do we read of projects going over budget or being delayed? This is because business expectations are often so over-optimistic that you have to question the professionalism of those who created them.
The entire dotcom boom was based on a naive belief that technology could now do anything we wanted it to do at almost zero cost - and if you were not part of that bandwagon then your business would cease to exist. Why were we all taken in? Y2K is another example of technology vendors using the fear factor to leverage sales of equipment and software to totally unnecessary levels.
Of course, we are now living with the effects of the dotcom bust and the Y2K feeding frenzy: the World Trade Organisation recently said the consequent slump in demand for IT was the main cause of the downturn in the global economy last year. Business investment in IT is now at an all-time low as companies attempt to extract some benefit from their recent profligacy while profit margins are squeezed remorselessly.
Any analysis of past events is only really useful if it gives us pointers for the way ahead and indicates clearly the role of IT in business. The individual running the IT function has to walk a fine line between protecting the business from the vapourware excesses of vendor sales teams and extracting maximum value from realistic technology investments.
The current trend towards ever more meaningless titles for the person running IT (what on earth is a chief information officer?) tends to obscure what that individual is actually employed to achieve. Despite the dotcom debacle there seems to be an ever-increasing belief that businesses must adopt Internet technologies even where cheaper, more effective and more reliable alternatives are already available. The habit of jumping at the first bit of new jargon and selling it within the business as a core delivery mechanism will end in tears. Yet, despite the many examples of the recent past, that appears to be precisely what some heads of IT are determined to do.
Unfortunately this lemming approach impacts on all of us. As always we, the silent majority of perfectly capable and experienced IT managers, will have our reputations tarnished. We provide robust, reliable and cost-effective solutions to our businesses, and we carefully evaluate evolving technology and wait for it to stabilise before we use it. Unfortunately the minority who are seduced by the siren calls of the vapourware vendors are the ones who make the high-profile mistakes that result in the entire profession being regarded as a group of tech heads who couldn't manage a piss-up in a brewery.
Obviously there are situations where the latest technology is relevant, particularly in the provision of Internet distribution channels. There are, however, other areas where the risks are far higher and probably not worth taking except as a controlled experiment away from the core business. The particular example that comes to mind is the area of Web services and the whole .NET experiment. The theory sounds great, but practical and cost-effective applications are in the minority. Would you bet the core business of a major corporation on this technology at this stage of its development?
Following the advice of the Corleone family in The Godfather, I think it is time "to go to the mattresses". We are at war. Our professional status is being attacked by people who flit from job to job at the first hint of a problem. You all know the type: they talk a good talk but have never really achieved anything of major complexity or sophistication. Unfortunately they can employ meaningless jargon to impress unsuspecting boards, and then create chaos, moving on before they are found out.
Our job is to deliver to our organisations the most cost-effective yet imaginative use of IT which is, above all else, reliable. The only way of doing that is to use stable technology and not to be seduced by sales people who must sell the latest release to maintain their business's revenue targets and their commission.
To the barricades - we are under attack!