In 1981 I was in charge of computing in BP Chemicals and was invited to give the keynote speech to an audience of computer managers at Univac's International Executive Centre near Nice. The title was to be a rather ponderous "Data Processing through the 1980s in a multiple site process manufacturing company".
In that long-forgotten speech I made a number of predictions, and having only recently unearthed the script I have been tempted to see to what extent, if at all, those predictions have come to pass in the intervening 28 years. First, however, I must explain, very briefly, how things were at that time, since those predictions had to be made from the status quo ante.
The portfolio of applications run on mainframe computers in 1981 had mostly been developed during the previous 10-15 years, and many were still batch-based. They had little interconnection with other applications and, for some, their data entry was still by means of punched cards encoded from specially designed input documents.
Essentially, they had their origins in the basic design philosophy which gave rise to the old punched card systems; had been developed piecemeal; and had links between them added only when the need was identified. They reflected the fragmented growth of Data Processing in those early years, whereas our process manufacturing companies operate processes which by their very nature are integrated.
The challenge we faced was to integrate (ie interconnect) our computer systems so they could provide quicker and better information for those directing our company's affairs. And because managers' reactions have repercussions elsewhere in the company's processes, this input and the changes consequential upon it had to be handled by our computers without delay.
I foresaw the implication of those demands as requiring the widespread use of database technology (still in its infancy in 1981) with access to it freely available via VDUs having effective software for data retrieval, manipulation, and report generation purposes. All this with sufficient computing capacity that access could be gained at any time without having to wait.
In my speech I proposed the following to meet the above requirements:-
(a) An 'outer circle' of mini computers being used at outstations for dedicated functions, and using software packages developed specially for those particular functions.
(b) Computers located in the major (regional) sites being used specifically for site purposes and using the standard package approach to applications.
(c) A corporate mainframe computer holding corporate data at fairly elemental level in a structured database, supported by good data retrieval/report generator software.
All three layers were to be connected by a communications network. The blueprint I postulated showed the coalescence of computing and data communications, functions which in 1981 were usually organisationally separate - as was office automation. Sooner or later, I argued, these functions would inevitably be brought into one grouping, which would probably be called information processing rather than data processing.
At my present age of 80, I have been away from the computing world for many years and do not pretend to be au fait with current technology or trends. I notice, though, that we now hear and read about IT, which embraces the different functions I mention above. And where I predicted the increased use of VDUs and mini computers, there now seems to be a PC on every office desk. The considerable advance in Data Processing since 1981 is, to me, most noticeable when I visit the bank and my transaction is immediately reflected on the print-out of my account entries I sometimes request. I marvel at the speed of communication, and at the efficiency and versatility of the software embedded in a database in some distant centre.
Modern computing has progressed considerably over the past 28 years, but I like to think that I may have been more prescient than I ever realised when I gave that keynote speech in 1981.