Throughout 40 years of involvement with computing I have been made regularly aware that it is usually the glamorous, exciting and ambitious projects that fail and the dull, boring ones that succeed. Most people in the business know this. Yet the world has refused to acknowledge it and has insisted on giving computing a value and status it has been happy to accept but has done little to deserve.
It all culminated in the 1990s. It was not enough that computing and telecommunications technology had become an important element in the global economy. That was true. But a common assertion was that it was the overriding and decisive factor, changing everything. That was hopelessly false. Remember those exciting concepts - new paradigms, the weightless world, the new economy, the information revolution, e-everything - common only a few months ago but already embarrassingly limp and dated?
There have been scandals galore but arguably the greatest was the year 2000 problem, otherwise known as Y2K or the millennium bug. It was hugely expensive. One government estimate put the cost for the UK alone at about £25bn. That is about 30 times more than the much-derided Millennium Dome. Yet Y2K is already largely forgotten. A pity - probably more than any other example it illuminates the reasons why IT has failed to live up to its own claims and to the perception that so many had of it.
A common view is that the whole thing was a scare story.
Not so: Y2K was a real problem and solving it a vast and tedious job. And, because of the dedicated, unglamorous and costly work done by thousands of middle-ranking and junior IT people, the job was largely successful. Had it not been, governments and international business would have been damaged, possibly severely - an open goal to the enemies of the West. We should be grateful.
No, the scandal was that it happened at all.
The Y2K problem was wholly unnecessary - the result of an absurd failure of professionalism and discipline. How could an industry that took itself so seriously overlook the obvious fact that using two digits to represent the year risked huge difficulties at the end of the century? This foolishness put the developed world at wholly unnecessary risk.
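The defect itself is simple to demonstrate. Below is a minimal sketch, not drawn from any specific system: the function names and the "windowing" pivot of 50 are illustrative assumptions, but the arithmetic is the kind that broke when two-digit years rolled over from 99 to 00.

```python
# Naive age calculation with two-digit years, typical of pre-2000 code.
def years_elapsed_2digit(start_yy: int, end_yy: int) -> int:
    return end_yy - start_yy

# A person born in 1965, evaluated in 1999: works as expected.
assert years_elapsed_2digit(65, 99) == 34

# The same person evaluated in 2000 (year "00"): the result goes negative.
assert years_elapsed_2digit(65, 0) == -65

# One common remediation, "windowing": interpret two-digit years below a
# chosen pivot as 20xx and the rest as 19xx, then subtract full years.
def expand_year(yy: int, pivot: int = 50) -> int:
    return 2000 + yy if yy < pivot else 1900 + yy

def years_elapsed_windowed(start_yy: int, end_yy: int) -> int:
    return expand_year(end_yy) - expand_year(start_yy)

assert years_elapsed_windowed(65, 0) == 35
```

Windowing was only a patch, of course; the durable fix was storing four-digit years, which is why so much of the remediation work was tedious rather than clever.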
That alone is a serious indictment of the IT industry. But there are two other issues - and they may be even more damaging.
Why was the vast Y2K problem essentially solved, when major IT projects so often fail and almost none is completed on time?
And why, despite the fact that there were numerous failures, were the consequences not remotely as dire as many had feared? The answers say a lot about the industry.
The answer to the first is simple: in contrast to many IT projects, no one stood to benefit if this one were late - its deadline was truly non-negotiable, the job had to be done and management mostly understood why. Many other projects would succeed if these factors were present - the remarkable thing is that they so rarely are.
Nonetheless, there were many thousands of problems. A recently reported UK example was the date-change error that caused 158 pregnant women to be incorrectly assessed for Down's syndrome. Happily, most were less serious.
But why did all these problems not have devastating consequences? After all, it had been widely asserted that computing systems were such an essential and basic feature of our lives, with such complex and barely understood interconnections, that numerous relatively minor failures would have grim consequences - "death by a thousand cuts".
The failures occurred but not the consequences. Why not? Again the answer is embarrassingly simple: people do not find it so very difficult to cope with and work around computer failure. Interconnectivity just is not so pervasive or dangerous as we had been assured. Computers are not such an essential part of our lives.
The computing industry would benefit by learning from this and adopting a rather humbler role. It has a lot to be humble about - in particular because, far from establishing change on a par with the industrial revolution as has been claimed so often, the industry has failed even to demonstrate that the introduction of computers has done anything much to improve industrial productivity.
This so-called "productivity paradox" has been noted for years and is well summarised in a recent report by consultancy McKinsey which says, "Contrary to conventional wisdom, widespread application of IT was not the most important cause of the post-1995 productivity acceleration." And that was a US analysis - the position elsewhere is almost certainly worse.
It is a rum revolution that effects no radical change.
Of course, there is no dispute that computing has introduced important developments into the social, economic and political life of most people in the developed world. Although some have been more of a nuisance than a benefit, probably most people would agree that many have been for the better.
The problem is the absurdly grandiose claims made for them - especially, as noted, claims of dramatic social and economic revolution. The reality is rather different: it now seems most unlikely that the technological changes seen at the end of the 20th century will have anything like the impact of the technological changes seen at its beginning.
It is possible that events may alter that perception. In the 1450s, when Gutenberg introduced the use of movable type, it may not have seemed a particularly important development: no one foresaw how it would directly power the profound religious changes initiated by Martin Luther 70 years later. These launched a genuine revolution - arguably the most significant transfer of power and property in the past 500 years.
So it might be foolish to dismiss altogether the possibility that today's technological developments might one day trigger far-reaching change. But such an outcome grows ever less likely.
How quickly dreams fade.
Robin Guenier was executive director of independent Y2K lobby group Taskforce 2000