The blame game has begun and will reverberate until the lawyers are satiated. But who is responsible for users trusting systems they do not understand? “Any financial instrument that takes more than ten minutes to understand is no more than an attempt to defraud the unwary”. My first chief programmer, in 1969, was even ruder about software packages:
“If they cannot explain in basic Anglo-Saxon what it does and how it does it, tell them in even more basic Anglo-Saxon what they can do with it.”
I was taught the domino theory of systems. “Keep it simple, stupid”. Separate code from data. Compartmentalise complexity. Put cut-outs between the modules so that “failure” (including unanticipated extremes or confluences) triggers informed human intervention and/or graceful degradation.
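The “cut-out” idea above can be sketched in code. This is a minimal, hypothetical illustration (the class and names are mine, not from any particular system): a wrapper between two modules that counts failures and, past a threshold, trips — falling back to a safe degraded default and flagging the condition for human attention rather than letting the failure cascade.

```python
class CutOut:
    """A sketch of a cut-out between modules: after repeated failures it
    trips, switching to a graceful-degradation fallback and flagging the
    condition for informed human intervention."""

    def __init__(self, call, fallback, max_failures=3):
        self.call = call              # the downstream module's entry point
        self.fallback = fallback      # safe degraded behaviour
        self.max_failures = max_failures
        self.failures = 0
        self.tripped = False

    def __call__(self, *args, **kwargs):
        if self.tripped:
            return self.fallback(*args, **kwargs)
        try:
            result = self.call(*args, **kwargs)
            self.failures = 0         # a healthy call resets the counter
            return result
        except Exception:
            self.failures += 1
            if self.failures >= self.max_failures:
                self.tripped = True   # flag for human attention
                print("cut-out tripped: degrading gracefully")
            return self.fallback(*args, **kwargs)
```

The point is not the few lines of code but the design choice: the boundary between modules is where failure is noticed, contained, and reported, instead of propagating silently.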
A couple of days ago Guido Fawkes quoted Warren Buffett on the toxicity of the assumptions underlying the concepts behind derivatives (in his entry “The greatest capitalist versus the Geeks of Capitalism”).
The coming slump is likely to see a similar reaction against over-complex systems at every level: from financial services, through benefits and welfare, to safety-critical and national infrastructure.
This will, of course, be resisted most strongly by those whose business models and power bases are built on “big is beautiful” and “trust us, we may not know what our geeks are doing, but our consultants tell us they do”.
I once spent a year in the ICT management sciences unit as the senior public sector financial modelling consultant. I was a tiro among truly world-class experts, but I already understood the power of mixing large-scale data mining and complex analyses to help make well-informed decisions. I certainly learned the danger of relying on statistical correlations for which we could find no causal explanation, let alone relying on complex systems whose behaviour could not be predicted.
That experience also confirmed what I was taught at London Business School by Andrew Ehrenberg on the importance of data reduction: “if it looks different, it’s more likely to be a data error than something significant – check the data first.”
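Ehrenberg’s “check the data first” can be made mechanical. Below is a minimal sketch (the function name and threshold are my own, purely illustrative): before treating an unusual value as a finding, flag anything far outside the bulk of the data as a candidate data error for manual inspection, using the median absolute deviation so the check is not itself distorted by the suspect values.

```python
def flag_suspect(values, k=4.0):
    """Return indices of values more than k median-absolute-deviations
    from the median -- candidates for data-entry errors, not insight."""
    s = sorted(values)
    n = len(s)
    median = (s[n // 2] + s[(n - 1) // 2]) / 2
    deviations = sorted(abs(v - median) for v in values)
    mad = (deviations[n // 2] + deviations[(n - 1) // 2]) / 2 or 1.0
    return [i for i, v in enumerate(values) if abs(v - median) > k * mad]
```

Run on a series of readings around 10 with one entry of 1000, only the 1000 is flagged — exactly the value a careless analysis would hail as “something significant”.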
I used those experiences in my spells outside the IT industry as a corporate planner and as an investment appraisal and “risk reduction” advisor. I am using them again today to try to predict what is likely to happen, to whom, over the next few years.
One of the messages from the Great Depression is that the technologies that had been over-hyped in the 1920s really did change the world in the 1930s and 1940s – but they did so incrementally and the share prices of the survivors did not recover until the 1950s.