"Computing is an industry overwhelmed by its own success," said Thomas, who founded the systems engineering company Praxis in 1983 and is now an independent consultant.
"Each generation of software developers gets misled by the rapid obsolescence of individual component technologies and assumes that there is nothing fundamental and lasting that they should study from the past.
"They ignore the experience of their predecessors until someone rediscovers ideas that were well known, assumes they have made a breakthrough and announces an imperfect variant of an old idea under a new name.
"In behaving like this, the software industry acts as if it were a fashion business, like hairdressing, crossed with a new-age mysticism and backed by the sales patter of a snake oil salesman.
"In fact, when innovation is at its most rapid, an industry has the seemingly paradoxical need to make the greatest use of lessons from the past.
"Designing and building computer systems is science-based engineering, in the same way that designing and building automobiles or jet engines or power stations is engineering."
Some basic lessons from the past and from engineering were being ignored, Thomas said. "Engineers communicate through documents and must work from the correct versions to maintain control," he said. "But despite this universal engineering practice, most software developers work from, and distribute, documents that have no change history or version number.
"One of the most important - and most frequently overlooked - software design principles is that the design should be decomposed into modules." This decomposition limits the complexity that any single part of the program can accumulate.
Thomas pointed out that structured modular design dated back to the 1970s, and he said elements of this approach underpinned the principles of good object-oriented design.
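The decomposition Thomas describes can be sketched in a few lines. This is an illustrative example only, not code from the article: each function below hides one design decision behind a narrow interface, so a change to the input format, the business rule, or the report layout stays local to a single module. The function names and the discount scenario are invented for the illustration.

```python
# Illustrative modular decomposition: each function hides one
# design decision behind a narrow interface.

def parse_record(line: str) -> dict:
    """Parsing module: knows the input format and nothing else."""
    name, amount = line.split(",")
    return {"name": name.strip(), "amount": int(amount)}

def apply_discount(record: dict, rate: float) -> dict:
    """Business-rule module: knows the discount policy, not the format."""
    discounted = dict(record)
    discounted["amount"] = round(record["amount"] * (1 - rate))
    return discounted

def format_report(records: list) -> str:
    """Presentation module: knows only the output layout."""
    return "\n".join(f"{r['name']}: {r['amount']}" for r in records)

# Each module can change independently: a new input format touches
# parse_record alone; a new discount rule touches apply_discount alone.
lines = ["alice, 100", "bob, 250"]
report = format_report([apply_discount(parse_record(l), 0.1) for l in lines])
print(report)
```

The same principle underlies object-oriented designs, where each class rather than each function owns one design decision.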
Thomas said system complexity was almost all implemented in software, which had mixed benefits. "This reduces the complexity of the hardware, often to the point where commodity components can be used - at the price of increased complexity in the software."
Complexity led to design problems and greater risk of error, Thomas said, and the problem lay partly in complex requirements.
"Requirements usually can be simplified," he said. "They are often complex because they contain unnecessary implementation detail or elaboration of many special cases that could be better expressed as a general principle."
The ability to match precise lines of code with individual requirements made maintenance easier, Thomas said. "Without such traceability it is difficult, expensive and error-prone to implement changes to the requirements or to modify the software design."
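One lightweight way to record the traceability Thomas describes is to tag each piece of code with the requirement it implements, so that when a requirement changes the affected code can be found mechanically. The requirement IDs and functions below are invented for illustration; real projects typically hold this mapping in a requirements-management tool rather than in source.

```python
# Hypothetical traceability table: requirement IDs (invented here)
# mapped to the names of the functions that implement them.
TRACE = {}

def implements(req_id: str):
    """Decorator recording which requirement a function satisfies."""
    def wrap(fn):
        TRACE.setdefault(req_id, []).append(fn.__name__)
        return fn
    return wrap

@implements("REQ-017")  # "Amounts shall be rounded to whole units"
def round_amount(x: float) -> int:
    return round(x)

@implements("REQ-022")  # "Customers shall be listed alphabetically"
def sort_customers(names: list) -> list:
    return sorted(names)

# When REQ-017 changes, the table shows exactly which code to revisit.
print(TRACE["REQ-017"])  # ['round_amount']
```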
Thomas underlined the importance of eliminating software errors as early as possible. "Much of the cost of software derives from the detection and removal of faults that were introduced earlier in the development."
He questioned standards that set targets for failure probabilities and suggested a new focus on guaranteeing certain specific properties.