It's the instrumentation, stupid

I prefer to avoid clichés, but this snowclone heading seems to best capture the missing dimension in the current debate on cyber defence. Judging by the latest tome from Chatham House, we can expect years of academic posturing on the principles, strategy and ethics of cyber war, while no one seems much interested in addressing the underlying problem of badly designed instrumentation.

This week I saw Barnaby Jack of IOActive in London demonstrating how to hack an ATM. It follows an unrelated report by a US analyst speculating that a recent spate of ATM failures might have been the result of malware infection. At the same time, Symantec published an analysis indicating that the Stuxnet worm appears to have been designed to sabotage uranium enrichment equipment. On top of that, we've had claims that China hijacked a huge amount of Internet traffic earlier this year.

These vulnerabilities suggest that Die Hard 4.0 might be more realistic than we imagine, perhaps even understated in terms of the worst-case damage. The modern world is heavily dependent on process control and supervisory systems. Yet for decades we have been building critical infrastructure with insufficient concern for security. Legacy designs and code appear to be littered with vulnerabilities and practices that fall far short of best practice. We've even had one report of a hard-coded password in a SCADA chip.

Isn’t it time that Government mandated stricter standards, better architecture, training, code review and testing of systems supporting critical national infrastructure?