Scientific American has an interesting article "How Would the U.S. Respond to a Nightmare Cyber Attack?" based on a recent crisis exercise.
It's a good question, and the exercise itself is long overdue. I've been concerned about such an exposure for the past two decades. At the turn of the century I predicted that an electronic Pearl Harbor would probably not happen until 2006, by which time several trends (in threats, vulnerabilities and dependencies) would simultaneously peak or mature, producing a step change in the risk profile. I called this the "critical convergence period". My forecast was not based on uninformed speculation, but on analysis of technology road-mapping exercises carried out by subject matter experts.
As with many well-researched forecasts, I got the outcome right but the timing wrong. It has taken more than a decade for the world to wake up to the problem. At the time of my forecast ZDNet criticised me for being a "doomsayer". Some CISOs suggested I had it wrong, and that the future threat landscape was more likely to be one full of numerous minor incidents rather than any big disasters. Six years later Professor Fred Piper drew attention to this debate by posing the question "Who is right?" Judging by current national risk assessments, it seems that I was, though too far ahead with my timing.
We are still in the very early stages of educating managers about how to respond to a major crisis. The Scientific American article reveals a number of weaknesses, though these are to be expected, as strategic crisis management capabilities are weak across the board. For example:
- "The teams had to come up with a response within hours. The pressure was intense." (In a real situation they'd have less time to decide a response.)
- "As the situation grew more serious, the consensus for diplomatic engagement dissolved." (This suggests a fundamental weakness in strategy. Thank goodness we don't manage nuclear confrontations that way.)
- "The 19 groups suddenly diverged considerably about what the proper response should be." (Challenge and diversity of options are both very good, as long as a consensus can be quickly reached through logical debate.)
Why is crisis management so weak? The answer is simple: it requires a number of skills and traits that are in short supply. Nor does it help that few individuals understand how to do it, and even fewer take the trouble to teach others.
My book "Managing the Human Factor in Information Security" has a good chapter on the subject, explaining many of the nuances of the art of strategic crisis management that are unlikely to be developed by self-taught CERT teams.
In practice, each crisis team has to develop its own unique blend of skills and chemistry. These need to include: the capability to manage large volumes of incoming information; the skill to piece together what's really happening from incomplete reports; the discipline to focus on the immediate problems at hand; the imagination to develop a compelling strategy to guide the broader response; the ability to communicate effectively with external stakeholders; and the ambition to leverage the full resources available across the organization. Most importantly, team members need good listening skills, which are always in short supply, especially in a fast-moving crisis team environment.