A sense of crisis hung over RSA 2008 in San Francisco last week.
Increasing network complexity, identity authentication problems, tougher regulation, weaker enforcement, a free-for-all in collecting and storing personal data, and the consumerisation of network-enabled devices combined to create a sense that things cannot continue as they have.
What Gladwell said was significant for any business or government law enforcement agency bewildered by what works in data leakage prevention, virtualisation, encryption, Web 2.0, secure coding and embedded code.
Everyone, it seems, is keen to capture and store more and more data about people. The cost of storage is dropping, making it increasingly feasible. But Gladwell showed that access to more data does not help experts reach the right conclusion. In fact, experts function better with less, not more. "[Expert] judgment is frugal [in its use of information]," he said.
The lesson was that more data is not the answer. However, thousands or millions of exposures to similar events or artefacts over years made it easy for experts to recognise and compare patterns and to pick the exceptions correctly. This should have profound implications for information system design as well as data collection and mining in law enforcement, health care and marketing, to name just some applications.
This may be why many speakers invoked the 80-20 rule. Adrian Seccombe, CISO at Eli Lilly, said the pharmaceutical maker has conducted an exhaustive data classification as part of a risk assessment exercise to see what data and transactions it should protect. "Only 15% of our data was too precious to expose to the internet under any circumstance," he said. "The rest of it we could pretty much put out there."
Ari Takanen, CTO of code checking software house Codenomicon, said systems such as his were growing in popularity because they caught 80% of the coding errors that could induce vulnerabilities in the final system. "They will never be perfect, but they do catch the basic errors and give experts time to find the more subtle errors," he said. "If nothing else, they raise the bar against the bad guys."
The need to produce secure code from the outset was an underlying theme of the conference. There was universal acceptance that the web has massively expanded the threats, with new network-enabled consumer devices reaching the market daily.
This complexity was forcing companies to reconsider what was crucial to their business. The resulting data was often at odds with legal and regulatory requirements, several speakers said. Even homeland security secretary Michael Chertoff acknowledged this. He said national security, especially of the critical national infrastructure, depended on a public-private partnership, with each playing their role.
Jake Olcott, director of the emerging threats, cyber security, science and technology subcommittee of Congress's Homeland Security committee, appealed for a dialogue between the IT sector and the committee. It was vital to get laws passed that were reasonable and practicable, he said, but many on Capitol Hill were ignorant of the issues and unaware that some laws had unintended consequences.
"Right now there are a lot of mixed messages on cybersecurity," he said. "Congress is hearing them all, but no one is in overall charge."RSA 2008 conference round-up >>