In search of perfection

Lately I’ve been proof-checking my book “Managing the Human Factor in Information Security”. This type of exercise is a real eye-opener for anyone who sets out to achieve 100% error-free operations. The simple fact is that we can’t avoid or eliminate mistakes, no matter how hard we try.

I’ve checked the text of the book over and over again. It’s also been reviewed by three colleagues, my publishers and a team of professional proof-readers. Yet you can still find errors if you look hard enough. It’s the same with software, which typically has 20 to 30 bugs for every 1,000 lines of code. Our information systems are far from perfect.

Mistakes are caused by many different human factors, including negligence, stress, lack of training and poor system design. Spotting errors is particularly hard because we generally see only what we expect to see, so exceptions go unnoticed.

We shouldn’t really blame individuals for causing accidental data breaches. In fact, it’s often the best performers who make the most mistakes. That’s because they work harder and faster, and are more empowered. Yet whenever a big data breach occurs, the tendency is to hang the poor person who triggered it.

The safety field has long understood this problem. That’s why it relies on defence-in-depth, or the “Swiss cheese” model as practitioners prefer to call it. We need more compensating controls around our systems, and better checks and reminders within the systems themselves. Policy, training, supervision, system design and validation are weak areas in most organisations. That’s where we need to focus more effort.
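
To make the idea concrete, here’s a minimal sketch of the Swiss cheese model in Python. It’s my own illustration, not anything from the book: the check names, rules and addresses are all hypothetical. Each layer is imperfect on its own, but a mistake only gets through when the holes in every layer line up.

```python
"""A toy Swiss cheese model: imperfect, independent checks layered
around one risky action. All names and rules here are illustrative
assumptions, not a real product's API."""

import re


def policy_check(message: str, recipient: str) -> list[str]:
    """Layer 1: a crude policy rule -- no 16-digit numbers to outsiders."""
    issues = []
    if not recipient.endswith("@example.com") and re.search(r"\b\d{16}\b", message):
        issues.append("possible card number going to an external recipient")
    return issues


def confirmation_check(message: str, recipient: str) -> list[str]:
    """Layer 2: a reminder step for any external recipient."""
    if not recipient.endswith("@example.com"):
        return [f"external recipient {recipient}: confirm before sending"]
    return []


def send_with_layers(message: str, recipient: str) -> bool:
    """Run every layer in turn; any one can stop the mistake the others missed."""
    for layer in (policy_check, confirmation_check):
        issues = layer(message, recipient)
        if issues:
            print(f"blocked by {layer.__name__}: {issues}")
            return False
    print(f"sent to {recipient}")
    return True


if __name__ == "__main__":
    send_with_layers("Card: 4111111111111111", "someone@other.org")  # blocked
    send_with_layers("Minutes attached", "colleague@example.com")    # sent
```

The point isn’t the rules themselves, which are deliberately crude, but the structure: no single check is trusted to be perfect, so each one only has to catch what the others let slip.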
