It’s good that organisations are now coming clean about breaches of customer credit card data, though it’s worrying that there are so many of these incidents. Yesterday Newcastle City Council announced an “inappropriate release” of up to 54,000 credit and debit card details covering transactions earlier this year. The Council became aware of the breach when it hired an external security specialist to test its defences. The testers discovered that a file had wrongly been placed on an insecure server and had been uploaded to an address registered outside the UK. Fortunately, some of the most sensitive data items (the credit card numbers) were encrypted. But this breach should never have happened.
On the surface, human error might appear to have triggered this breach. But the root cause of such incidents always lies deeper. Organisations should take special care when designing and implementing systems that process sensitive customer data, especially those that connect to the Internet. Controls should be designed to take account of human failings. People make mistakes from time to time, so compensating measures are needed to prevent control lapses from turning into breaches. Crossed fingers are not good enough. Regular penetration testing is essential, but it is not sufficient on its own to monitor security. Such tests need to be backed up with real-time vulnerability scanning.
As the safety community has long understood, behind every major incident there are likely to be, on average, around thirty minor incidents and three hundred near misses, and perhaps many more bad practices besides.
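As a back-of-the-envelope illustration of that ratio, the sketch below scales the approximate 1 : 30 : 300 figures cited above; the function name and the idea of applying the ratio this way are my own assumptions, and the numbers are rough averages, not measured data:

```python
# Illustrative sketch of the safety-triangle ratio described above
# (1 major incident : ~30 minor incidents : ~300 near misses).
# The ratio values are the approximate figures from the text, not exact data.

RATIO = {"minor_incidents": 30, "near_misses": 300}

def expected_precursors(major_incidents: int) -> dict:
    """Estimate the precursor event counts implied by the safety triangle."""
    return {kind: major_incidents * factor for kind, factor in RATIO.items()}

# A single major breach suggests a much larger pool of smaller lapses:
print(expected_precursors(1))
# → {'minor_incidents': 30, 'near_misses': 300}
```

The point of the ratio is not precision but proportion: each visible breach sits on top of a far larger base of unreported lapses, which is exactly what continuous monitoring is meant to surface.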