March 2010 Archives

Improving data quality


For most of my career I've been concerned about the poor quality of our databases. It's been bad in just about every organisation I've encountered, and it's likely to get progressively worse with increasing centralisation and recycling of data. Joseph Juran, the famous quality expert, estimated the cost of poor-quality data at around 20-40% of sales turnover. In the public sector, the consequences of bad data can be highly damaging to the individuals affected. Standards would help, as would better discipline and tools at the point of capture.

Recently I was talking to George Barron and Ross Miller of Unified Software, who operate an Internet-based service called BankVal for verifying bank account information at the point of entry. They reckon that typically 8% or more of non-verified bank account details are incorrect, and that the average repair cost is £35 per record. The check itself costs pennies, so there are clear savings to be made from checks like this, not to mention the added security benefits.
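The arithmetic is simple enough to sketch. Here's a minimal illustration using the figures quoted above; the per-check cost of a few pence and the batch size are my own assumptions, purely for illustration:

```python
# Rough break-even sketch for point-of-entry bank account validation.
# The 8% error rate and £35 repair cost come from the figures quoted above;
# the per-check cost of £0.05 and the batch size are assumed for illustration.
records = 10_000          # batch of captured bank account details
error_rate = 0.08         # proportion of unvalidated records that are wrong
repair_cost = 35.00       # average cost to repair one bad record (£)
check_cost = 0.05         # assumed cost of one online validation check (£)

expected_repair_bill = records * error_rate * repair_cost   # no validation
validation_bill = records * check_cost                      # validate everything

print(f"Expected repair bill without checks: £{expected_repair_bill:,.0f}")
print(f"Cost of checking every record:       £{validation_bill:,.0f}")
print(f"Approximate saving:                  £{expected_repair_bill - validation_bill:,.0f}")
```

On those assumptions a batch of 10,000 records costs around £500 to check, against an expected £28,000 repair bill if left unchecked.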

George tells me that, in general, 96% accuracy is about as good as can be expected from data capture, but under certain conditions, because of operator tiredness or interruptions for example, accuracy can be far lower. A further problem is that banking data changes surprisingly quickly. The UK banking database, for example, comprises around 20,000 records, and in any given month around 300-400 of these change (a rate of 1.5-2% per month). An unmaintained banking database can therefore become seriously inaccurate within a short space of time.
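To see how quickly a static copy drifts, here's a minimal sketch assuming the quoted change rate of roughly 1.5-2% per month and treating changes to individual records as independent:

```python
# How quickly an unmaintained copy of the banking database goes stale,
# assuming a constant monthly change rate and independent record changes.
monthly_change_rate = 0.0175   # midpoint of the quoted 1.5-2% per month

for months in (1, 3, 6, 12):
    # Probability that a given record is still correct after this many months.
    still_correct = (1 - monthly_change_rate) ** months
    print(f"After {months:2d} months: ~{still_correct:.1%} of records still accurate")
```

On that basis, roughly one record in five is wrong after a year without maintenance.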

We need to encourage greater attention to, and priority for, the issue of poor data quality. If the data management community cannot push the subject higher up the management agenda, then perhaps it's time for security managers to add their weight to the issue.

Head in the Clouds Computing


The Centre for the Protection of National Infrastructure (CPNI) has just published a briefing note on Cloud Computing, compiled by Deloitte. It's a useful snapshot of the latest fashions and jargon on this fast-moving subject. Unfortunately, it fails to connect with reality when it comes to the security recommendations.

None of the key recommendations strike me as being practical. They include measures such as customer-managed encryption, tough terms and conditions, consideration of all legal implications for every location involved, and heavyweight due diligence and auditing.

The whole point of Cloud Computing is that it's primarily a low-cost, commodity offering based on take-it-or-leave-it services from unspecified locations.

It would grind to a halt if every customer demanded different terms and conditions, attempted to encrypt their data and exercised audit rights. Risk comes with the territory. 

What's needed is coordination of customer security requirements and tougher, independently assessed security standards. As for encryption, I can't envisage how any service provider can adequately support a Software-as-a-Service application without access to the data.   

Defence in depth

Two things drew my attention last week to the importance of defence in depth. One was a discussion about the economics of security, and the importance of ensuring that business cases take account of the need for additional layers of security from the outset. The other was a feature in Wired magazine last year, brought to my attention by Team Cymru's excellent information service.

Defence in depth is a long-standing principle of security. It compensates for the inescapable fact that all countermeasures fail from time to time, for a variety of reasons: human failings, technical glitches, insider threats, or simply a lack of resistance to certain forms of attack. A similar model, the 'Swiss cheese' approach, is used in the safety field, which has always accepted that mistakes are an inevitable fact of life. The difference with security is that there are far more deliberate, determined threats seeking to subvert countermeasures.
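The arithmetic behind the principle is straightforward: if layers fail independently, the chance of every layer failing at once shrinks rapidly. A minimal sketch, using illustrative effectiveness figures of my own rather than anything from the article:

```python
# Defence in depth: probability that an attack defeats every layer,
# assuming layers fail independently. Effectiveness figures are illustrative.
layer_effectiveness = [0.90, 0.80, 0.70]   # chance each layer stops or detects an attack

p_all_fail = 1.0
for p in layer_effectiveness:
    p_all_fail *= (1 - p)                  # the attack must slip past this layer too

print(f"Chance a single 90% layer is defeated: {1 - 0.90:.0%}")
print(f"Chance all three layers are defeated:  {p_all_fail:.1%}")
```

The catch is that the calculation assumes the layers fail independently, an assumption that a knowledgeable insider can systematically undermine.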
 
Defence in depth can reduce the overall cost of security, as well as compensate for known deficiencies in countermeasures. The latter use is especially important in physical intrusion detection systems, as monitoring technologies have known weaknesses. That explains why bank vaults require so many layers of security, and also why these measures can occasionally be defeated by an exceptionally sophisticated attacker with inside knowledge. This last point was underlined by the article in Wired magazine, based on an interview with Leonardo Notarbartolo, sentenced to 10 years for leading the five-man gang that, in 2003, broke into a vault beneath the Antwerp Diamond Centre and made off with $100 million worth of diamonds, gold, jewellery and other spoils.

The vault was thought to be impenetrable, protected behind 10 layers of security, including infrared heat detectors, Doppler radar, a magnetic field, a seismic sensor, and a lock with 100 million possible combinations. Clearly, this was a flawed assumption. No technical or human countermeasures are completely foolproof. Each can be circumvented, given enough knowledge, time and determination. That's why it's not a sensible strategy to leave a vault containing such valuable contents unattended over a weekend. Nor is it sensible to rely on a single layer of protective security for sensitive data in a networked infrastructure. 

Encouraging SMEs to address security

I've been busy over the last week finalising some interesting research work for the Information Commissioner's Office on security advice for SMEs. It contains some groundbreaking recommendations. Hopefully it will help to deliver the long-overdue boost we desperately need to persuade SMEs to address security. The main problem is that they don't really want to know: security is a 'grudge purchase'. But it certainly helps to assemble some suitable, complete and up-to-date advice, and to erect signposts where SMEs are likely to look.
 
I presented some of the findings of this research at last week's ISSA UK meeting in London, and was taken aback by how well the ideas were received. Amongst other things, it underlines three key realities. Firstly, SMEs are important. Secondly, supply chains matter. And thirdly, a different approach is needed. Shoehorning ISO 27000 standards into an SME environment is certainly not the answer. Anyone interested in catching my presentation on this subject should look out for it at next month's meeting of Martin Smith's excellent Security Awareness Special Interest Group in London. 

Why we are vulnerable to cyber attacks

The news today has several reports of a recent surge in cyber attacks originating in China. The Times quotes US analysts as saying that the West had no effective response and that EU systems were especially vulnerable because most cyber security efforts were left to member states. US official reports indicate that attacks on Congress and other government agencies have risen exponentially in the past year to an estimated 1.6 billion every month. 

It's no surprise. Security professionals and government authorities have been fully aware of the risk for decades. The root cause is a widespread failure to implement effective governance, monitoring and education processes. Fifteen years after the publication of BS 7799, most enterprises have yet to implement it effectively. Too many organisations have been bogged down in policy, risk analysis and paperwork rather than implementation, awareness and auditing. Given current progress, it's likely to take another decade to overcome this failing.

Cloud Security Challenge

I encounter many innovative start-up companies that exist solely on grants and awards. Without this support we simply wouldn't have the range of products we desperately need to solve emerging security problems. So I'm pleased to see that the Global Security Challenge, together with HP Labs, has launched the first cloud competition to discover innovative new solutions that will help protect governments and enterprises as they adopt the Cloud.
The award includes a $10,000 cash grant sponsored by HP's central advanced research group, HP Labs, as well as mentorship by an executive from Capgemini. Up to three finalists will also be invited to test their technology in an HP Labs cloud test-bed. The Challenge is supported by the Cloud Security Alliance. Entry is free and the deadline for entries is Monday 15 March 2010.

What's different about Cloud Computing

My blog posting yesterday, criticising the Cloud Security Alliance's paper on Top Threats to Cloud Computing, prompted a few comments and discussions about whether the risks are actually any different from those of other forms of in-house or outsourced computing. Here's my take.

Cloud computing is a rich subject, with many variants of service delivery and service usage. The risks vary considerably, but one thing is guaranteed: you lose visibility and control of what's happening to your data. 

From a threat perspective, the only difference is that a large collection of data will attract attacks that an individual organisation might not. From a vulnerability perspective, the main difference from conventional outsourcing is that you're buying a standard service, so you can't expect the same scope for due diligence inspections, negotiation of terms and personalisation of security.

For more on this, look out for my forthcoming book "Managing Security in Outsourced and Off-shored Environments: How to safeguard intellectual assets in a virtual business world" expected to be published by BSI in May. The book has a most attractive cover, featuring a clown fish in a sea anemone. The clown fish is one of very few species of fish that can avoid the potent poison of a sea anemone, so it's an appropriate analogy.

Top Threats to Cloud Computing?

When is a threat not a threat? The answer is when it's selected by someone who does not understand the correct terminology. 

In fact this happens a lot when you ask ordinary business managers to name their top risks. Instead of a list of risks, you often get a bunch of issues, problems or subject areas: things like 'compliance' or 'privacy'. But a risk is an event, not a subject area; something to which we can assign a probability of occurrence within a specific period.
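One way to keep the distinction honest is to force every entry on the list into an event with a likelihood and an impact, for example using the standard annualised loss expectancy calculation. A minimal sketch with made-up figures, not anything drawn from the CSA paper:

```python
# A risk is an event with a likelihood and an impact, not a subject area.
# Annualised loss expectancy (ALE) = annual rate of occurrence (ARO) x
# single loss expectancy (SLE). The figures below are invented for illustration.
risks = [
    # (event,                                        ARO per year, SLE in £)
    ("Customer database exposed via stolen laptop",       0.5,      200_000),
    ("Payment records corrupted by a faulty data feed",   2.0,       15_000),
]

for event, aro, sle in risks:
    ale = aro * sle
    print(f"{event}: ALE ~ £{ale:,.0f} per year")

# By contrast, 'compliance' or 'privacy' can't be given a probability or an
# impact, which is the giveaway that they are subject areas, not risks.
```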

You don't expect to see this type of sloppy analysis from a collection of leading security experts, especially one that is aiming to teach the rest of us how to go about security. So I was surprised to find that 'Top Threats to Cloud Computing', just published by the Cloud Security Alliance, contains little about specific threats but plenty of waffle about general IT security problem areas.

Some of the 'threats' are actually vulnerabilities, such as 'Insecure Application Programming Interfaces' or 'Shared Technology Vulnerabilities'. One of them, 'Unknown Risk Profile', is not a risk at all but the absence of a risk assessment. The rest are too general to be of any use, such as 'Malicious Insiders', 'Data Loss or Leakage' and 'Abuse and Nefarious Use of Cloud Computing'.

This paper can be largely summed up in one sentence: "Cloud Computing presents the same risks of fraud and data breaches as any large, outsourced critical business service. You need to follow good security practices." Unfortunately, such concise wisdom would not come across as a major advance in the state of the art.

