As the fallout from the striking down of the Safe Harbour transatlantic data-transfer scheme continues, maybe it is time to look at how much of an issue data sovereignty really is.
There is little point arguing the case for holding data in unsafe places. Data loss incidents can cause problems for the companies involved, as well as major issues for the individuals whose information has been lost – putting them at heightened risk of identity theft or financial losses.
Indeed, when the EU General Data Protection Regulation (GDPR) comes into force in 2018, companies will need to disclose details of a data breach within 72 hours of its discovery, which could have huge implications for a brand.
But will keeping all of an organisation’s data in one country avoid such issues?
Not in the slightest. The chances of a UK-based data facility being breached are no better or worse than those of one in Russia, for example.
Any reputable, well-run facility will be as good as any other on a global basis. Likewise, Joe and his dog running a server room from a garage in Moss Side, Manchester will be no better or worse than Vlad and his wolf running one out of Irkutsk, Siberia.
The main issue is in protecting data against accidental and malicious loss. Data loss prevention (DLP) and digital rights management (DRM) technologies can help, along with those capable of encrypting both data on the move and at rest.
A level of intruder detection and prevention, through traffic pattern matching and individual user actions, can also help throttle or prevent malicious attacks.
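The throttling described above can be sketched as a simple sliding-window rate limiter. This is only an illustration of the idea, not how any particular product works – the `RequestThrottle` name and the thresholds are invented, and real intrusion prevention systems match far richer traffic patterns than a per-user request count.

```python
import time
from collections import defaultdict, deque

class RequestThrottle:
    """A toy sliding-window rate limiter, for illustration only.

    Counts each user's recent requests and throttles anomalous
    bursts – the simplest form of traffic pattern matching.
    """

    def __init__(self, max_requests=100, window_seconds=60.0):
        self.max_requests = max_requests
        self.window = window_seconds
        self.history = defaultdict(deque)  # user -> recent request times

    def allow(self, user, now=None):
        """Return True if the request may proceed, False to throttle it."""
        now = time.monotonic() if now is None else now
        timestamps = self.history[user]
        # Discard requests that have aged out of the window.
        while timestamps and now - timestamps[0] > self.window:
            timestamps.popleft()
        if len(timestamps) >= self.max_requests:
            return False  # burst detected: throttle this user
        timestamps.append(now)
        return True
```

A user who stays under the threshold is never affected; one who suddenly fires hundreds of requests in a short window is cut off until their history ages out.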
But that does not seem to be the issue troubling the minds of CIOs who are already mindful of how Safe Harbour’s successor and the introduction of GDPR will affect their operations. The issue here seems to be more around data sovereignty – and what it means for an organisation’s data.
Crossing the border
The idea with data sovereignty is that data has a ‘home’, which is defined as being within the lines arbitrarily drawn on maps over the years as countries have battled and acquired land, split up and become new countries, joined larger trading blocs and started asserting their authority on the world stage.
This can result in the emergence of rules such as Germany’s, under which any data that can be used to identify an individual must, by law, be held on German soil.
Now, let’s look at that in a practical light. Data is a set of binary ones and zeros, carried as electrical or optical signals – and signals have no understanding of the concept of borders.
Therefore, to meet the requirements of such laws, either the government involved must apply data ‘walls’ around its country, or put the onus on all the organisations within its borders to ensure this information remains where it should be.
There is little chance that governments will put any effort into doing this because they have no clue how to go about it and it would be horribly expensive.
And anyway, even though a theoretical model can be built up where data sovereignty works, at a practical level, it is impossible.
The internet only works as it does because of the intelligence built into its fabric. Without this, it would slow to a crawl – or just not work at all.
A large part of this intelligence is in the use of data caching at either a core network level, whereby switches and routers store data before forwarding it on again, or at a software level through the use of content delivery networks (CDNs).
Flushing all these intelligent data caches to ensure they hold no personally identifiable information would be incredibly difficult – and would ruin the basic model of the internet.
The state of surveillance
Another top concern of CIOs, apart from fears about data loss and theft, is governments getting access to their organisation’s data.
In light of the Edward Snowden revelations about the surveillance activities of the US National Security Agency (NSA), CIOs are right to worry about such things – yes?
Well, yes and no. Firstly, look at the data your organisation holds. Is it likely that the NSA will want to trawl through your retail database to find out how many left-handed, blue-eyed, 34-year-old single mothers bought lip salve on a Saturday? Is it really going to use backdoors to access your ERP system and find out you have 20,000 pencils in your inventory, or that you haven’t paid the invoice for a boiler service carried out three months ago?
It is unlikely. Sure – if you are in armaments, oil and gas, pharmaceuticals or other markets with high-value intellectual property, you may have cause to worry about an outside government getting access to your data. For everyone else, it’s unlikely to be a problem.
What is an issue, though, is that backdoors will not be used solely by governments. As soon as the bad guys know they exist, they will throw money at finding and exploiting them.
Organisations with software or hardware featuring any government-sanctioned backdoors can expect – at some stage – to be met with a ransomware demand. Or just find that their customer details have been sold to the highest bidder without their knowledge.
Overall, data sovereignty is a red herring. There is little point in worrying about it – and it will become increasingly difficult for governments to crack down on data not held within a single country or region.
Instead, work on the basis that your data will be global, that it will be compromised at some point, and create a compliance-oriented architecture – one where all information is secured, no matter where it resides.
This is a much better approach than just being able to point at a storage array and say: “I am compliant with the law – my data is held there.” As a colleague of mine says – use a ‘strong box’, not a ‘tick box’ approach.
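The ‘strong box’ idea above can be sketched in a few lines: encrypt the record before it is stored, so that compliance rests on who holds the key, not on where the bytes sit. This sketch uses a one-time pad purely because it fits in the Python standard library – a real system would use an authenticated cipher such as AES-GCM, and the function names here are invented for illustration.

```python
import secrets

def encrypt_record(plaintext):
    """Encrypt a record before it leaves the organisation.

    One-time pad for illustration only: the key is random, as long as
    the data, and must never be reused. Real systems would use an
    authenticated cipher such as AES-GCM instead.
    """
    key = secrets.token_bytes(len(plaintext))
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    return ciphertext, key

def decrypt_record(ciphertext, key):
    """Recover the plaintext – possible only with the key."""
    return bytes(c ^ k for c, k in zip(ciphertext, key))

# The ciphertext can live in any country or cloud region;
# only the key needs guarding.
ct, key = encrypt_record(b"customer: J. Smith, DOB 1982-03-14")
```

The point is architectural rather than cryptographic: once every record is unreadable without a key the organisation controls, pointing at a storage array and saying “my data is held there” stops being the measure of compliance.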