The latest data breach at Leicester City Council, putting some of the most vulnerable in society at immediate risk from potential local predators, illustrates why it is so important to implement the recommendations of the DCMS Select Committee report, Cyber Security: Protection of Personal Data Online. Had the individual found guilty of stealing personal data from the Council earlier this year been sent to jail instead of being fined £150, it is unlikely that someone would have sent details of vulnerable children and adults to most of the taxi firms in the city – whether accidentally or deliberately. The jail sentences for unlawfully obtaining or selling personal data introduced by the Criminal Justice and Immigration Act 2008 would not prevent such egregious carelessness. But the weakness of fining the organisation, rather than the individual, for carelessness also illustrates the need to fine those who do not train their staff and audit and test their processes – as called for by the Select Committee.
See my blog post, published immediately after the report came out, for a summary of the recommendations – then read the full report. It is under 30 pages.
Unless the UK implementation of the GDPR addresses those recommendations it will not improve the situation. Data breach notification merely alerts the sharks to a pool of potential victims. Many of the other obligations will serve mainly to increase the reluctance of directors to take responsibility for what they do not understand and cannot control. Meanwhile the data protection and cyber security industries are obsessed with compliance technobabble and with expensive products and services that do little, if anything, to reduce the likelihood of serious breaches or improve the privacy of potential victims.
At the same time we are ignoring the elephant in the room. Those promoting big data business models, whether in the public or private sector, need to take responsibility for the consequences. Those intent on hoovering up large amounts of personal data (including from children’s mobile phones and smart toys) need to be held personally, not just corporately, liable for its misuse. This is particularly so in the public sector, where breaches merely lead to a merry-go-round of OPM (other people’s money).
The provision of supposedly anonymised databases for “research” purposes also needs serious scrutiny before we get too enthusiastic about Big Data research institutes. About the only people who appear “serious” about the privacy and security of our medical records, for example, are the insurance and pharmaceutical companies who come in for so much flak. Academic researchers tend to “know” that their work is so much more important – as well as lacking the necessary security knowledge, let alone budgets. Meanwhile NHS data protection depends on nearly half a million individuals having a more robust attitude to security than 40,000 German Enigma operators in World War 2. Currently most think that having accurate information available, when and where it is needed for patient care, should be more important. Are they wrong?
I also recommend pondering the Select Committee’s polite but devastating comments on the failure of the relevant IT professional bodies and trade associations to condemn innately insecure products, services and development practices. The Internet Engineering Task Force meets in London in March. I would love to see demonstrations of users outside the venue calling on those who, in practice, run the Internet to live up to their responsibilities to the rest of society, now that their “toy” is part of the world’s critical infrastructure.