Data sharing has become one of the toughest technology topics for the public sector. Our strategies are being driven by the need to gather and exchange huge amounts of personal information within and between authorities. But the majority of the most significant data loss incidents of recent times have been linked to a failure to share data properly: either through gathering and processing excessive information, or sharing it through insecure means because legacy systems do not support our current needs. We have to revisit some of our basic assumptions about service delivery if we are to move forward from our current problems.
Nobody disputes that government has a tough job to do. In an environment that demands ever-greater service efficiencies, agendas such as national security, child protection and healthcare create life-and-death situations that simply don’t exist in the private sector. The systems and processes that deliver public services are often ancient, cumbersome and so diverse that any hope of a national approach to data sharing is nigh-impossible. Our political climate is one where IT is seen as a panacea, where policy U-turns are not tolerated, and big IT projects are very much à la mode. And these challenges are compounded by a recession which is increasing efficiency pressures and stifling innovation.
But innovation is going to be our saviour. We have to embrace radical change if we are to keep pace with delivery needs, and to do this it’s time to challenge some of our assumptions about data sharing. We need to start thinking much more radically about how we architect our delivery processes. To keep things brief, I’d like to offer three examples of assumptions that must be challenged and rethought for the 21st Century:
- Our first assumption is that the way to share data is to gather it and then send copies to those who require it, within and beyond our public authorities. This approach disregards well-established database federation techniques that do away with the need to create copies, reducing data management costs and simplifying data protection compliance.
- Secondly, we assume that it is the duty of public authorities to ‘push’ tailored services to individuals – a cornerstone of transformational government strategy. But if we gave individuals the option of a ‘pull’ service – one where they retain ownership of their personal information and provide it when they need access to a service – then we would open the market to emerging Vendor Relationship Management models that could do away with the need to hold data altogether.
- Finally, the third assumption that sits heavily on the minds of central government is that National Security, Serious and Organised Crime, and Child Protection ‘trump’ all other requirements in system functionality. This is distorting the scope and focus of almost every system we set out to build, since simple requirements are almost inevitably ‘hijacked’ with additional functions to justify the business need. If a system has to deliver national security objectives, then that’s all well and good. If it doesn’t, let’s stop using those needs to justify the system’s existence.
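To make the federation point in the first assumption concrete, here is a minimal sketch. It uses sqlite’s ATTACH as a lightweight stand-in for a real federation layer (such as a foreign data wrapper); the authorities, tables and records are all hypothetical. A single query joins two separately held datasets in place – neither authority’s data is ever copied into the other’s store.

```python
import os
import sqlite3
import tempfile

# Two hypothetical authorities, each holding its own records in its own store.
tmp = tempfile.mkdtemp()
council_db = os.path.join(tmp, "council.db")
health_db = os.path.join(tmp, "health.db")

council = sqlite3.connect(council_db)
council.execute("CREATE TABLE residents (ref TEXT PRIMARY KEY, name TEXT)")
council.execute("INSERT INTO residents VALUES ('AB123456C', 'J. Smith')")
council.commit()
council.close()

health = sqlite3.connect(health_db)
health.execute("CREATE TABLE appointments (ref TEXT, clinic TEXT)")
health.execute("INSERT INTO appointments VALUES ('AB123456C', 'Eye Clinic')")
health.commit()
health.close()

# A federated query: both sources are attached and joined in place.
# No copy of either dataset is created or sent anywhere.
conn = sqlite3.connect(council_db)
conn.execute(f"ATTACH DATABASE '{health_db}' AS health")
rows = conn.execute(
    "SELECT r.name, a.clinic "
    "FROM residents r JOIN health.appointments a ON r.ref = a.ref"
).fetchall()
conn.close()
print(rows)  # [('J. Smith', 'Eye Clinic')]
```

The point of the design is that each authority remains the single custodian of its own records; the join exists only for the lifetime of the query, so there is no second copy to secure, reconcile or lose.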
So what to do about it?
Clearly the solution to this is going to be extremely complex. I’d like to make three recommendations that would, in my opinion, set us on a path towards improving data handling practices and minimising data loss incidents.
First, we need to educate policy-setters in the language of privacy, identity and security. They simply don’t have the vocabulary to discuss critical concepts of privacy and identity. Hopefully, MPs now understand that no system can be kept 100% secure, even if it does contain their expenses, but policymakers also have to understand how to specify new systems. For example, we rarely have to ‘identify’ anyone outside of a border control or law enforcement environment – instead, we need to verify their credentials. But all too often the policymakers are unable to express their wishes, and we end up building yet another ID system that gathers huge amounts of personal information unnecessarily.
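To illustrate the verify-don’t-identify distinction, here is a hypothetical sketch (not any real system’s API): a service that only needs to know whether someone is over 18 can be given a yes/no answer by whichever party already holds the date of birth, and never needs to collect the date itself.

```python
from datetime import date

def over_18(date_of_birth: date, today: date) -> bool:
    """Answered by the party that holds the date of birth; the relying
    service receives only the boolean, never the underlying data."""
    eighteenth_birthday = date_of_birth.replace(year=date_of_birth.year + 18)
    return eighteenth_birthday <= today

# The service records the verified attribute, not the personal data behind it.
print(over_18(date(1990, 5, 1), date(2009, 6, 9)))  # True
print(over_18(date(2000, 1, 1), date(2009, 6, 9)))  # False
```

The system that stores only such verified attributes has far less to lose: there is no database of birth dates to breach in the first place.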
Secondly, there is a pressing need for prescriptive security and data protection standards that can be applied across all public authorities, not just central government. All too often I come across local authorities where the IT staff aren’t vetted to a high enough level to read the Manual of Protective Security and the other standards they should be using to protect their own systems. Local government cannot afford to push all its staff through clearance and then apply Cheltenham’s requirements to all its systems. We need a pragmatic new set of rules for data management. The emerging data protection approaches from the BSI and BCS are good first steps, but policy change from the very top will be needed before they become truly useful.
Finally, I’d argue that now – in the middle of a recession – is precisely the time to innovate. We need to challenge our assumptions about what is expected of public authorities; about how we procure IT and from whom; about whether we should be collecting or sharing personal information at all. We need bold, brave thinking, and those who have to do it need to know that they will be supported if there are failures, not pilloried by the media and left out in the cold by their managers. It’s innovation that will put an end to our data loss problems, and build a platform for 21st Century information management.
[This article is the text of a panel speech I delivered at GC Live on 9th June. Many thanks to the GC Live team for the chance to speak at an excellent event.]