I attended a very good session at the RSA Conference Europe in London this afternoon, entitled “Privacy Concerns with Adopting DLP Technology”. The panel, which comprised RSA’s Katie Curtin-Mestre, FFW’s Stewart Room, and SAS’ Yngve Sunnanbo, considered the privacy implications from intelligent monitoring of the organisation’s boundary traffic.
Data Loss Prevention (DLP) takes content scanning to the next level by inspecting traffic at a number of levels (including the much-loathed DPI) to identify security risks that a regular scanner might miss. A system may, for example, look for email content that leaves the organisation at 5pm and returns in modified form at 7am, which might indicate an employee emailing work home rather than using a more secure method of transfer, then emailing it back when it's complete. Clearly this is the sort of insecure behaviour that organisations need to stop, and DLP is a valuable tool to protect the security, and hence the privacy, of information.
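To make the round-trip example concrete, here's a toy sketch of that correlation rule in Python. It's purely illustrative, not any vendor's implementation: the `MailEvent` record, the field names, and the 5pm/9am thresholds are all assumptions for the sake of the example.

```python
from dataclasses import dataclass
from datetime import datetime, time

@dataclass
class MailEvent:
    # Hypothetical gateway log record; direction is "out" or "in".
    direction: str
    sender: str
    recipient: str
    subject: str
    content_hash: str
    timestamp: datetime

def round_trip_suspects(events, evening=time(17, 0), morning=time(9, 0)):
    """Flag pairs where a message left for an external address after
    `evening`, and a message on the same subject (but with different
    content) came back from that address by `morning` on a later day."""
    outbound = [e for e in events
                if e.direction == "out" and e.timestamp.time() >= evening]
    suspects = []
    for out in outbound:
        for inc in events:
            if (inc.direction == "in"
                    and inc.sender == out.recipient          # same external party
                    and inc.subject == out.subject           # same thread
                    and inc.content_hash != out.content_hash # content modified
                    and inc.timestamp.date() > out.timestamp.date()
                    and inc.timestamp.time() <= morning):
                suspects.append((out, inc))
    return suspects
```

Even this crude version makes the privacy point obvious: to work at all, the rule has to retain sender, recipient, subject and timing data for every message crossing the boundary.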
However, like all tools, you can cut yourself with it if you use it incorrectly: DLP will automatically gather large amounts of personal and sensitive personal information, and there is a risk that organisations using it may inadvertently infringe the privacy of employees or third parties during investigations. Furthermore, the DLP log will itself be very sensitive, and must be protected appropriately.
I was particularly interested in Stewart's advice in the Q&A, in which he reiterated the importance of intention and action for data protection compliance: say what you're going to do, then do it. Stewart is the author of Butterworth's Data Security Law/Practice, so he knows what he's talking about here.* He also pointed out the importance of transparency in managing the DLP logs: the log data will, in most cases, be considered personally identifiable, and therefore subject to the Data Protection Act, including the Data Subject's right of access. In other words, the employee or data subject concerned can demand access to the information held about them in the log. Furthermore, under FoI rules, public bodies operating a DLP system should be prepared to provide statistical data about the system's logs, which might have the unintended consequence of revealing the extent of security problems encountered within the organisation.
None of this, of course, is a good reason not to implement DLP, but it's a timely warning for any organisation installing a system without having properly considered the consequences.
* Declaration of interest: I’ve no commercial link with Stewart.