The line "if you've got nothing to hide, you have nothing to worry about" is used all too often to defend surveillance overreach.
While the argument applies to some problems, it represents a very narrow way of looking at privacy, especially given the array of privacy problems bound up in government data collection and use beyond surveillance and disclosure. The "nothing to hide" argument, ultimately, has nothing to say.
A European research paper co-authored by Michael Friedewald distinguishes seven types of privacy:
- Privacy of the person encompasses the right to keep body functions and body characteristics (such as genetic codes and biometrics) private;
- Privacy of behaviour and action includes sensitive issues such as sexual preferences and habits, political activities and religious practices;
- Privacy of communication aims to prevent the interception of communications, including mail interception, the use of bugs, directional microphones, telephone or wireless communication interception or recording, and access to email messages;
- Privacy of data and image includes ensuring that individuals' data is not automatically available to other individuals and organisations, and that people can "exercise a substantial degree of control over that data and its use";
- Privacy of thoughts and feelings refers to the right not to share one's thoughts or feelings, or to have them revealed. Individuals should have the right to think whatever they like;
- Privacy of location and space means individuals have the right to move about in public or semi-public space without being identified, tracked or monitored;
- Privacy of association (including group privacy) concerns people's right to associate with whomever they wish, without being monitored.
Considering the full spectrum of privacy, people must ask themselves: Are you sure you are comfortable with all of your characteristics in the public domain?
For example, do you want people to know where you spend your time, and who you like to spend it with? Whether you have called a substance abuse counsellor, a suicide hotline or a divorce lawyer? Which websites you read daily? The religious and political groups to which you belong?
Key privacy questions
As big data grows, enterprises need a robust data privacy solution to help prevent breaches and enforce security in a complex IT environment. In an Isaca whitepaper entitled Privacy and Big Data, we identify key questions that enterprises must ask and answer, which – if ignored – expose the enterprise to greater risk and damage. Here are five of them:
- Can we trust our sources of big data?
- What information are we collecting, and can we do so without exposing the enterprise to legal and regulatory battles?
- How will we protect our sources, our processes and our decisions from theft and corruption?
- What policies are in place to ensure that employees keep stakeholder information confidential during and after employment?
- What actions are we taking that create trends that can be exploited by our rivals?
Shifting privacy standards
A further complication is that service providers can change their privacy policies at any time. Everybody remembers the Instagram case: in December 2012, Instagram claimed a perpetual right to sell users' photographs for advertising purposes without payment or notification. After a strong backlash from users, Instagram backed down.
This brings to the forefront the fact that many consumers are poorly educated about how their personal data is collected by companies and are unsure about what it is actually used for. Investigation into the recent implementation of the EU Cookie Law has highlighted how misinformed consumers in Europe are.
In response to cookie banners, 56% of users said they either accepted or agreed to the site using cookies, or ignored the notices and simply carried on. Another 17% said they typically did not give permission for cookies, even if it meant not using the site. This corresponds closely to the percentage of those who said they generally browsed the internet with cookies disabled.
All of this is soon to be compounded by wearable technology, such as Google Glass, which is essentially a phone in front of your eyes with a forward-facing camera. A heads-up display with facial recognition and eye-tracking technology can show icons or stats hovering above people you recognise, give directions as you walk and take video from your point of view.
In July 2013, Google published a new, more extensive FAQ on Google Glass. There are nine questions and answers listed under a section named Glass Security & Privacy, with several concentrating on the camera and video functionality.
But, crucially, this does not solve other privacy concerns.
Google Glass tracks your eye movements and makes data requests based on where you are looking, which means the device collects information without active permission. Eye movements are largely unconscious and carry significant psychological meaning: they can reveal who you are attracted to and how you weigh your options when shopping.
How many of you will turn off your Glass while punching in your PIN? How about when a person's credit card is visible from the edge of your vision? How about when opening your bills, or filling in tax information or a health form? Remember that computers can recognise numbers and letters blazingly fast – even a passing glance as you walk past a stranger's wallet can mean that the device on your face learns his or her credit card number. All of this information can be compromised in a security breach, revealing the information of both the Google Glass wearer and the people around them.
On 4 July 2013, Chris Barrett, a documentary filmmaker, was wearing Google Glass at a fireworks show in Wildwood, New Jersey, when he happened upon a boardwalk brawl and subsequent arrest. Recording hands-free, without an obvious camera, made all the difference: "I think if I had a bigger camera there, the kid would probably have punched me," Barrett said.
Privacy: Intrinsic right or social construct?
Privacy is entering a time of flux, and social norms and legal systems are trying to catch up with the changes that digital technology has brought about. Privacy is a complex construct, influenced by many factors, and it can be difficult to future-proof business plans so they keep up with evolving technological developments and consumer expectations.
One way to ensure there are no surprises around privacy is by seeing it not as a right, but rather as an exchange between people and organisations, bound by the same principles of trust that facilitate effective social and business relationships.
This alternative to the "privacy as a right" approach positions privacy instead as a social construct, to be explicitly negotiated so that it is appropriate to the social context in which the exchange takes place.
Isaca notes that enterprises eager to reap the benefits of big data and its vast potential must also recognise their responsibility to protect the privacy of the personal data gathered and analysed with big data. Risk management and maintaining adequate mechanisms to govern and protect privacy need to be major areas of focus in any big data initiative.
The lengthy privacy policies, thick with legalese that most services use now, will never go away, but better controls will, and should, emerge. Whatever tools are used to protect and collect personal data in the future, it will be important for companies such as Facebook and Google to educate their consumers and to provide them with options for all levels of privacy.
Yves Le Roux will be addressing privacy issues at the 2013 European Computer Audit, Control and Security (EuroCACS)/Information Security and Risk Management (ISRM) conference hosted by Isaca.
Yves Le Roux is chair of Isaca’s Data Privacy Task Force and a principal consultant at CA Technologies.
This was first published in October 2013