The Right To Have Facts Redacted (But Not Forgotten) In Certain Contexts


…or “How the Reputation Management Industry Came of Age”

Much fuss has been made in the press about the European Court of Justice’s decision that search engines (and Google in particular) must enable a ‘right to be forgotten’ - that is, that certain search results must be suppressed if the data subject can substantiate that they are no longer relevant to the search. Some of the best coverage of this comes from Chris Pounder, who reflects on the misinformation in the press coverage and points out that Google routinely informs users when results have been changed at the request of a third party.

Google has implemented the ruling, and its process requires the applicant to prove that they are the data subject (and, in all likelihood, allows Google to check that the data subject is an EU citizen) and to set out their reasons for the redaction - and a redaction is what it is: when Google removes a result, the removal is noted at the foot of the search results page, but the result itself is not provided.

The fact that Google notifies users when search results have been redacted is an important protection, and one for which Google should be applauded: without that transparency, we might never know when a change had taken place, which in turn opens up a path for censorship and manipulation. Censorship is only truly effective if it is covert; if users are made aware that something has been modified, then they at least stand a chance of tracking it down.

But the idea that personal data might be struck from a search database as a result of this ruling is a fallacy: the rate of data collection, aggregation, sharing and analysis in any search engine is such that any ‘forgotten’ (i.e. deleted) reference would most likely be repopulated within a matter of hours, rendering the original request to be forgotten redundant. So in order to comply with this requirement, Google and others will have to maintain a register of ‘redacted terms’ and possibly ‘redacted URLs’ - those search results which have been deemed forgettable.
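To make that concrete, here is a minimal sketch (in Python, with entirely invented names - no search provider publishes a ‘redaction register’ API of this shape) of how such a register might be consulted at query time. Redactions are keyed on the query term, so the page stays in the index and is merely suppressed in context, with a notice at the foot of the results along the lines of the one Google displays:

```python
from __future__ import annotations

# Hypothetical sketch of a 'redaction register' applied at query time.
# All names (RedactionRegister, the example URLs) are invented for
# illustration; this is not Google's actual implementation.

class RedactionRegister:
    def __init__(self) -> None:
        # Maps a redacted query term to the set of URLs that must not be
        # returned for that term. Redaction is contextual: a URL is only
        # suppressed when the query contains the registered term.
        self._redactions: dict[str, set[str]] = {}

    def add(self, term: str, url: str) -> None:
        self._redactions.setdefault(term.lower(), set()).add(url)

    def filter(self, query: str, results: list[str]) -> tuple[list[str], int]:
        """Return (visible results, number suppressed) for this query."""
        blocked: set[str] = set()
        for term in query.lower().split():
            blocked |= self._redactions.get(term, set())
        visible = [url for url in results if url not in blocked]
        return visible, len(results) - len(visible)


register = RedactionRegister()
register.add("smith", "https://example.com/1998-court-notice")

results = ["https://example.com/1998-court-notice",
           "https://example.com/unrelated-profile"]
visible, suppressed = register.filter("John Smith", results)
print(visible)
if suppressed:
    # Mirrors the notice shown at the foot of the results page.
    print(f"{suppressed} result(s) removed under European data protection law.")
```

Because the filter sits at query time rather than in the crawler, the page can be re-indexed hourly without undoing the redaction - which is exactly why this is redaction rather than forgetting.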

That gives rise to the inevitable question: who determines what counts as a reasonable case for taking down a search result? Google has an advisory committee that oversees the process, and in a matter of days it has had to preside over 12,000 requests and counting. That’s too many requests for any sensible scrutiny of each one, so it’s reasonable to assume the bar for accepting takedowns will be set either very high or very low.

And how is the validity of a takedown request to be judged? For example, imagine that a celebrity broadcaster with a history of charitable works is convicted of a string of sexual assaults. Should the individuals he supported be able to take down search results that associate their names with his? I imagine the broadcaster would want his charitable works to remain on record, and he might even lodge his own takedown request, so that a search on his name together with the name of a beneficiary of his charitable work does not return results showing his conviction.

That’s not a process that will operate easily at Internet scale.

Some commentators have suggested that this is the end of free speech on the Internet, and that politicians and corporations will use the ruling as a way to stifle or manipulate speech. That’s certainly a potential risk, particularly if this ruling were to stand (it will be challenged), if it were applied to all search facilities (e.g. within newspaper websites), and if search engines were to cease notifying users of modifications to search results. But the Internet has a habit of finding its way around such obstacles, and I’m confident it will this time as well.

The most significant outcome, at least in the short term, is likely to be the benefit to reputation management companies, which will be able to sell ‘right to be forgotten’ services to individuals: the data subject notifies the company, which in turn notifies all the major search providers and then checks that each has complied. Search providers will probably welcome such a service if it saves them from having to operate their own advisory committees.
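If such services do emerge, their core is little more than a notify-then-verify loop. Here is a hedged sketch, again in Python and again with invented interfaces - no real provider exposes a submit_takedown endpoint of this shape; each has its own web form:

```python
from __future__ import annotations
from dataclasses import dataclass

# Hypothetical reputation-management workflow: notify each search
# provider of a takedown, then re-query to check it took effect.
# SearchProvider is a stand-in invented for illustration.

@dataclass
class TakedownRequest:
    data_subject: str   # verified identity of the requester
    term: str           # query term the result should not appear for
    url: str            # the search result to be redacted
    justification: str  # the reasons put forward for redaction


class SearchProvider:
    def __init__(self, name: str):
        self.name = name
        self._redacted: set[tuple[str, str]] = set()

    def submit_takedown(self, req: TakedownRequest) -> None:
        self._redacted.add((req.term, req.url))

    def search(self, term: str) -> list[str]:
        # Toy two-page index, filtered against accepted redactions.
        index = ["https://example.com/old-story", "https://example.com/other"]
        return [u for u in index if (term, u) not in self._redacted]


def notify_and_verify(req: TakedownRequest,
                      providers: list[SearchProvider]) -> dict[str, bool]:
    """Notify every provider, then confirm the URL no longer appears."""
    for p in providers:
        p.submit_takedown(req)
    return {p.name: req.url not in p.search(req.term) for p in providers}


providers = [SearchProvider("ProviderA"), SearchProvider("ProviderB")]
req = TakedownRequest("J. Smith", "smith",
                      "https://example.com/old-story", "no longer relevant")
print(notify_and_verify(req, providers))  # {'ProviderA': True, 'ProviderB': True}
```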

So, the ‘right to be forgotten’? Not a very accurate description. I’d like to propose the ‘right to have facts redacted (but not forgotten) in certain contexts, until we figure out a better way to live with our mistakes’ as a more meaningful and useful term.*

* And one which demonstrates why I’ve never pursued a career in product branding.
