Google privacy check gives greater cause for concern

Google gave fresh cause for concern today when it offered an opt-out of its web tracking by changing the privacy settings in Gmail.

I took up its offer to opt out of various schemes it has to track people’s web browsing and to use the content of my email communications to tailor its advertising and shape what I see when I use its search engine.

The last point is the most pertinent. It has been disconcerting for a while to see both Google and Twitter appear to shunt things in front of you that happen to coincide with topics of private email conversation.

Google’s privacy opt-out confirmed, for anyone too busy to check whether the blogosphere had already nailed the question, that the answer is yes: Google does tailor results according to the content of your emails.

The opt-out was most welcome – commendable, really – because it gave hope that such behavioural profiling as might grow from Google’s tailored search results into other areas of life might after all become an option and not an imposition. It might even be possible that the coming age of behavioural and psychological profiling will deliver tools we can choose to put in our hands rather than a tyranny wielded by others.

Then Google offered to install a ‘Google Analytics Opt-out Browser Add-on’ to stop my web browsing being tracked by websites that use Google software for just that purpose. On installation, Firefox said ‘don’t install this add-on if you don’t trust its author’.

Dick Dastardly

Some Firefox add-ons get certified by Mozilla, Firefox’s publisher. To the casual user, this is a guarantee that the add-on software has been analysed to ensure it doesn’t steal your personal information or leak your browsing history to some tracking service or another.

For a journalist, this is a source of niggling fear.

…As it might be for campaigners who take on powerful corporations or states, or for revolutionaries disgusted by entrenched poverty and inequality of opportunity, or anyone who irritates comfortably powerful people. So you use Firefox add-ons cautiously. Partly because Firefox tells you to. And when an author like Google, which especially needs to convince you it can be trusted, says, ‘install my add-on: it’s good for you’, you really expect to see that it is certified.

Whether you trust the author is an interesting question in respect of Google. Firefox’s community of add-on authors has produced numerous blockers designed to stop Google Analytics and other web analytics software from tracking what you do when you browse the web.

They are like a bunch of Heath Robinsons all offering slightly different ways of getting to the moon. You don’t really expect any of them to get there. But you are glad they are having a go. You meanwhile browse the web with the expectation that you could be tracked by anyone who cared enough to do it.

Then Google enters the Heath Robinson Race to the Moon with its ‘Google Analytics Opt-out Browser Add-on’. It is as though Dick Dastardly had appeared with an Acme Rocket Car, beckoning: ‘Hey kids – don’t waste your time with those bozos! Come along with me!’.

The result of all this has been a genuine cause for concern. One of the most welcome settings in Google’s privacy options this morning offered to stop it tailoring its web search results to your behavioural profile.

“Our automated systems analyse your content (including emails) to provide you with personally relevant product features, such as customised search results, tailored advertising and spam and malware detection,” says its privacy policy.

Befooling users

While it might encourage some that Google works to give them search results related to their prior behaviour before they see anything else, it also raises the possibility that there is information it excludes from results.

It raises the possibility that there are parts of the net invisible to anyone deemed not to have the authority to see them, just as there are services denied to people without the credit history to use them.

If they haven’t driven before, it’s hard even for a middle-aged woman to get insurance to learn in a Golf GTI. It sounds silly that someone might learn to drive in a GTI. But really it is ridiculous to stop someone doing so if that happens to be the car they have. As blunt instruments, insurance algorithms are inherently unjust.

Door men

The injustice inherent in algorithm-driven social systems will cause greater concern in years to come, when the likes of the government’s ‘Verify’ online ID system become established. It works by assessing people’s entitlement to do something according to the behavioural history and biographical credentials attached to their identity. If your name is on the list but it doesn’t have ticks by the right boxes with the credit reference agencies and data aggregators, you ain’t gettin’ in.
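
As a rough illustration only – the data sources, weights and threshold below are invented for the sketch, and nothing here describes how Verify actually works – such a check amounts to summing ticks against a threshold:

    # Toy sketch of an algorithm-driven entitlement check.
    # All sources, weights and the threshold are hypothetical illustrations,
    # not a description of Verify's actual implementation.

    CHECKS = {
        "credit_reference_agency": 0.4,   # weight given to a credit-file match
        "electoral_roll": 0.3,            # weight given to an electoral-roll match
        "data_aggregator": 0.3,           # weight given to a commercial data broker match
    }

    THRESHOLD = 0.7  # the score a claimed identity must reach to be let in


    def entitled(claimed_identity: dict) -> bool:
        """Return True if enough boxes are ticked for the claimed identity."""
        score = sum(weight for source, weight in CHECKS.items()
                    if claimed_identity.get(source, False))
        return score >= THRESHOLD


    # Someone on the list, but without the right ticks, ain't gettin' in:
    print(entitled({"credit_reference_agency": True}))                          # False
    print(entitled({"credit_reference_agency": True, "electoral_roll": True}))  # True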

So what do you get to see when you do a Google search nowadays? While doing extensive web research last year on the global network system that drives military drone strikes in the Middle East, it became apparent to me that Google search results were tailored according to the specificity of your search terms.

So Google might deliver no results at all if you searched for ‘Rocket Car’. But it would give results if you searched for ‘Dick Dastardly Rocket Car v.1.54 Specifications and Patented Chuff Propulsion Schematic’.
Specific search terms that contained key words unique to a particular domain of knowledge produced results, while more general search terms did not, even when they should because they logically encompassed everything that might appear in the more specific search.
Google showed what you were looking for if you knew what you were talking about – if you could use the language of a domain expert – of an insider.

Less than trustworthy

So Google’s offer today of an opt-out from tailored search results offers a little assurance, but rather less than certainty of trust.

What might Google exclude from your search results if you opted out of its service to filter them according to your behavioural profile?

What might it exclude if you opted in?

Exploration of this question may lead one to the conclusion that Google’s database must be turned into a public asset that can be searched without condition. Or users must simply be given power to set the search conditions themselves.

That wouldn’t negate the utility of a tool that filtered results according to the domain in which a user operated. A bus driver searching for ‘engines’ might prefer to see different results than a computer programmer.

But Google will have broken Silicon Valley’s promise of a classless, democratic society if it has not one day, hopefully soon, given all users the means, regardless of their own biographical credentials, to adopt the identity of any class of user, see what they see, and submit search terms such as:

‘show me what a bus driver sees when he searches for information about engines’

… or …

 ‘show me what the prime minister sees when he searches for information on the error rate of drone strikes targeted against suspected insurgents in the Middle East’.

These are hypothetical examples. Still, if Google fails to deliver such tools then the time may have come for peer-to-peer search engines that operate beyond the reach of corporate and political control.
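
No such feature exists, but as a sketch of the kind of interface being asked for – a search call parameterised by the class of user whose view you want to borrow – it might look something like this (the function, types and parameters are invented for illustration, not any real Google API):

    # Hypothetical sketch of a persona-parameterised search. The idea is that
    # the searcher, not the profiling engine, chooses whose behavioural
    # profile filters and ranks the results.

    from dataclasses import dataclass


    @dataclass
    class Persona:
        occupation: str    # e.g. "bus driver", "prime minister"
        interests: list    # the behavioural profile to borrow


    def search_as(query: str, persona: Persona) -> list:
        """Return results as they would be ranked for the given persona.

        A real implementation would re-rank a shared index against the
        persona's profile; here we only show the shape of the call.
        """
        return [f"[ranked for a {persona.occupation}] result for: {query}"]


    # 'Show me what a bus driver sees when he searches for information about engines'
    print(search_as("engines", Persona("bus driver", ["routes", "diesel engines"])))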

As it happens, the sort of algorithm-driven assessments that determine whether the sum of someone’s biographical credentials amounts to enough of an entitlement for a government ID system to give them access to some service or some domain of privilege were originally designed by a collaboration of tech companies that included Google. They were designed to give states and corporations the means to assess, within a given range of probability, whether someone’s claim to some service or privilege was genuine.

Similar assessments determine whether the network of evidence against some suspected insurgent is great enough to guarantee, within a range of probability, that they are indeed a dangerous terrorist bent on causing imminent harm to the people around them – and that it would be safer for those people, given also the range of error inherent in any given missile strike under given mission conditions, that the suspect be executed immediately, without trial, with a missile fired from a drone. The White House has refused to state what it deems an acceptable rate of error in these operations. Computer Weekly asked. It said no comment.

Ironically, the people-powered information revolution promised long ago by our present prime minister, hand-in-hand with Google, has not yet given the people the same algorithmic assessment of the probability that their results were complete and their entitlement lacking.

Join the conversation

1 comment

Google uses SQL.

SQL (Structured Query Language) obtains patterns from queries, and statistics on how often they are used; the queries have nothing in common with the data itself – they are EXTERNAL.

Google spies on the Internet for descriptions that are EXTERNAL to the data.

I, however, discovered and patented how to structure any data without SQL or queries – INTERNALLY: language has its own INTERNAL parsing, indexing and statistics, and can be structured INTERNALLY. (For more details, please search on my name, ‘Ilya Geller’.)

My method obtains all required patterns and statistics from the data INTERNALLY; it does not need any EXTERNAL information or any spying. For instance, take two sentences:

a) 'Sam!’

b) 'A loud ringing of one of the bells was followed by the appearance of a smart chambermaid in the upper sleeping gallery, who, after tapping at one of the doors, and receiving a request from within, called over the balustrades -'Sam!'.'

Evidently, ‘Sam’ has a different importance in each sentence, owing to the extra information in the second. The distinction is reflected in the weights of the phrases containing ‘Sam’: the first has 1, the second 0.08. The greater weight signifies stronger emotional ‘acuteness’, where the weight refers to the frequency with which a phrase occurs in relation to the other phrases.
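
[Editor’s worked example: a rough reading of the arithmetic the comment describes, on the assumption that a phrase’s weight is one divided by the number of phrases in its sentence – an interpretation, not the commenter’s actual method.]

    # Assumed weighting: weight of a phrase = 1 / (number of phrases in the sentence).

    def weight(phrase_count: int) -> float:
        return 1.0 / phrase_count

    print(weight(1))             # 1.0  -> the bare sentence "Sam!"
    print(round(weight(12), 2))  # 0.08 -> "Sam!" buried among roughly a dozen phrases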

My technology converts the data into a format understandable to a computer.

SQL cannot produce the above statistics – SQL is obsolete and out of business.

Google is out of business.
