Fake News - The Movie

A video of yesterday's ISOC England round table on Fake News is now available on-line. It is unedited and therefore begins, as do most webcast events, with some embarrassing technical problems, but stay with it. It was a good discussion, ably stirred by Maria Farrell, with points emerging that I had not heard before, let alone juxtaposed. In my own first slot I reprised the arguments in my previous blog on the subject, but by the end of the round table my views on the points of leverage had moved on – partly in the light of what I heard from my fellow panelists, but also as a result of points made by a very well-informed audience, some of whom had serious skin in the game. I apologise for grinning at the camera and thus giving the impression that I thought the topic a big joke. Some subjects, like religion, are too serious to be taken seriously: you end up either crying or wanting to kill someone. I also apologise for getting the numbers wrong in my off-the-cuff illustration of how modern-day fake news kills.

What should you watch out for in “Fake News: the Movie”?

Karim Palant gave a very measured account of Facebook policy which left me wondering what more we really want them to do. The idea of Facebook (or its lawyers, or a panel of "the great and the good") acting as a censor, deciding what should be carried, left me cold. I prefer the view that they should improve their processes for rapid response to well-founded (???) complaints and better enforce their own terms and conditions, plus their obligations under the e-Commerce Directive.

Dominic Connor described the squeeze on "professional" journalists. On-line news services commonly receive less than 10–15% of what well-known brands pay to have their products and services advertised alongside breaking news; ISPs, search engines and the many and various ad-tech "intermediaries" take the other 85–90%. Meanwhile the increasing pressure to be first in an on-line environment means that those who pause to check the provenance of a story do not get the clicks.

After Dominic's comments my perception of the reasons for the rising tide of palpably fake news began to change. Most of it is driven by the business models of the $16 billion pay-per-click ad-tech fraud industry. The Macedonian teenagers who concocted the fake news stories about Hillary Clinton and Donald Trump monitored which headlines got the most clicks, then produced more of the same. Most of us cannot tell the difference between fake and real by looking at the supposed source of the story: we are lured into fake-news sites (e.g. imitations of MSN News) via links to apparently reasonable stories and then led on to the bizarre. Meanwhile the next generation is no better than we are at telling what is real. Lacking experience of past propaganda, they may be even more gullible.

But a collapse in confidence is imminent. Diageo, Jaguar or Procter & Gamble may not be too worried about fake political news – but they are concerned when their adverts appear alongside the jihadist videos and porn which collect clicks from under-age teenagers. Hence the pressures on Google, Facebook and Twitter to prevent pay-per-click ad fraud from killing the geese that lay their golden eggs. That priority indicates the approaches we can expect them to take – beginning with trying to use technology to identify and block the 60–70% of traffic generated by botnets, at the same time as making it easier for legitimate users to report abuse.

That leaves us with "traditional", non-automated fake news, including the gulf in "perceptions of reality" between the Internet Digerati (a subset of the Western Liberal Elite) and the majority of humanity. It was during an exchange over who was telling the truth and who was censoring whom with regard to Brexit that I had a second revelation: how that gulf had opened up in the UK. [In the UK it is between the metropolitan elites and the rest. In the US it is more between the outward-looking coastal elites and the introverted, rust-belt middle.]

In the UK a combination of the BBC and the Internet wiped out the local newspapers which used to train English journalists to understand and reflect the prejudices of their readers, as opposed to those of the politically correct, liberal, metropolitan elite. In parallel we have seen gaps open up in the ability to share news and views (whether true or false) over social media: e.g. between students with gigabit services on campus, middle-class teenagers with passable broadband to their smartphones, and socially deprived NEETs stuck with "crap (copper, rust, aluminium and other pollutants) band" and notspots in inner cities and rural areas.

Most of the older age groups never did believe what the London newspapers and the BBC told them, while backbench MPs worried more about what their constituency newspapers said than about the Times or the Guardian. Now the local newsprint that helped us understand regional differences has gone. We are left with an illusion of homogeneity, plus a reliance on on-line social media with its susceptibility to being overwhelmed by botnet-multiplied news and views from "who knows who…". The success of Daesh in using on-line media to recruit disaffected youth illustrates the vulnerability of Western society to a latter-day Goebbels. The appeal of Pied Piper politics to those with no memory of the socialist dictatorships (both central and local government) of the 1960s and 1970s is no fake-news joke.

My concluding remarks came after those of Gabrielle Guillemin of Article 19. The last time I appeared on a platform with someone from Article 19 was back in 2000, at an event hosted by the Freedom Forum, when the topic was "What Price Freedom?" To my surprise the transcript is still available on-line. My comments show their age. I then expected CISCO and IBM to support secure walled gardens. I was, however, all too right about the inability of law enforcement to respond, and also right about authentication and identity being the key. But I predicted e-zombie status (i.e. no credit) for those insisting on anonymity. I did not foresee the number of operations offering anonymous automated money-laundering services, using a variety of technologies, not just Bitcoin. One of the tragedies, however, is that the linked anonymity services are being used by national security agencies to track those who think they cannot be identified.

One of the most perceptive comments at the ISOC event came from a 14-year-old whose main fear was that his secure anonymous persona might be linked to his home address and someone would come round to beat him up. It was interesting to link his concerns to those of parents and police, which I reflected in my recent Snapchat blog.

Meanwhile Joanna Kulesza, who introduced the discussion, was disappointed that there was not more discussion of algorithms. I think that after two hours we deserved a drink – but this does indeed deserve a discussion of its own, including, of course, the fake news about the assumptions behind them, how they behave in consequence, and who is competent to use them, let alone interpret the results. As a sometime student of Andrew Ehrenberg (who described the American approach to modelling as the scientification of non-knowledge) I used to be expert in unravelling complex algorithms to reveal the two or three unknowable assumptions on which they depended. But now my brain hurts when I try. I will leave that task to others – I simply ask who gets sued if the consequences of believing in the "answers" turn out to be disastrously wrong.