Yesterday I blogged on the irony of releasing “The Intercept” between the 70th birthday of Colossus and Safer Internet Day. Today I am delighted to be able to blog on the release of the summary and transcript of the round table organised by the Digital Policy Alliance in the run-up to Safer Internet Day. The round table was chaired by Diana Johnson MP (Shadow Minister for Home Affairs) and was unusual in that it may have produced immediate results: two members of the audience had to leave early to introduce an amendment to the Children and Families Bill, with cross-party support, using some of the material presented.
I reproduce the summary report below, but I do recommend you click through to the full transcript and ponder the detailed comments by Chris Ratcliffe (Portland TV) on the current state of play with regard to age verification, by Peter Davies (outgoing head of CEOP) on some of the legal loopholes that need to be closed, and by Sally Leivesley (Newrisk) on the need to take action on the videos promoting suicide bombing as a career of choice for devout Muslim girls.
The discussion on over- and under-blocking was most informative, but there was unanimity on the need for a follow-up event to discuss other ways forward. The DPA is now working on this. Those who are serious about helping balance Internet safety and freedom should join and help with the planning – not just the subsequent debate.
DISCUSSION MEETING: RECENT DEVELOPMENTS IN CHILD INTERNET SAFETY
Children are increasingly subject to unsuitable material on the open web ranging from pornography, through suicide forums and terrorist grooming sites to bullying and blackmail over social networks.
They are active targets for paedophiles operating anywhere in the world. Material can be legal in other parts of the world but not in the UK. The exchange of illegal material, including child abuse images, increasingly takes place across closed and peer-to-peer networks. Children often use these to download pirated films and music, and thus become familiar with the dark side of the Internet.
There has been an explosion in the scale of web traffic and a proliferation of sites hosting child-inappropriate material. There are currently 220 million hits a month in the UK alone on the main soft and hard pornography websites. Children are accessing the Internet at an ever younger age – with a high proportion (37%) of 3–4-year-olds regularly going online.
Tackling the Issue:
Following Prime Ministerial initiatives in July and November 2013, the government called upon Internet Service Providers, social media companies and Wi-Fi operators to make access to unsuitable sites via the open web harder. It has also stimulated proactive initiatives to tackle illegal content and perpetrators on the closed web and peer-to-peer sites.
As a result, this year 20m homes will have internet filtering applied by default by the four leading Internet Service Providers, 90% of all public Wi-Fi networks will have family-friendly filters introduced and, since November, Google has blocked over 100,000 child sexual abuse related search terms. The UK is the first country to adopt such widespread measures, so in some ways this is a massive social experiment.
The discussion looked at the practical implications and overall effectiveness of filtering and blocking initiatives. Filtering was criticised for being over-effective – blocking legitimate and useful content such as LGBT (lesbian, gay, bisexual and transgender) community resources and sexual health and education pages, or restricting access to legitimate commercial sites and services. Filtering was also criticised for being under-effective, because its restrictions are often easily circumvented by computer-literate children. Furthermore, given that filtering does not prevent online bullying, self-harm or suicide, terms such as “one click to safety” are misleading, giving parents a false sense of security.
There is a need for education to ensure widespread parental understanding of the dangers and the signs to look for. However, there are no universally agreed benchmarks or criteria for deciding what is and what is not suitable. Views on what constitutes appropriate or inappropriate content for children of different ages can vary substantially from household to household.
Effective age verification was identified as a key factor in restricting children's access to unsuitable sites. The gambling and licensed pornography broadcast industries employ robust age verification systems, and there was a call to adopt these methods more widely. On the other hand, they are regarded by other industry sectors as too expensive and complicated to implement.
One third of unsuitable child imagery is created by children themselves, and predators who exploit such images for extortion are not subject to the law of blackmail because of the limitations of Section 21 of the Theft Act 1968. Other instances of narrow legal definitions preventing action against abuse were cited.
The Digital Policy Alliance will follow up on these issues in further meetings.
P.S. I should add that I did some checking after the event and discovered that those who claim that age verification is too expensive and complex for widespread use are usually unaware of how cheap and comprehensive it now is. I was surprised to discover that some services charge under £1 (after high-volume discounts) to do a more thorough check on first-time visitors than most banks do on new customers.
The “real” reasons for not doing so appear to be:
1) Fear that checking will remove the “innocent carrier” defence (under the e-Commerce Directive and other legislation).
but perhaps more significantly
2) It destroys the “drive-by click per view” advertising revenues on which “free” porn (and many other) services now depend – because the checking routines strip away anonymity.