
Tech companies should not be under legal duty to remove terrorist material, says watchdog

The internet should not be a safe place for terrorists, but making it compulsory for technology companies to trawl for radical content risks interfering with the rights of innocent people, says Max Hill QC, the independent reviewer of terrorism legislation

The UK should not follow the lead of other countries by introducing laws to force technology companies to remove extreme terrorist material from social media sites, says the UK’s independent reviewer of terrorism legislation.

Max Hill QC, speaking in his final week as terrorism watchdog, said the internet should not be a “safe place” for terrorists, but placing a statutory duty on technology companies to trawl for radical content might interfere with the rights of innocent people.

It was not always easy to distinguish between terrorist material presented in a religious context and people distributing mainstream religious texts, “which we should interfere with at our absolute peril”, he said.

Hill was speaking at a meeting of the Human Rights Lawyers Association on 10 October – the day his final report on the operation of the UK’s terrorism legislation was published.

Placing legal obligations on technology companies to prevent extreme content appearing online might catch a lot of extremist material, he said, “but it would also catch a lot of other innocent and certainly non-terrorist activity, and that is something we need to avoid”.

There is a danger of driving offensive terrorist material onto the “dark web”, where it would be harder for law enforcement agencies to detect.

Jail sentence for viewing terrorist material online

Hill’s report raised concerns about parts of the Counter-Terrorism and Border Security Bill 2018 – currently going through Parliament – which could have unintended consequences.

Clause 3 of the draft bill introduces a new offence of viewing terrorist material online, punishable by up to 15 years in prison. This would put people at potential risk even if they were not collecting information for a terrorist purpose.

Until now, it has been an offence to download terrorist material, such as videos or magazines, but it is not an offence to view it without downloading.

Hill said “real care” was needed in the new legislation to ensure “we are not interceding against people when they have done nothing more than think”.

He said he was troubled by clause 3 – the “so-called three clicks offence” – which is designed to establish a pattern of behaviour of viewing terrorist material.

A few weeks ago, the government – in response to the committee stage in the Commons – said it would drop the three-clicks threshold, as a single click would be enough. “To which my question is, ‘How can you establish a pattern of behaviour with a single click?’ I don’t think you can,” said Hill.


The government’s argument is there is the reasonable excuse defence, and there is prosecutorial discretion. However, Hill said it would be better to “take clause 3 away and try again”.

“To intercede and place someone on trial when they haven’t even downloaded, when it’s one click, rings all sorts of alarm bells, in terms of legitimate research, legitimate use, legitimate freedoms,” he said.

“We may not like the outer limits of free speech, we may not like what people are saying. We intercede in any number of ways, but we don’t prosecute,” he said.

Internet companies needed wake-up call

When Hill first took up the post of independent reviewer – a role which has overseen the Terrorism Act 2000, the Terrorism Act 2006, the Terrorism Prevention and Investigation Measures Act 2011 (TPIM) and the Terrorist Asset-Freezing etc Act 2010 (TAFA) – in March 2017, the big five internet companies were not doing enough to tackle online terrorist material and needed a “wake-up” call, he said.

The former home secretary, Amber Rudd, has worked with industry groups, including the Global Internet Forum on Counter Terrorism (GIFCT), to encourage their use of technology to identify and take down extremist content from the internet. 

The forum, which was founded by Facebook, Microsoft, Twitter and YouTube in June 2017, is creating databanks of extreme material.

“The process of databanking extreme material so that everyone can take it down is clearly what is needed,” he said. “The trouble is the internet is so huge, it takes time to databank.”


The Counter Terrorism Internet Referral Unit (CTIRU), part of the Metropolitan Police, which provides intelligence to police forces across the UK, has removed more than 300,000 items since it was established in 2010.

“I have sat side by side with those officers. It is amazing that they spend day in, day out surfing the web, looking for extreme content, issuing take-down requests,” he said.

Internet companies have been slow to respond to take-down requests in the past, but Hill said that when he last looked at the issue four or five months ago, internet companies were taking down terrorist material within 45 minutes of a request.

“The process of databanking worldwide will help. More cooperation with law enforcement, such as CTIRU, will help, and education will help, together with robust efforts to place a counter-narrative alongside this viral narrative,” he said.

Although the draft Counter-Terrorism and Border Security Bill has its problems, the government has not come forward with “swingeing actions” targeted at internet companies.

“Not all of the Western European jurisdictions have been so restrained. In Germany, they have imposed [obligations]. I am glad we don’t here, so I give credit where credit is due,” said Hill.

He said he hoped there would be a new independent reviewer of terrorism legislation in place by January 2019, but it could take until February or March if the new appointee requires security vetting.

Hill said he had tried to “load all of my thoughts on the website and the reports, so there is something for people, including parliamentarians, to look at”.

He said not all of his recommendations had been accepted by government, but his job was to build up an “index of independent scrutiny” so that parliamentarians in the future can look back and reconsider.

Max Hill

Max Hill was born in Hertfordshire, and attended the Royal Grammar School in Newcastle when his family moved to the North of England.

After toying with the idea of studying history, in 1983 he won a scholarship to St Peter’s College, Oxford, to read law, propelling him to a career at the bar.

While the rest of his peers became solicitors, Hill, uniquely in his year, opted to become a barrister.

“Here we are 30 years later and I am the only one left in private practice of any sort, so I would claim I made the right decision,” he said, speaking at a meeting of the Human Rights Lawyers Association.

He has practised a wide range of law, moving chambers twice during his first 10 years, before moving to what became Red Lion Chambers, where he specialised in prosecuting terrorism and crime cases. He became head of chambers in March 2017.

He took over from David Anderson as independent reviewer of terrorism legislation in March 2017. Within three weeks of taking up the post, the Westminster Bridge attack took place, followed by four further terrorist attacks, culminating in Parsons Green in September 2017.

Despite the pressures of scrutinising the police work covering a series of horrific attacks, Hill does not regret taking the job when he did.

“It is the best time to do this job because it has shown the system under pressure. I have been able to at least attempt to draw out the imperfections and give praise where it’s due, for example the police investigation over a really intense period of activity,” he said.
