Technology companies should introduce measures to protect children from online abuse before they are compelled to do so by law, an expert on child safety warned last night.
John Carr, secretary of the UK Children’s Charities’ Coalition on Internet Safety and a backer of a government campaign warning of the dangers of encryption, said tech companies would eventually be compelled by law to introduce technology to identify child abuse.
His comments came as the government expanded the draft Online Safety Bill to make tech companies legally responsible for proactively policing posts that include revenge porn, hate crime, fraud, the sale of illegal drugs or weapons, the promotion or facilitation of suicide, people smuggling, and sexual exploitation.
The bill gives communications regulator Ofcom powers to issue “technology notices” that require social media companies to install “accredited equipment” to identify child abuse and terrorist content that could be sent through, for example, encrypted messages.
But Carr, who backs the government-funded No Place to Hide campaign, which is pressing social media companies, particularly Facebook, to delay the introduction of end-to-end encrypted (E2EE) messaging services, said that although there was no guarantee the law would be passed, companies would eventually face legal compulsion to protect children.
Tech companies will be ‘compelled to protect children’
“The Online Safety Bill has not yet been presented to Parliament, we do not know when it will pass or, if there is an early General Election, whether it will pass at all any time soon,” Carr told Computer Weekly.
“Politics is an uncertain world, whereas companies can act now, if they choose to. It would be a great shame if we all had to wait until they are compelled to protect children, which eventually they will be.”
Carr, an expert adviser on online child safety, said end-to-end encryption on social networks, particularly Facebook, posed particular risks to children because it allowed abusers to identify and contact children without knowing their telephone number.
“Facebook currently makes 94% of all reports of suspected online child abuse, therefore if they stop doing this, it will of course have a huge impact on the ability to identify and stop child sexual abuse,” he said. “There are estimates that if Facebook proceeds as planned, 14 million reports of suspected child sex abuse online will be lost.”
Signal, WhatsApp and Telegram
Carr said it was impossible to know whether other encrypted messaging services, such as Signal, WhatsApp and Telegram, were being used for child abuse.
“There is no way of knowing what’s even at the top of the iceberg, precisely because these platforms cannot detect child sex abuse material being shared,” he said.
The campaigner cited Hany Farid, a computer scientist at the University of California, Berkeley, who helped to develop Microsoft’s PhotoDNA technology, which is able to identify known child sexual abuse photographs from a database of hashes.
Farid argued in an op-ed in Wired magazine in 2019 that PhotoDNA and similar technologies could be used in conjunction with specialist encryption algorithms to match photographs in encrypted data.
“This analysis provides no information about an image’s contents, preserving privacy, unless it is a known image of child sexual abuse,” he wrote.
Carr said it “simply is not true” that it is not possible to scan messages before they are encrypted without weakening the security of encryption.
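The client-side scanning approach described above can be sketched in a few lines. This is an illustrative simplification only: the hash values and function names are invented, the `encrypt` stub is a placeholder rather than real cryptography, and real systems such as PhotoDNA use robust perceptual hashes that survive resizing and re-encoding, not the cryptographic hash used here. The point it shows is structural: the check happens on the device before encryption, so the encryption scheme itself is untouched.

```python
import hashlib

# Hypothetical database of hashes of known abuse imagery (values invented
# for illustration; in practice these would come from a curated database
# such as the one behind PhotoDNA).
KNOWN_HASHES = {hashlib.sha256(b"known-image").hexdigest()}

def encrypt(data: bytes) -> bytes:
    """Placeholder for the client's E2EE encryption step (stand-in only)."""
    return data[::-1]  # not real encryption; marks where E2EE would apply

def send_message(image: bytes):
    """Hash and check the image *before* it is encrypted.

    Returns the 'ciphertext' for clean content, or None when the image
    matches a known hash (where a real client would block or report).
    """
    if hashlib.sha256(image).hexdigest() in KNOWN_HASHES:
        return None  # match found: do not send
    return encrypt(image)  # no match: encrypt and send as normal
```

Because only membership in the known-hash set is tested, the scan reveals nothing about images outside the database, which is the privacy property Farid’s op-ed argues for.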
Online Safety Bill will lead to ‘security backdoors’
However, a paper produced by the Internet Society last month argued that the consensus among technical experts is that there are currently no technical solutions that provide access to private communications without weakening security.
“The creation of a backdoor for law enforcement access also creates a common gateway that criminals and hostile state actors can use,” it said.
If the draft Online Safety Bill is implemented in its current form, providers would face the “impossible task” of creating backdoors to encryption while attempting to keep them secure from hackers, said the study.
“Providers would likely need to have encryption engineers on constant standby to respond to attacks that will occur due to vulnerabilities created by the backdoor,” it added.
The implementation of the bill could also motivate developers to design algorithms that they can easily weaken to comply with the Online Safety Bill, according to the Internet Society study.
This could open up communications systems to vulnerabilities that could be attacked by third parties.
For example, in 2015, Juniper Networks announced the discovery of an unauthorised backdoor that had allowed third parties to decrypt data passing through its systems for three years.
Technical experts attributed the fault to Juniper’s use of the Dual_EC encryption algorithm that had been re-engineered to give the US National Security Agency (NSA) “exceptional access” to encrypted communications.
Robin Wilton, director of internet trust at the Internet Society, said that although the draft Online Safety Bill does not mention encryption directly, if passed, it could lead to tech companies withdrawing secure encrypted services from the UK market.
The government introduced additional measures in the Online Safety Bill this week requiring tech companies to proactively police revenge porn, hate crime, fraud, the sale of illegal weapons, the promotion of suicide and sexual exploitation.
“This is essentially a colossal exercise in ‘scope creep’,” he told Computer Weekly. “Having used child abuse as the ‘thin end of the wedge’ to open the door to digital surveillance, the Home Office is now loading the bill with every other offence it tried to use before.”
Wilton said that if the bill is implemented, products and services marketed in the UK will be seen as untrustworthy. “The economy will be hit hard,” he added.
Encryption not ‘a binary choice’
A steering group of charities, led by Barnardo’s, the Lucy Faithfull Foundation, the Marie Collins Foundation and SafeToNet, is driving the work of the government-sponsored No Place to Hide campaign.
Carr said: “We want tech companies to make a commitment that they will not roll out E2EE without the technology in place to ensure that children will not be put at greater risk as a result.
“That means working constructively with cyber experts, children’s charities and survivors to find a solution that ensures strong user privacy without putting children at greater risk. We want them to work with us constructively, rather than positioning this as a binary choice.”
The government announced plans this week to include provisions in the draft Online Safety Bill to legally require pornography sites available in the UK to verify that their users are over 18 years of age.
Read about the debate on end-to-end encryption
- Information Commissioner criticises government-backed campaign to delay end-to-end encryption.
- Government puts Facebook under pressure to stop end-to-end encryption over child abuse risk.
- Ciaran Martin, the former UK cyber security chief, says the government must explain how it can access encrypted communications without damaging cyber security and weakening privacy.
- Barnardo’s and other charities begin a government-backed PR campaign to warn of the dangers end-to-end encryption poses to child safety. The campaign has been criticised as ‘one-sided’.
- Apple’s plan to automatically scan photos to detect child abuse would unduly risk the privacy and security of law-abiding citizens and could open up the way to surveillance, say the world’s top cryptographic experts.
- Firms working on the UK government’s Safety Tech Challenge have suggested that scanning content before encryption will help prevent the spread of child sexual abuse material – but privacy concerns remain.
- Private messaging is the front line of abuse, yet E2EE in its current form risks engineering away the ability of firms to detect and disrupt it where it is most prevalent, writes children’s charity the NSPCC.
- Proposals by European Commission to search for illegal material could mean the end of private messaging and emails, writes Patrick Breyer MEP.