Chat control: EU lawyers warn plans to scan encrypted messages for child abuse may be unlawful
Leaked legal advice warns that European ‘chat control’ proposals to require tech companies to scan private and encrypted messages for child abuse are likely to breach EU law
A proposed European law that could require communications companies, including WhatsApp, Signal and Facebook Messenger, to scan the contents of private and encrypted messages for child abuse material is likely to be annulled by the European Court of Justice, according to the European Union’s (EU) own internal legal advice.
The controversial EU law, known as “chat control”, would allow governments to serve “detection orders” on technology companies, requiring them to scan private emails and messages on private communication services for “indicators of child abuse”, in a move that critics say will undermine encrypted communications.
Technology companies have objected to similar UK proposals in the Online Safety Bill, and have warned that they would be forced to withdraw their services if regulators were given powers to require tech companies to place “backdoors” into encrypted messaging services.
The European Commission proposed in May last year to introduce mandatory requirements for all email, chat and messaging service providers, including those providing end-to-end encrypted communications, to scan messages for illegal child sexual abuse material (CSAM).
But leaked internal legal advice from the Council of the European Union has raised serious questions about the lawfulness of the planned “chat control” measures, which it says could lead to the de facto “permanent surveillance of all interpersonal communications”.
The document, written by the Council of the European Union’s legal service, and seen by Computer Weekly, points out that there is a high probability that detection orders aimed at users of phone, email, messenger and chat services would constitute “general and indiscriminate” surveillance in breach of EU privacy rights.
The council’s legal service states that the “chat control” proposals imply that technology companies would either have to abandon effective end-to-end encryption, introduce some sort of backdoor to access encrypted content, or access content before it is encrypted by installing client-side scanning (CSS) technology on users’ phones and computers.
“It appears that the generalised screening of content of communications to detect any kind of CSAM would require de facto prohibiting, weakening or otherwise circumventing cyber security measures,” the lawyers wrote.
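In broad terms, client-side scanning means checking a message or attachment on the user’s own device, against a list of identifiers for known illegal material, before the app encrypts it. The sketch below is purely illustrative of that mechanism and is not drawn from any actual proposal or product: the blocklist, function names and use of SHA-256 are all assumptions for the example (deployed systems such as PhotoDNA use perceptual hashes that survive resizing and re-encoding, not cryptographic hashes).

```python
import hashlib

# Hypothetical blocklist of hashes of known illegal images, distributed
# to the device by the service provider (illustrative values only).
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def scan_before_encrypt(payload: bytes) -> bool:
    """Return True if the plaintext payload matches the blocklist.

    In a client-side scanning design, this check runs on the user's own
    device, on the unencrypted content, before the messaging app applies
    end-to-end encryption.
    """
    digest = hashlib.sha256(payload).hexdigest()
    return digest in KNOWN_HASHES

# The app would call this on every outgoing attachment; a match would
# trigger a report to the provider or authorities.
print(scan_before_encrypt(b"test"))  # matches the example blocklist entry
```

The point critics make is visible in the sketch: the scan necessarily operates on plaintext, so the content is inspected before encryption ever protects it, which is why the legal advice treats CSS as circumventing, rather than preserving, end-to-end encryption.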
There is a serious risk that the proposals would compromise citizens’ rights to privacy and data protection under Articles 7 and 8 of the EU Charter of Fundamental Rights, by authorising the automated surveillance of all users of specific messaging services, irrespective of whether they had any link with child sexual abuse, the document states.
The EU proposal requires tech companies to install “sufficiently reliable detection technologies”, but fails to explain what would count as “sufficiently reliable” or what error rates, such as messages wrongly identified as containing illegal content, would be acceptable.
The legal advice, dated 26 April 2023, found that, according to the case law of the European Court of Justice, member states can only lawfully carry out bulk automated analysis of traffic and location data of communications services to combat serious threats to national security.
“If the screening of communications metadata was judged by the court proportionate only for the purpose of safeguarding national security, it is rather unlikely that similar screening of content of communications for the purpose of combating child abuse would be found proportionate,” the legal advice warns.
EU lawyers also warn that requirements for communications companies to introduce age verification systems “would necessarily add another layer of interference with the rights and freedoms of users”.
Age verification would have to be carried out by either mass profiling of users, biometric analysis of users’ faces or voices, or by the use of digital identification or certification systems.
Ten EU states back surveillance of end-to-end encryption
Despite the concerns raised by the European Commission’s lawyers, 10 EU countries – Belgium, Bulgaria, Cyprus, Hungary, Ireland, Italy, Latvia, Lithuania, Romania and Spain – argued in a joint position paper on 27 April 2023 that end-to-end encryption should not be excluded from the European Commission’s chat control proposal.
MEP Patrick Breyer, a member of the European Parliament’s Committee on Civil Liberties, Justice and Home Affairs (Libe), called on the EU presidency, currently held by Sweden, to remove blanket monitoring of private communications and age verification from the proposed legislation.
“The EU Council’s services now confirm in crystal clear words what other legal experts, human rights defenders, law enforcement officials, abuse victims and child protection organisations have been warning about for a long time: obliging email, messaging and chat providers to search all private messages for allegedly illegal material and report to the police destroys and violates the right to confidentiality of correspondence,” he said.
“What children really need and want is a safe and empowering design of chat services, as well as Europe-wide standards for effective prevention measures, victim support, counselling and criminal investigations,” he added.
Concern over UK encryption plans
Technology companies offering encrypted messaging services urged the UK government to make urgent changes to similar legislation going through the British Parliament in an open letter in April 2023.
WhatsApp, owned by Meta, said in a statement that the bill could force technology companies to break end-to-end encryption on private messaging services, affecting the privacy of billions of people.
The letter argued that end-to-end encryption offers one of the strongest possible defences against malicious actors and hostile states, along with persistent threats from online fraud, scams and data theft.
Separately, the National Union of Journalists warned that the Online Safety Bill risks undermining the security of confidential communications between journalists and their sources.
Read more about the debate on end-to-end encryption
- Peers hear that the UK government is being deliberately ambiguous about its plans to require technology companies to scan the content of encrypted messages.
- CEO of encrypted messaging service Element says Online Safety Bill could pose a risk to the encrypted comms systems used by Ukraine.
- Tech companies and NGOs urge rewrite of Online Safety Bill to protect encrypted comms.
- Protecting children by scanning encrypted messages is ‘magical thinking’, says Cambridge professor.
- Proposals for scanning encrypted messages should be cut from Online Safety Bill, say researchers.
- GCHQ experts back scanning of encrypted phone messages to fight child abuse.
- Tech companies face pressure over end-to-end encryption in Online Safety Bill.
- EU plans to police child abuse raise fresh fears over encryption and privacy rights.
- John Carr, a child safety campaigner backing a government-funded campaign on the dangers of end-to-end encryption to children, says tech companies have no choice but to act.
- Information commissioner criticises government-backed campaign to delay end-to-end encryption.
- Government puts Facebook under pressure to stop end-to-end encryption over child abuse risk.
- Former UK cyber security chief says UK government must explain how it can access encrypted communications without damaging cyber security and weakening privacy.
- Barnardo’s and other charities begin a government-backed PR campaign to warn of dangers end-to-end encryption poses to child safety. The campaign has been criticised as ‘one-sided’.
- Apple’s plan to automatically scan photos to detect child abuse would unduly risk the privacy and security of law-abiding citizens and could open up the way to surveillance, say cryptographic experts.
- Firms working on UK government’s Safety Tech Challenge suggest scanning content before encryption will help prevent the spread of child sexual abuse material – but privacy concerns remain.
- Private messaging is the front line of abuse, yet E2EE in its current form risks engineering away the ability of firms to detect and disrupt it where it is most prevalent, claims NSPCC.
- Proposals by European Commission to search for illegal material could mean the end of private messaging and emails, says MEP.