
Chat Control: EU to decide on requirement for tech firms to scan encrypted messages
Law enforcement experts and policymakers meet on Friday to decide on proposals to require technology companies to scan encrypted messages for possible child abuse images, amid growing opposition from security experts
Law enforcement experts and policymakers are due to meet on 12 September to decide on proposals to require technology companies, such as Signal and WhatsApp, to scan all encrypted messages and communications before they are transmitted.
The Danish presidency of the EU Council is pushing for a vote by 14 October on the proposals, dubbed “Chat Control”, which advocate mass scanning of mobile phones and computers to identify suspected child abuse material sent through encrypted communications services used by the public.
More than 500 cryptographers and security researchers signed an open letter on 9 September, warning that the proposals are technically unfeasible and would “completely undermine” the security and privacy of all European citizens by creating vulnerabilities that could be exploited by hackers and hostile nation states.
The encrypted messaging service WhatsApp is among the technology companies to have raised concerns about the European Union’s (EU) draft proposals.
“The latest proposal from the presidency of the Council of the EU breaks end-to-end encryption and puts everyone’s privacy, freedom and digital security at risk,” a spokesperson told Computer Weekly.
Denmark’s compromise
The European Commission first put forward proposals to mandate tech companies to scan emails and messages for potential child abuse content in 2022, but the plans were put on hold after they were blocked by a minority of member states amid concerns the proposals would damage the security and privacy of EU citizens.
The Danish presidency proposed a compromise in July 2025, which sought to strike a balance between maintaining the security of encrypted communications services and identifying potentially illegal content.
The Danish draft asserts that nothing in the proposed regulation should be “interpreted as prohibiting, weakening or circumventing” encryption, and expressly permits technology companies to continue to offer end-to-end encrypted services.
But it also requires technology companies to introduce “vetted technologies” on phones and computers to scan messages for images, videos or URLs that could be associated with known child abuse content before they are encrypted and transmitted.
Tech companies will also be required to deploy artificial intelligence (AI) and machine learning algorithms to detect previously unknown abuse images.
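The “vetted technologies” described here amount to client-side scanning: checking content against a database of known material on the device, before encryption. The sketch below is purely illustrative and uses an exact cryptographic hash with a hypothetical blocklist; real proposals envisage perceptual hashing, which tolerates resizing and re-encoding, where exact hashes do not.

```python
import hashlib

# Hypothetical blocklist of hashes of known illegal images, distributed
# to the device by a central authority (illustrative values only).
KNOWN_HASHES = {
    # sha256(b"test"), standing in for a real perceptual-hash entry
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def scan_before_encrypt(attachment: bytes) -> bool:
    """Return True if the attachment matches the blocklist.

    A real system would use a perceptual hash so that re-encoded or
    resized copies of an image still match; an exact cryptographic
    hash, as here, matches only byte-identical files.
    """
    digest = hashlib.sha256(attachment).hexdigest()
    return digest in KNOWN_HASHES

# The client would run this check before encrypting and transmitting:
flagged = scan_before_encrypt(b"test")
```

The security objection in the article follows directly from this structure: the blocklist and matching logic live on the user's device, so whoever controls the database controls what is flagged, and the mechanism itself becomes a target for tampering.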
As of 10 September, some 15 member states supported the Danish proposals, with six member states undecided and six in opposition.
Dissenters include Belgium, Poland, Finland and the Czech Republic, which have raised concerns about the mass surveillance of citizens’ communications.
Supporters include France, Italy, Spain and Sweden, while Germany is as yet undecided. Under the Council’s qualified majority rules, votes are weighted by population, giving larger countries more sway over the final decision.
What does Denmark’s compromise agreement say about encryption?
- Publicly available messaging services using end-to-end encryption will be required to detect abuse material before it is transmitted.
- Providers should remain free to offer services using end-to-end encryption and should not be obliged to decrypt data or create access to end-to-end encrypted data.
- Users of encrypted services will be asked to consent to have images, videos and URLs they send through an end-to-end encrypted service monitored.
- Users who do not consent may be able to send messages that do not include images, videos or URLs using other functions of the messaging service.
- Detection technologies for end-to-end encrypted services will be certified and tested by an EU centre to verify that their use cannot lead to a weakening of the protection provided by encryption.
- The EU Commission will have powers to approve detection technologies.
- Providers of detection services should have human oversight to reduce false positives and false negatives.
- Detection technologies must not “introduce cyber security risks for which it is not possible to take any effective measures to mitigate such risk”.
Source: Draft proposal
Suspicionless mass surveillance
Opponents claim that Chat Control effectively introduces “suspicionless” mass surveillance of hundreds of millions of Europeans.
In their open letter this week, cryptographers and security researchers warned that on-device detection, also known as client-side scanning, “inherently undermines the protections” of end-to-end encryption without any guarantee that it would improve protection for children.
The detection mechanism would become a high-value target for hackers and hostile nation states, which could reconfigure it to target other types of data, such as people’s financial or political interests, they said.
It would also undermine the security of encrypted messaging apps, such as WhatsApp and Signal, which are used by politicians, journalists, human rights workers, EU civil servants and law enforcement officers, as well as ordinary citizens, the letter stated.
The new proposals “unequivocally violate” the principles of end-to-end encryption and will weaken its protection, “threatening the public’s right to privacy,” the scientists warned, arguing there could be potentially serious consequences for democracy and national security.
Once introduced, scanning technology could be repurposed by less democratic regimes to monitor dissidents and opponents, or to censor communications, the security researchers claimed.
“The new proposals, similar to its predecessors, will create unprecedented capabilities for surveillance, control and censorship, and have an inherent risk for function creep by less democratic regimes,” they added.
Risk of people being wrongly targeted
The Danish proposals could put large numbers of innocent people at risk of investigation for sending images wrongly identified as suspicious, warned the security researchers, who are drawn from 30 countries.
“Existing research confirms that state-of-the-art detectors would yield unacceptably high false positive and false negative rates, making them unsuitable for large-scale detection campaigns at the scale of hundreds of millions of users,” the letter stated.
Proposals for Chat Control to use AI and machine learning to identify unknown abuse images are also flawed, the scientists claimed, as “there is no known machine-learning algorithm that can identify illegal images without making large numbers of mistakes”.
Encrypted messaging services react
German encrypted email provider Tuta Mail said that if the EU’s Chat Control proposals are adopted, it would take legal action against the EU rather than betray its users by introducing backdoors into its encrypted messaging service.
CEO Matthias Pfau said the proposals would undermine trust in European technology. “By forcing providers to break encryption and enable mass surveillance, the EU would kill trust in European products and drive users to foreign tech giants,” he added.
Alexander Linton, president of the Session Technology Foundation, the organisation behind the encrypted messaging app Session, said it was not possible to introduce scanning without creating new security risks.
The Danish proposal states that scanning technologies that introduce security risks that cannot be mitigated should not be used, but Linton said this was not technically possible.
“None of the technologies available achieve this standard – all client-side scanning technologies introduce new unmitigable risks,” he added.
Backdoors could be used by bad actors
Matthew Hodgson, CEO of Element, a secure communications platform used by European governments, said the proposed Chat Control regulation was fundamentally flawed and would put the privacy and data of 450 million citizens at risk.
“Undermining encryption by introducing a backdoor for lawful intercept is nothing other than deliberately introducing a vulnerability, and they always get exploited in the end,” he added.
A years-long Chinese hacking operation, dubbed Salt Typhoon, used law enforcement backdoors in the US public telephone network to access call records and unencrypted communications of US citizens.
“The US is still urging its citizens into end-to-end encrypted systems as a result,” Hodgson told Computer Weekly.
Signal warned last year that it would pull its messaging service out of the European Union rather than undermine its privacy guarantees.
Callum Voge, director for government affairs and advocacy at the Internet Society, a non-profit organisation, said client-side scanning created opportunities for bad actors to reverse engineer and corrupt scanning databases on devices.
“If breaking encryption is like having the envelope ripped open while a letter goes through the Post Office, client-side scanning would be like someone reading over your shoulder as you write the letter,” he told Computer Weekly.
He said that even if AI scanning were 99.5% effective at identifying abuse, it would lead to billions of wrong identifications every day.
“That is a huge number that could overwhelm the system, but also lead to innocent people incorrectly being labelled as sharing illegal child abuse material,” he added.
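The scale problem Voge describes can be shown with back-of-the-envelope arithmetic. The daily message volume and the error rate below are assumptions chosen for illustration, not figures from the article:

```python
# Assumed: EU messaging services carry on the order of hundreds of
# billions of image/video/URL messages per day (illustrative figure).
daily_messages = 400_000_000_000

# Assumed: the scanner wrongly flags 0.5% of innocent content,
# i.e. it is "99.5% effective" on innocent messages.
false_positive_rate = 0.005

false_positives_per_day = daily_messages * false_positive_rate
print(f"{false_positives_per_day:,.0f} wrong identifications per day")
# With these assumptions: 2,000,000,000 wrong identifications per day
```

Because almost all scanned traffic is innocent, even a small error rate applied to such volumes yields billions of false flags, each one pointing at a person who did nothing wrong.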
No ‘technical fix’
The scientists argue that, rather than relying on a “technical fix”, governments should invest in education, reporting hotlines and other proven techniques for tackling abuse.
Voge told Computer Weekly that policymakers should prioritise approaches that protect children but also foster the open and trusted internet.
“That means more resources spent on targeted approaches – things like court-authorised investigations, metadata analysis, cross-border cooperation, support for victims, prevention and media literacy training,” he added.
Apple dropped its own plans to introduce client-side scanning to detect child abuse on the iPhone after leading security researchers published a paper finding that the approach would not be effective against crime and would not protect against surveillance.
Cryptowars: Read more about the debate on encryption
- Crime agency criticises Meta as European police chiefs call for curbs on end-to-end encryption.
- Ofcom will consult on standards to enforce new powers, but tech companies remain concerned about the impact of the bill’s ‘spy clause’, which could require them to scan encrypted messages.
- Technology companies say reassurances by government ministers that they have no intention of weakening end-to-end encrypted communication services do not go far enough.
- BCS, The Chartered Institute for IT, argues the government is seeking a technical fix to terrorism and child abuse without understanding the risks and implications.
- Government boosts protection for encryption in Online Safety Bill but civil society groups remain concerned.
- CEO of encrypted messaging service Element says Online Safety Bill could pose a risk to the encrypted comms systems used by Ukraine.
- Tech companies and NGOs urge rewrite of Online Safety Bill to protect encrypted comms.
- Protecting children by scanning encrypted messages is ‘magical thinking’, says Cambridge professor.
- Proposals for scanning encrypted messages should be cut from Online Safety Bill, say researchers.
- GCHQ experts back scanning of encrypted phone messages to fight child abuse.
- Tech companies face pressure over end-to-end encryption in Online Safety Bill.
- EU plans to police child abuse raise fresh fears over encryption and privacy rights.
- IT professionals wary of government campaign to limit end-to-end encryption.
- John Carr, a child safety campaigner backing a government-funded campaign on the dangers of end-to-end encryption to children, says tech companies have no choice but to act.
- Information commissioner criticises government-backed campaign to delay end-to-end encryption.
- Government puts Facebook under pressure to stop end-to-end encryption over child abuse risk.
- Former UK cyber security chief says UK government must explain how it can access encrypted communications without damaging cyber security and weakening privacy.
- Barnardo’s and other charities begin a government-backed PR campaign to warn of dangers end-to-end encryption poses to child safety. The campaign has been criticised as ‘one-sided’.
- Apple’s plan to automatically scan photos to detect child abuse would unduly risk the privacy and security of law-abiding citizens and could open up the way to surveillance, say cryptographic experts.
- Firms working on UK government’s Safety Tech Challenge suggest scanning content before encryption will help prevent the spread of child sexual abuse material – but privacy concerns remain.
- Private messaging is the front line of abuse, yet E2EE in its current form risks engineering away the ability of firms to detect and disrupt it where it is most prevalent, claims NSPCC.
- Proposals by European Commission to search for illegal material could mean the end of private messaging and emails, says MEP.
Read more on IT risk management
- Europol seeks evidence of encryption on crime enforcement as it steps up pressure on Big Tech
- Crime agency criticises Meta as European police chiefs call for curbs on end-to-end encryption
- Chat control: Tech companies warn ministers over EU encryption plans
- Parliament passes sweeping Online Safety Bill but tech companies still concerned over encryption