
European ‘chat control’ plans in the name of ‘child safety’ threaten end-to-end encryption

Proposals by the European Commission to search for illegal material could mean the end of private messaging and email

Are our online communications secure? Who has access to them? Science fiction movies often explore a reality in which some kind of Big Brother system monitors our every action. However, this does not realistically depict the present – not yet, anyway.

Messenger services that encrypt our texts, images and videos from sender to recipient – end-to-end encryption – ensure that their content cannot be intercepted in transit. The keys needed to make sense of the encrypted data are stored only on the devices communicating with each other, so only the intended recipients can see the content. This sets aside revealing metadata, as well as the option of hacking users’ devices or accessing cloud storage that may contain decrypted messages.
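The principle can be illustrated with a deliberately simplified sketch – a toy XOR one-time-pad cipher, nothing like the real protocols used by Signal or WhatsApp. The point it shows is structural: the shared key exists only on the two devices, so a server relaying the ciphertext cannot read it.

```python
import os

def xor_cipher(key: bytes, data: bytes) -> bytes:
    # XOR each byte of the data with the key; applying the same
    # operation twice restores the original message
    return bytes(k ^ d for k, d in zip(key, data))

# The key lives only on the sender's and recipient's devices,
# never on the server in between
shared_key = os.urandom(64)

message = b"meet at noon"
ciphertext = xor_cipher(shared_key, message)  # all the server ever sees

# Only someone holding the key can recover the message
assert xor_cipher(shared_key, ciphertext) == message
```

Real end-to-end encryption uses authenticated key exchange and ratcheting rather than a pre-shared pad, but the property is the same: without the key held on the endpoints, the relayed data is opaque.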

But the European Commission (EC) is currently preparing legislation intended to curb the spread of child sexual exploitation material. The legislation – dubbed “chat control” – would mandate the automatic searching of every citizen’s personal emails and messages for presumed suspect content in the hunt for child pornography. Suspected cases would be reported to the police.

So far, only US communications services such as Facebook Messenger, Google Gmail and Microsoft Outlook voluntarily carry out such general monitoring. According to the Swiss federal police, the unreliable processes involved mean that in the vast majority of cases (86%), innocent citizens come under suspicion of having committed an offence. For example, harmless family beach photos or consensual sexting may be reported.

Still, the EC is looking into client-side scanning (CSS) as a possible method to screen even end-to-end encrypted messages for suspicious content. This would require the messaging app (WhatsApp or Signal, for example) to create a hash value (digital fingerprint) of the content to be sent, which would then be compared against a database of allegedly illegal content. If the algorithm reports a hit, the message would not be sent and would be reported to law enforcement authorities.
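The matching step described above can be sketched as follows. This is a hedged illustration, not any provider’s actual implementation: the database contents are placeholders, and it uses an exact cryptographic hash (SHA-256) where real deployments use perceptual hashes designed to survive resizing and recompression.

```python
import hashlib

# Hypothetical database of fingerprints of known illegal files
# (placeholder byte strings stand in for real image data)
hash_database = {hashlib.sha256(b"known-illegal-image").hexdigest()}

def client_side_scan(attachment: bytes) -> bool:
    """Return True if the attachment's fingerprint matches the database,
    in which case the message would be blocked and reported."""
    return hashlib.sha256(attachment).hexdigest() in hash_database

assert client_side_scan(b"known-illegal-image") is True
assert client_side_scan(b"family-beach-photo") is False
```

Note the trade-off: an exact hash like SHA-256 misses any trivially altered copy, so real systems (Microsoft’s PhotoDNA, for instance) use fuzzier perceptual matching – and it is precisely that fuzziness which lets innocent content produce false matches.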

Client-side scanning

IT security experts warn against client-side scanning for several reasons.

First, to check whether the content flagged by the algorithm is actually prohibited, a manual review would need to take place. This requires a technical feature allowing third parties to check the content of the normally encrypted communications – a backdoor.

Backdoors fundamentally jeopardise the security of end-to-end encryption, due to external adversaries such as intelligence services or criminal hackers being able to find and abuse these vulnerabilities. Nor is private correspondence – especially nude images – safe in the hands of the provider or the authorities, as reports of misuse of intimate data by US agencies as well as big tech companies have demonstrated.

Most importantly, the scope of client-side scanning methods can easily be extended to screen correspondence for other purposes, such as targeted advertising, the sharing of legitimate content or blocking of political communications.

Bearing in mind that no judicial order would be required to authorise the monitoring, the privacy of citizens would be in the hands of closed source algorithms and hash databases controlled by global tech companies. Whoever controls the database of hashes would be able to intercept any content of interest.

False positives

The scope of the proposed chat control legislation has already been extended beyond pictures and videos to search text messages for possible “child grooming” attempts. We can expect the rate of false positives to be staggering.
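Base rates alone explain why. With purely illustrative numbers (assumptions for this sketch, not the Swiss figures): if one message in 10,000 actually contains illegal material, and a scanner catches 90% of those while wrongly flagging just 0.1% of innocent messages, most flagged messages are still innocent.

```python
# All three figures are illustrative assumptions, not official statistics
prevalence = 1 / 10_000      # share of messages that are actually illegal
true_positive_rate = 0.90    # detection rate on genuinely illegal content
false_positive_rate = 0.001  # 1 innocent message in 1,000 wrongly flagged

flagged_guilty = prevalence * true_positive_rate
flagged_innocent = (1 - prevalence) * false_positive_rate
share_innocent = flagged_innocent / (flagged_guilty + flagged_innocent)

print(f"{share_innocent:.0%} of flagged messages are innocent")  # prints "92% ..."
```

However accurate the classifier, scanning billions of overwhelmingly innocent messages guarantees that the flagged pile is dominated by false alarms.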

A former judge of the European Court of Justice, Ninon Colneric, concluded that the proposed chat control legislation, by indiscriminately and permanently screening all private communications, would violate the fundamental rights of EU citizens. Yet the European Commission is determined to propose such legislation.

If the EU mandates backdoors in end-to-end encrypted messaging clients to scan for suspicious content, it is only a small further step to mandate such backdoors for law enforcement interception. This would break end-to-end encryption altogether and expose personal, business and state secrets to foreign intelligence services and hackers.

Destroying safe communications channels would endanger whistleblowers and risk the lives of dissidents under authoritarian rule, in places such as Hong Kong and Belarus. It is no surprise, therefore, that the “Five Eyes” intelligence alliance advocates chat control legislation with a view to undermining encryption.

Targeted police work

Protecting children is undoubtedly a pressing issue. However, this cannot and should not be done by sacrificing the secrecy of electronic communications. In fact, indiscriminate monitoring destroys safe spaces for abuse victims to receive counselling and disproportionately targets minors themselves, while serious criminals just continue to use self-operated end-to-end encrypted systems.

The shutdown of the “Boystown” child porn platform earlier this month demonstrated that targeted police work is key to prosecuting the organised structures behind this horrendous crime.

Instead of unleashing an unprecedented mass surveillance system on all of us, children need to be better protected by educating the public, offering more therapy and support, and by reducing the backlogs of criminal investigators.

