Privacy will be under unprecedented attack in 2026

The UK and Europe are ramping up opposition to encryption and stepping up surveillance of private communications. Here is what to expect this year

The privacy of electronic communications will face new risks in 2026, as the UK and other governments push for greater powers to harvest and analyse data on private citizens, and seek to make it harder to protect communications with end-to-end encryption.

Over the next 12 months, we can expect more pressure from the UK and Europe to restrict the unencumbered use of end-to-end encrypted email and messaging services such as Signal, WhatsApp and many others.

In the 1990s, the US government tried and ultimately failed to persuade telecommunications companies to install a device known as the Clipper chip to provide the US National Security Agency (NSA) with “backdoor” access to voice and data communications.

The crypto wars of 2026 are subtler: governments, law enforcement agencies and intelligence services are pushing controls and restrictions on encryption as a means of detecting child sexual abuse and terrorist material circulated through encrypted email and messaging systems.

The answer governments are settling on is scanning technology – voluntary or compulsory – that identifies problematic content on the user’s device before it is encrypted, an approach known as client-side scanning.
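To make that concrete, here is a deliberately simplified sketch in Python of where such a check would sit in the message flow. It is a hypothetical illustration rather than any vendor’s design: the blocklist, names and stubbed encryption are all invented, and real proposals rely on perceptual rather than exact hashes.

```python
import hashlib

# Hypothetical sketch of client-side scanning: content is checked
# against a blocklist of fingerprints of known prohibited material
# *before* encryption, which is why the check must run on the user's
# device while the message is still plaintext. The blocklist and the
# stubbed "encryption" are invented for illustration.

BLOCKLIST = {hashlib.sha256(b"known prohibited image bytes").hexdigest()}

def fingerprint(data: bytes) -> str:
    # An exact cryptographic hash is used here for simplicity; deployed
    # schemes use perceptual hashes so re-encoded copies also match.
    return hashlib.sha256(data).hexdigest()

def send(plaintext: bytes) -> bytes | None:
    if fingerprint(plaintext) in BLOCKLIST:
        return None  # flagged and blocked before encryption ever runs
    return plaintext[::-1]  # stand-in for real end-to-end encryption

print(send(b"hello"))                         # passes the check, "encrypted"
print(send(b"known prohibited image bytes"))  # blocked pre-encryption
```

The structural objection is visible in the sketch: the scanning hook sits on the device ahead of encryption, so whoever controls the blocklist controls what is intercepted, however strong the encryption that follows.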

Cryptographers and computer scientists have repeatedly warned that such plans will create security vulnerabilities that will leave the public less safe than before.

Chat Control and client-side scanning

The European Parliament and Council are expected to adopt the controversial Child Sexual Abuse Regulation (CSAR) in spring 2026. In its current form, it proposes that messaging platforms voluntarily scan private communications for offending content, combined with proposals for age verification of users.

Critics of the regulation, nicknamed Chat Control – among them former MEP Patrick Breyer, a jurist and digital rights activist – claim it will open the door to “warrantless and error-prone” mass surveillance of European Union (EU) citizens by US technology companies. The scanning algorithms, critics say, are notoriously unreliable, potentially exposing tens of thousands of legal private chats to police scrutiny.

Chat Control will also put pressure on technology companies to introduce age checks to help them “reliably identify minors”, a move that would likely require every citizen to upload an ID or take a face scan to open an account on an email or messaging service. According to Breyer, this creates a de facto ban on anonymous communication, putting whistleblowers, journalists and political activists who rely on anonymity at risk.

Online Safety Act

In the UK, there remain concerns about provisions in the Online Safety Act that, if implemented by regulator Ofcom, would require technology companies to scan encrypted messages and emails.

These powers attracted widespread criticism from technology companies as the bill passed into law, with Signal warning it would pull its encrypted messaging service from the UK if it was forced to introduce what it called a “backdoor”.

Commentators think there is little current appetite for Ofcom to mandate client-side scanning for private communications, given the level of opposition.

But it may require providers of public and semi-public services, such as cloud storage, to introduce scanning services to detect illegal content.

“I think they may be waiting to see what happens in Europe with the Chat Control proposal, because it’s quite hard for the UK to go alone,” James Baker, campaigner at the Open Rights Group, told Computer Weekly.

Perceptual hash matching

One of the items on Ofcom’s agenda is a form of scanning known as perceptual hash matching, which uses an algorithm to decide whether images or videos are similar to known child abuse or terrorism-related images.

A consultation document from Ofcom proposes requiring tech platforms that allow users to upload or share photographs, images and videos – including file storage and sharing services, and social media companies – to introduce the technology for detecting terrorism and abuse-related material.

“We also think some services should go further – assessing the role that automated tools can play in detecting a wider range of content, including child abuse material, fraudulent content, and content promoting suicide and self-harm, and implementing new technology where it is available and effective,” Ofcom says in the document.

But there are questions about the accuracy of perceptual hash matching, and the risk that its use may lead to people wrongly being barred from online services for alleged crimes they have not committed.

Critics point out that perceptual hash matching used to be called “fuzzy matching” – and for good reason. Although its new name gives an impression of precision and predictability, in reality it produces both false positives and false negatives.
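To see why, consider a toy difference hash of the kind that deployed systems such as Microsoft’s PhotoDNA or Meta’s PDQ elaborate on. In the simplified Python sketch below – with invented pixel values and threshold – each image is reduced to a short string of brighter-or-darker bits, and two images “match” when the Hamming distance between their bit strings falls under a threshold.

```python
# Toy perceptual ("fuzzy") hash matching. Each image is reduced to a
# 1x8 strip of brightness values; adjacent pixels are compared to give
# a 7-bit hash, and images "match" when few bits differ. All pixel
# values and the threshold are invented for illustration.

def dhash(pixels: list[int]) -> int:
    """Pack 'is the next pixel brighter than this one?' bits into an int."""
    bits = 0
    for left, right in zip(pixels, pixels[1:]):
        bits = (bits << 1) | (1 if right > left else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Count the bits on which two hashes differ."""
    return bin(a ^ b).count("1")

known = [10, 40, 35, 90, 20, 60, 55, 80]  # fingerprint of a "known" image

candidates = {
    "re-encoded copy":            [12, 41, 33, 88, 22, 61, 54, 79],
    "unrelated, similar pattern": [200, 230, 5, 250, 100, 180, 90, 150],
    "unrelated image":            [90, 50, 60, 20, 80, 40, 70, 10],
}

THRESHOLD = 1  # bit flips tolerated before a match is declared

for name, img in candidates.items():
    d = hamming(dhash(known), dhash(img))
    print(f"{name}: distance {d} -> {'MATCH' if d <= THRESHOLD else 'no match'}")
```

Run it and the re-encoded copy matches, as intended – but so does the unrelated image that happens to share the same light-and-dark pattern, a false positive. Tuning the threshold only trades the errors off against each other: lower values miss more altered copies, higher values sweep in more innocent look-alikes.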

Hundreds of people have been blocked from Instagram, owned by Meta, after being wrongly accused of breaching Meta’s policies on child sexual exploitation and abuse. The company’s actions took a huge emotional toll on the people affected, and in some cases led to people losing their online businesses, the BBC reported in October 2025.

Alec Muffett, security expert and former Facebook engineer, told Computer Weekly that Ofcom’s proposals display “a horrifying lack of safety by design” and said its proposal to force companies to adopt the technology without mitigating the potential risks is “derelict”.

“Perceptual hashing is just a fancy name for what we used to call ‘fuzzy matching’ with ‘digital fingerprints’, and even if we ignore the problem of false positives, we are left with the risk of creating an enormous cloud surveillance engine by logging all queries for even benign digital fingerprints,” he said.

Encryption apps viewed as national security risk

There are signs of increasing government discomfort with encrypted communications. In December 2025, the Independent Reviewer of State Threats Legislation delivered a stark warning that developers of encryption technology could be subject to police stops, detention and questioning, and the seizure of their electronic devices under national security laws.

According to Jonathan Hall KC, the developer of an app whose selling point is that it offers end-to-end encryption could be considered to be unwittingly engaged in “hostile activity” under Section 3 of the Counter-Terrorism and Border Security Act 2019.

“It is a reasonable assumption that [the development of the app] would be in the interests of a foreign state even if the foreign state has never contemplated this potential advantage,” he wrote.

Digital ID all over again

The UK’s proposals for a mandatory digital ID scheme look set to be another battleground for privacy in 2026. The government says the scheme will help to crack down on illegal immigration by introducing mandatory “right to work” checks by the end of the Parliamentary term.

MPs were scathing when the bill was introduced in Parliament. “The real fear here is that we will be building an infrastructure that can follow us, link our most sensitive information and expand state control over all our lives,” said Rebecca Long-Bailey during the debate. Others raised concerns about the cyber security risks of storing details of the population on a central government database.

Gus Hosein, executive director of campaign group Privacy International, notes that the Home Office is repeating the same arguments originally put forward in 2003 when Tony Blair attempted to introduce a national identity card. The scheme was scrapped by the Conservative and Liberal Democrat coalition in 2010. “It’s just the same boring rhetoric: ‘It’s going to stop ID fraud, it’s going to stop terrorism, it’s going to stop migration problems,’” he said. “Do we really have to go through the whole process of debunking this again?”

Hosein said the prospects of the Home Office coming up with a workable system before the next election are low. The political climate is different this time. Nearly three million people have signed a Parliamentary petition calling for the idea to be scrapped. “If they try and do the classic thing, which is to try and build something grand and momentous, it will take forever,” he said. “I would not mind an ID system that actually worked, I just don’t want the Home Office within 10,000 miles of it.”

When combined with facial recognition, digital ID raises further privacy issues. Campaign groups are expected to bring a legal challenge in 2026 after Freedom of Information Act requests revealed that the government covertly allowed police forces to search 150 million photos in UK passport and immigration databases for matches against images captured by facial recognition technology.

Big Brother Watch and Privacy International have issued legal letters before action to the Home Office and the Metropolitan Police. They argue that there is no clear legal basis for the practice and that the Home Office has kept the public and Parliament in the dark.

“There is a risk when you roll out digital facial recognition cameras that the images used for digital ID will be used to track you around town centres,” said the Open Rights Group’s Baker.

Apple backdoors and technical capability notices

This year will see further legal challenges at the Investigatory Powers Tribunal (IPT) against the Home Office’s secret order issued against Apple, requiring it to facilitate access for law enforcement and intelligence agencies to encrypted data stored by Apple’s customers on iCloud.

Scheduled for the spring, the case brought by Privacy International and Liberty will challenge the lawfulness of the Home Office using a technical capability notice (TCN) to require Apple to disclose the encrypted data of users of its Advanced Data Protection (ADP) service worldwide.

Apple is expected to issue a new legal challenge after the UK government abandoned its original wide-ranging TCN and replaced it with a narrower order covering only ADP users in the UK – a move that ended Apple’s original challenge, at least for now.

The case has the potential to turn into a mammoth battle, reaching the Supreme Court and the European Court of Human Rights.

Surveillance of journalists

This year will also see further legal challenges that will test the boundaries between state intrusion and the professional privileges accorded to lawyers and journalists to protect the confidentiality of their clients or journalistic information.

The Investigatory Powers Tribunal is due to decide on a case brought by the BBC and former BBC journalist Vincent Kearney against the Police Service of Northern Ireland and the Security Service, MI5.

The Security Service broke with the convention of Neither Confirm Nor Deny (NCND) to acknowledge to the tribunal that it had unlawfully obtained Kearney’s phone communications data in 2006 and 2009, while he was working at the BBC, in an attempt to identify his confidential sources.

Although MI5 followed the Communications Data code of practice in force at the time, the code did not meet the strict legal tests for accessing journalistic material, which is protected under the European Convention on Human Rights.

In a judgment just before Christmas, the IPT rejected arguments that MI5 should disclose further details of surveillance operations against Kearney and other BBC journalists, including operations that had proper legal approval. The IPT will decide what remedy is due in 2026, and whether Kearney and the BBC should receive compensation.

Another legal case will test the boundary between police surveillance and the legal protection that allows lawyers to keep discussions with their clients confidential when subject to police stops.

Fahad Ansari, a lawyer who acted for Hamas in an attempt to overturn its proscription as a terrorist organisation in the UK, had his mobile phone seized by police after he was detained under Schedule 7 of the Terrorism Act 2000 at a ferry port on returning from a family holiday.

The case is believed to be the first targeted use of Schedule 7 powers – which allow police to stop and question people and seize their electronic devices without the need for suspicion – against a practising solicitor.

Ansari is seeking a judicial review to challenge the right of police to examine the contents of his phone, which contains confidential and legally privileged material from his clients, accumulated over 15 years.

The legal fallout from EncroChat and Sky ECC

The legal fallout from international police operations to hack the encrypted phone networks Sky ECC and EncroChat more than five years ago will continue.

French police led operations to harvest tens of millions of encrypted messages, which were used as evidence of criminality to bring prosecutions against drug gangs across Europe and the UK.

Defence lawyers and forensic experts have raised questions about the reliability of the evidence supplied by the French to the UK and EU states through Europol.

France has declared the hacking operation against EncroChat and Sky ECC a state secret and refused to allow members of the French Gendarmerie to give evidence on how the intercepted data was obtained.

This has meant that individuals facing charges outside France based on evidence from EncroChat or Sky ECC have no legal recourse to challenge the legality of the French hacking operations.

Courts in the EU are obliged to accept the evidence provided by France under the “mutual recognition” principle that applies when one EU state supplies evidence to another under a European Investigation Order.

At the same time, defendants have been denied the right to challenge the evidence against them in the French courts, leaving those charged with offences based on the hacked phone data without recourse to appeal in any jurisdiction.

Decisions by the European Court of Justice and the European Court of Human Rights, expected this year, could end that anomaly.

In one case, the French Supreme Court – La Cour de cassation – has asked the Court of Justice to decide whether France’s refusal to allow non-French citizens to challenge the lawfulness of the French hacking operations in France contravenes EU law. According to La Cour de cassation, the decision is likely to have “significant consequences” for legal proceedings based on intercepted evidence in the EU.

In the second case, the European Court of Human Rights is expected to decide on a complaint from a German citizen, Murat Silgar, who was jailed for drug offences on the basis of EncroChat evidence.

Silgar argues that the German courts used illegally obtained communications data, and that technical details of the French retrieval of EncroChat data were not shared with him, in breach of the European Convention on Human Rights, which protects the right to a fair trial and the right to private correspondence.

Justus Reisinger, a member of a coalition of defence lawyers known as the Joint Defence Team, told Computer Weekly the cases would address “a fundamental principle” in cross-border and digital investigations. “The law of the European Union requires that people have an effective remedy,” he said.

These are just a few of the battle lines between technology and privacy that will play out in 2026. For governments, the promise of a “technical fix” to deal with wider societal problems, such as child abuse and terrorism offences, is attractive. But history has shown that “technical fixes” rarely work, and often have unforeseen consequences.
