
What the EU’s content-filtering rules could mean for UK tech

EU proposals to clamp down on child sexual abuse material will have a material impact on the UK’s technology sector

On 11 May 2022, the European Commission released a proposal for a regulation laying down rules to prevent and combat child sexual abuse. The regulation would establish preventative measures against child sexual abuse material (CSAM) being distributed online.

Although the UK is no longer part of the European Union (EU), any UK companies wishing to operate within the world’s largest trading bloc will need to abide by EU standards. As such, this regulation would have an enormous impact on online communications services and platforms in the UK and around the world.

Some online platforms already detect, report and remove online CSAM. However, such measures vary between providers and the EU has decided that voluntary action alone is insufficient. Some EU member states have proposed or adopted their own legislation to tackle online CSAM, but this could fragment the EU’s vision of a united Digital Single Market.

This is not the first time that content scanning has been attempted. In 2021, Apple proposed scanning owners’ devices for CSAM using client-side scanning (CSS). This would allow CSAM filtering to be conducted without breaching end-to-end encryption. However, the backlash against this proposal led to the idea being postponed indefinitely.
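At a technical level, client-side scanning typically means fingerprinting content on the device and comparing it against a database of known-abuse fingerprints before a message is encrypted and sent. The sketch below illustrates the idea; it is a deliberately simplified, assumption-based example that uses a plain SHA-256 digest, whereas real systems such as PhotoDNA or Apple’s NeuralHash use perceptual hashes that still match after resizing or re-encoding.

```python
import hashlib
from pathlib import Path

# Hypothetical set of fingerprints of known illegal images, pushed to the
# device by the provider. Real deployments use perceptual hashes that
# survive re-encoding; SHA-256 is used here only to keep the sketch
# self-contained, and matches byte-identical files only.
KNOWN_FINGERPRINTS = {
    "0" * 64,  # placeholder digest, not a real indicator
}

def fingerprint(path: Path) -> str:
    """Return a hex fingerprint of the file's raw bytes."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def scan_before_send(path: Path) -> bool:
    """Client-side check run before the file is encrypted and uploaded.

    Returns True if the file matches a known fingerprint, in which case
    the client would block the upload and generate a report instead.
    """
    return fingerprint(path) in KNOWN_FINGERPRINTS
```

Because the check happens before encryption, the provider never needs to decrypt messages in transit, which is why proponents argue CSS is compatible with end-to-end encryption and critics argue it simply moves the surveillance onto the device.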

At its core, the EU regulation will require “relevant information society services” to enact the following measures (Article 1):

  • Minimise the risk that their services are misused for online child sexual abuse.
  • Detect and report online child sexual abuse.
  • Remove or disable access to child sexual abuse material on their services.

Article 2 describes “relevant information society services” as any of the following:

  • Online hosting service – a hosting service that consists of the storage of information provided by, and at the request of, a recipient of the service.
  • Interpersonal communications service – a service that enables direct interpersonal and interactive exchange of information via electronic communications networks between a finite number of persons, whereby the persons initiating or participating in the communication determine its recipient(s), including those provided as an ancillary feature that is intrinsically linked to another service.
  • Software application stores – online intermediation services, which are focused on software applications as the intermediated product or service.
  • Internet access services – a publicly available electronic communications service that provides access to the internet, and thereby connectivity to virtually all end-points of the internet, irrespective of the network technology and terminal equipment used.

The regulation would establish the EU Centre to create and maintain databases of indicators of online CSAM. This database would be used by information society services in order to comply with the regulation. The EU Centre would also act as a liaison to Europol, by first filtering any reports of CSAM that are unfounded – “Where it is immediately evident, without any substantive legal or factual analysis, that the reported activities do not constitute online child sexual abuse” – and then forwarding the others to Europol for further investigation and analysis.

Fundamental rights

A major concern about this regulation is that the content filtering of private messages would impinge on users’ rights to privacy and freedom of expression. The regulation proposes scanning not merely the metadata of messages, but the content of every message for offending material. “The European Court of Justice has made it clear, time and time again, that a mass surveillance of private communications is unlawful and incompatible with fundamental rights,” says Felix Reda, an expert in copyright and freedom of communication for Gesellschaft für Freiheitsrechte.

These concerns are acknowledged in the proposed regulation, which states: “The measures contained in the proposal affect, in the first place, the exercise of the fundamental rights of the users of the services at issue. Those rights include, in particular, the fundamental rights to respect for privacy (including confidentiality of communications, as part of the broader right to respect for private and family life), to protection of personal data and to freedom of expression and information.”

However, the proposed regulation also considers that none of these rights should be absolute. It states: “In all actions relating to children, whether taken by public authorities or private institutions, the child’s best interests must be a primary consideration.”

There is also the issue of potential erroneous removal of material that is mistakenly judged to be child sexual abuse material, which could have a significant impact on a user’s fundamental rights to freedom of expression and access to information.

Enacting the regulation

Article 10 (1) of the proposed regulation states: “Providers of hosting services and providers of interpersonal communication services that have received a detection order shall execute it by installing and operating technologies to detect the dissemination of known or new child sexual abuse material or the solicitation of children, as applicable.”

However, unlike previous regulations, the necessary technical measures for establishing how online platforms can meet the requirements are not outlined in the proposed regulation. Instead, it gives platforms and providers flexibility in how they implement these measures, so the regulatory obligations can be embedded effectively within each service.

“You notice in the introduction that it doesn’t necessarily well define what a provider is and it doesn’t necessarily define how well one has to scan things,” says Jon Geater, CTO of RKVST.

According to Article 10 (3), once a detection order has been issued, the content filters will be expected to meet these criteria:

  • Effective in detecting the dissemination of known or new CSAM or the solicitation of children.
  • Not able to extract any information other than what is necessary for the purposes of detection.
  • In accordance with the state of the art in the industry and the least intrusive in terms of the impact on users’ rights to private and family life.
  • Sufficiently reliable, such that they minimise false positives.

But in order to detect CSAM or solicitation of children, content scanning of every communication would be required. The current proposal does not define what is considered to be a “sufficiently reliable” benchmark for minimal false positives. “It’s not feasible for us or anybody else to be 100% effective, and it’s probably not very sensible for everybody to try their own attempt at doing it,” says Geater.
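The scale of the reliability problem can be illustrated with some simple, purely hypothetical arithmetic: even a filter with a very low false-positive rate produces a large absolute number of wrongly flagged messages when it is applied to every communication. The figures below are assumptions chosen only for illustration, since the proposal itself sets no numeric benchmark.

```python
# Back-of-the-envelope illustration of the base-rate problem.
# Both figures are assumptions, not numbers from the proposal.
messages_per_day = 10_000_000_000   # assumed daily messages across EU services
false_positive_rate = 0.001         # assumed 99.9% specificity

false_flags_per_day = messages_per_day * false_positive_rate
print(f"Wrongly flagged messages per day: {false_flags_per_day:,.0f}")
# With these assumptions, roughly 10 million innocent messages a day
# would still be flagged for review or reporting.
```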

To help businesses meet these new regulatory obligations, the EU Centre will offer detection technologies free of charge. These will be intended for the sole purpose of executing the detection orders. This is explained in Article 50 (1), which states: “The EU Centre shall make available technologies that providers of hosting services and providers of interpersonal communications services may acquire, install and operate, free of charge, where relevant subject to reasonable licensing conditions, to execute detection orders in accordance with Article 10(1).”

Should a provider or platform choose to develop its own detection systems, Article 10 (2) states: “The provider shall not be required to use any specific technology, including those made available by the EU Centre, as long as the requirements set out in this Article are met.”

Although these detection technologies will be freely offered, the regulation nonetheless places huge demands on social media providers and communication platforms. Providers will be required to ensure human oversight, through analysing anonymised representative data samples. “We view this as a very specialist area, so we have a third-party supplier who provides scanning tools,” says Geater.
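What that human oversight might look like in practice is not prescribed. One plausible approach, sketched below with entirely hypothetical field names, is to draw a random, representative sample of flagged detections and strip direct user identifiers before handing the items to human reviewers.

```python
import random

def build_review_sample(flagged_items: list[dict], sample_size: int,
                        seed: int = 0) -> list[dict]:
    """Draw a random sample of flagged detections for human review.

    Assumes each item is a dict with hypothetical keys 'user_id',
    'content_hash' and 'detector_score'; the identifier is dropped so
    reviewers only see anonymised data.
    """
    rng = random.Random(seed)
    sample = rng.sample(flagged_items, min(sample_size, len(flagged_items)))
    return [
        {"content_hash": item["content_hash"],
         "detector_score": item["detector_score"]}
        for item in sample
    ]
```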

According to Article 24 (1), any technology company that falls within the definition of “relevant information society services” and operates within the EU will require a legal representative in one of the EU’s member states. At the very least, this could be a team of solicitors acting as the point of contact.

Any platform or service provider that fails to comply with this regulation will face penalties of up to 6% of its annual income or global turnover. Supplying incorrect, incomplete or misleading information, as well as failing to revise said information, will result in penalties of up to 1% of annual income or global turnover. Any periodic penalty payments could be up to 5% of average daily global turnover.

Concerns remain

One aspect that is particularly concerning is that there are no exemptions for different types of communication. Legal, financial and medical information that is shared online within the EU will be subject to scanning, which could lead to confidentiality and security issues.

In October 2021, a report into CSS by a group of experts, including Ross Anderson, professor at the University of Cambridge, was published on the open-access website arXiv. The report concluded: “It is unclear whether CSS systems can be deployed in a secure manner such that invasions of privacy can be considered proportional. More importantly, it is unlikely that any technical measure can resolve this dilemma while also working at scale.”

Ultimately, the regulation will place significant demands on social media platforms and internet-based communication services. It will especially impact smaller companies that do not have the necessary resources or expertise to accommodate these new regulatory requirements.

Although service providers and platforms could choose not to operate within EU countries, thereby avoiding these requirements, this approach is likely to be self-defeating because it would drastically limit their user base. It would also raise ethical questions if a company were seen to be avoiding the issue of CSAM being distributed on its platform. It is also likely that similar legislation could be put in place elsewhere, especially in countries wishing to harmonise their legislation with the EU’s.

It would therefore be prudent for businesses to prepare for the expected obligations and put the appropriate policies and resources in place, enabling them to adapt swiftly to the new regulatory environment and manage the financial impact.
