The Online Safety Bill (OSB) “is not clear and robust enough” to deal with various types of harmful but legal content, a parliamentary committee report has claimed.
Published on 24 January 2022, the Digital, Culture, Media and Sport (DCMS) select committee report urged the government to proactively address types of content that are technically legal, but that are specifically designed to evade or subvert content moderation efforts.
As it currently stands, the draft OSB would impose a statutory “duty of care” on technology companies that host user-generated content or allow people to communicate, meaning they would be legally obliged to proactively identify, remove and limit the spread of both illegal and legal but harmful content, such as child sexual abuse, terrorism and suicide material.
Failure to do so could result in the online harms regulator – confirmed in December 2020 to be Ofcom – imposing fines of up to 10% of their turnover.
“Though the Bill takes several welcome steps to address harms to children, such as duties to tackle child sexual exploitation and abuse (CSEA), we are concerned about the prevalence of content and activity that are seemingly not captured by the proposed regime,” it said.
“One example of such activity is breadcrumbing, where perpetrators deliberately stay below the thresholds for criminal activity and for content removal by a service provider.
“We recommend that the government respond to our concerns about the risk of content and activity that falls below the threshold of outright criminal activity but nonetheless forms part of the sequence for online CSEA. One starting point should be to reframe the definition of illegal content to explicitly add the need to consider context as a factor, and explicitly include definitions of activity like breadcrumbing, on the face of the Bill.”
While the report itself contains very little information on how to detect or stop breadcrumbing, it recommended that the government work with “charities, campaign organisations and children’s advocacy groups to identify, define and address” such content.
According to a recent report by the National Society for the Prevention of Cruelty to Children (NSPCC), “breadcrumbs” are designed to facilitate access to child sexual abuse material, and allow abusers to identify one another and form networks. “In many cases, abusers will upload or seek to access material containing large numbers of images taken in the run-up to or following sexual abuse, effectively forming part of a sequence that culminates with images or videos that meet the criminal threshold,” it said.
“In some cases, these are deliberately used by abusers because they anticipate such images won’t be proactively removed by the host site.”
The DCMS committee report further added that other types of violence against women and girls – such as tech-enabled “nudifying” and deep-fake pornography – should also be brought into the OSB’s scope of harm. The MPs said this should be done either through primary legislation or by including them as types of harmful content in the OSB’s duty of care.
The committee also recommended that the definition of harmful content should be expanded to explicitly include any content that “undermines, or risks undermining, the rights or reputation of others, national security, public order, and public health or morals.”
It added that Ofcom should also have the power to conduct audits of tech companies’ systems, which would include the compulsory disclosure of datasets and performance metrics (including how these are used to measure success).
This is in line with previous comments from former information commissioner Elizabeth Denham, who told MPs and peers in September 2021 that she would like to see Ofcom’s information-gathering powers “bolstered by [compulsory] audit powers” so it can properly “look under the bonnet” of tech firms’ algorithms.
In December 2021, the joint parliamentary committee for the draft OSB – which was set up to scrutinise the legislation and propose improvements – published its own 193-page report following a five-month inquiry, in which it recommended that Parliament should provide more clarity around the overarching duty of care through specific duties that explicitly set out what service providers should be doing to prevent harm online.
“We recommend the Bill is restructured,” it said. “It should set out its core objectives clearly at the beginning. This will ensure clarity to users and regulators about what the Bill is trying to achieve and inform the detailed duties set out later in the legislation.
“Everything flows from these: the requirement for Ofcom to meet those objectives, its power to produce mandatory codes of practice and minimum quality standards for risk assessments in order to do so, and the requirements on service providers to address and mitigate reasonably foreseeable risks, follow those codes of practice and meet those minimum standards.”
On the basis that major platform providers do not have adequate governance structures in place – as evidenced by the testimony of Facebook’s global head of safety, Antigone Davis, and others in November 2021 – the report also recommended extending criminal liability in the draft bill, which currently limits Ofcom to only taking action against senior managers on a failure to supply information.
“We recommend that a senior manager at board level or reporting to the board should be designated the ‘safety controller’ and made liable for a new offence: the failure to comply with their obligations as regulated service providers when there is clear evidence of repeated and systemic failings that result in a significant risk of serious harm to users,” it said.
“We believe that this would be a proportionate last resort for the regulator – like any offence, it should only be initiated and provable at the end of an exhaustive legal process.”
While the DCMS committee report does recommend that service providers should be required to designate dedicated compliance officers, it only makes passing reference to concerns around criminal liability for senior managers, simply stating that greater clarity is needed from the government around the OSB’s enforcement powers.
In response to the DCMS committee report, a DCMS spokesperson said the department did not agree with the criticisms put forward by the MPs: “The Bill has been recognised as setting a global gold standard for internet safety,” they said. “It has strict measures including a duty of care to stamp out child sexual abuse, grooming and illegal and harmful content.
“There are also stringent rules to make sure tech firms and Ofcom protect people’s free speech and privacy, so content is not taken down without good reason. The Bill will make the UK the safest place to go online while protecting freedom of speech.”