UK safety tech sector sees strong revenue and employment growth
Safety tech is now one of the fastest-growing sectors in the UK tech industry, with jumps in revenue, investment and employment
UK “safety tech” companies saw a revenue increase of 21% over the past year to reach £381m, making it one of the fastest-growing sectors of the UK tech industry, according to a government-backed sectoral analysis.
Conducted by Perspective Economics on behalf of the Department for Digital, Culture, Media and Sport (DCMS), the UK safety tech sector: 2022 analysis found that the number of companies delivering safety tech products and services grew by 17% to 117, with jobs in the sector increasing by 30% to total 2,850.
Of the 117 safety tech companies identified by the analysis, 57% are based outside London and the South East, with hotspots in Leeds and Edinburgh, and emerging centres of activity in Greater Manchester, Oxford, Bristol and Belfast.
“There are encouraging signs of newly registered startups, particularly in the domains of tackling disinformation and developing new approaches to content moderation and video and image analysis at scale,” said the analysis. “Existing firms specialising in communication and semantics are also moving into the safety tech domain.
“As a result of considerable growth within UK safety tech, there has been sustained interest from the investment community. It remains likely that the UK could see its first safety tech unicorn [a company worth over $1bn] emerge in the coming years.”
The report said total external investment into UK safety tech reached £63m across 16 deals in 2021, although about half of this (£30m) went to Faculty, an artificial intelligence (AI) company that has previously worked on the controversial NHS Covid-19 datastore. It is run by Marc Warner, who also advised on the data operation for the Dominic Cummings-led Vote Leave EU referendum campaign.
Other investments include £15.2m raised by SafeToNet, which uses AI and analytics to help safeguard children by detecting threats; £5.8m raised by Vault Platform, a workplace misconduct reporting app; and £5m raised by Cyacomb, which is working with law enforcement agencies across the world to detect child sexual abuse material (CSAM).
“Making the online world safer is not only the right thing to do, it’s good for business,” said digital minister Damian Collins. “UK tech firms are at the cutting edge developing practical solutions to the risks posed by the internet so that it continues to be a benefit, not a detriment, to people’s lives.
“They have blazed a trail of growth, innovation and job creation to become world leaders in their field and we are committed to maintaining their upward trajectory.”
According to Trust, safety and the digital economy, a separate Ipsos report released on 27 July, companies that adopt online safety technology also experience greater brand trust, higher user engagement and better staff and customer retention.
“I am delighted to see that the Safety tech sector analysis confirms there has been strong growth in sales, employment and investment in the sector and it remains one of the fastest growing parts of the economy,” said Ian Stevenson, chair of the Online Safety Tech Industry Association (OSTIA).
“The new Trust, safety and the digital economy report highlights how platforms benefit from safety tech by creating healthier and more resilient online environments, which have commercial value as well as benefiting their users. These new analyses will help OSTIA members and others in the sector continue these positive trends.”
The sectoral analysis and Ipsos report come in the wake of the UK government pausing the passage of the Online Safety Bill after legislative timetabling issues meant Parliament was unable to push the bill through before the summer recess.
In his foreword to the DCMS report, Collins said the government would continue to fund the safety tech sector, and noted: “The Safety Tech Challenge Fund has already supported the development of innovative technologies to help keep children safe in end-to-end encrypted [E2EE] environments, while upholding user privacy.”
Three companies working on a Safety Tech Challenge Fund project to detect CSAM before it reaches encrypted environments told Computer Weekly in January 2022 that pre-encryption scans for such content – also known as client-side scanning – can be carried out without compromising privacy.
Although Apple attempted to introduce client-side scanning technology – known as NeuralHash – to detect known child sexual abuse images on iPhones last year, the plans were put on indefinite hold after an outcry by tech experts.
A report by 15 leading computer scientists, Bugs in our pockets: the risks of client-side scanning, published by Columbia University, identified multiple ways that states, malicious actors and abusers could turn the technology around to cause harm to others or society.
“Client-side scanning, by its nature, creates serious security and privacy risks for all society, while the assistance it can provide for law enforcement is at best problematic,” the scientists said. “There are multiple ways in which client-side scanning can fail, can be evaded and can be abused.”
Speaking with Computer Weekly in November 2021 after the announcement of the Challenge Fund winners, then-digital minister Chris Philp said the government would not mandate any scanning that goes beyond the scope of uncovering child abuse material.
“These technologies are CSAM-specific,” he said. “I met with the companies two days ago and with all of these technologies, it’s about scanning images and identifying them as either being previously identified CSAM images or first-generation created new ones – that is the only capability inherent in these technologies.”
Asked whether there was any capability to scan for any other types of image or content in messages, Philp added: “They’re not designed to do that. They’d need to be repurposed for that, as that’s not how they’ve been designed or set up. They’re specific CSAM scanning technologies.”
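The matching step Philp describes – checking whether an image corresponds to previously identified CSAM – can be illustrated with a minimal sketch. Real systems such as PhotoDNA or Apple's NeuralHash use perceptual hashes that tolerate resizing and re-encoding; the cryptographic hash and the known-hash set below are purely hypothetical stand-ins to show the lookup logic, not any vendor's implementation.

```python
import hashlib

# Hypothetical database of hashes of previously identified images.
# (This entry is the SHA-256 of the bytes b"test", used only for
# demonstration; deployed systems hold perceptual hashes supplied
# by child-protection organisations.)
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def scan_image(image_bytes: bytes) -> bool:
    """Return True if the image's hash matches a known-image entry."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_HASHES

print(scan_image(b"test"))         # matches the known-hash set
print(scan_image(b"other image"))  # no match
```

Because the check is a set lookup against a fixed database, it can only flag content that is already catalogued (or, in perceptual-hash systems, near-duplicates of it) – which is the narrow, CSAM-specific capability the minister describes, and also why critics worry about what happens if the database itself is repurposed.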