The UK’s tech industry is booming. It is three million people strong and contributes a staggering £149bn to our economy every year. Last year it hit new heights, attracting £10.1bn of inward investment - more than any other country in Europe.
Our tech companies have incredible access to finance and benefit hugely from our strong regulatory regimes for science, R&D and intellectual property. The OECD ranks Britain as one of the best places to start a tech business in the world and has given us the highest overall score for the quality of our regulatory practices.
Internet-based technologies, with the freedom, openness and convenience they bring, are a tool to be harnessed, and there is no better place to make them than here in the UK. It is difficult to imagine how much harder responding to the pandemic would have been without them. But like any tool, they can help or they can harm.
Tech platforms have been utilised in some of the most despicable events in recent history - from the live streaming of the atrocious Christchurch terrorist attack to the rise of online grooming. Over one month during lockdown, at least 8.8 million attempts by UK internet users to access child abuse material were blocked.
Our tech firms have the government’s full commitment to helping them thrive. But for this to happen, the people who use them need to trust that they’re safe.
The government has set out new rules of the road for internet companies to address some of the unintended consequences of the digital revolution. Our response to the Online Harms White Paper outlines our final decisions on new regulations for internet services that are accessible in the UK and which enable people to interact with each other or share their own content such as comments, images and videos.
We have taken the time to get this regulatory regime right - it is pro-competition, pro-innovation and proportionate. Fewer than 3% of UK businesses will be in scope and it is focused on the biggest platforms where the risk of harm being done is greatest.
All companies will be legally required to take steps to stop the spread of what is already criminal - the vile content showing child sexual abuse or inciting others to terrorism. And they will need to protect children using their services from being exposed to harmful or inappropriate content or contact.
The most popular social media sites will have extra responsibilities to tackle legal but nevertheless harmful activity - such as dangerous disinformation and bullying. We will help these companies regain the trust of the public by holding them to the promises they make in their terms and conditions.
Avoiding unnecessary burdens
We have taken great care to avoid creating unnecessary burdens and have built exemptions in for low-risk firms. For example, online retailers that only allow users to leave reviews will be exempt, as will services used by organisations for internal purposes like team collaboration.
Ofcom, as the independent regulator of this new system, will have a legal duty not to unfairly burden small and medium-sized enterprises. Its approach to individual companies will reflect their size, their capacity and the severity of the harm in question. It will help all companies understand and fulfil their new responsibilities by issuing codes of practice for them to follow.
Next year we will bring forward legislation to implement this carefully designed regulation - good news for UK internet users, and good for our businesses too.
We all have a shared responsibility for making online spaces safer, so we are giving tech firms much-needed clarity on how to do this. It will give confidence to innovators and investors, build trust among consumers, and cement the UK’s reputation as a global tech powerhouse.
Read more about online safety
- The UK government should introduce a compulsory “news bargaining code” to force digital platforms to pay news publishers for the right to use their content, says the House of Lords Communications and Digital Committee.
- The government has told social media companies that they need to go “further and faster to address disinformation” about the Covid-19 coronavirus pandemic.
- Many of the regulatory bodies overseeing algorithmic systems and the use of data in the UK economy will need to build up their digital skills, capacity and expertise as the influence of artificial intelligence and data increases, MPs have been told.