Governments urged to step up enforcement of big tech amid rush to ban social media for under-16s
The Council of Europe’s Commissioner for Human Rights says that European governments should consider better enforcement against big tech companies before banning children from social media
European states should consider enforcing regulations against big tech companies before they consider banning children and teenagers from social media, according to the Council of Europe’s Commissioner for Human Rights.
The Commissioner for Human Rights, Michael O’Flaherty, said there is a danger that imposing minimum ages for social media could distract from the need to hold technology companies accountable for their behaviour.
His comments come as a growing number of countries, including the UK, Germany, Spain and Greece, are looking at age restrictions for social media platforms such as TikTok, Snapchat and Instagram.
O’Flaherty, an Irish human rights lawyer, said governments need to have a more nuanced discussion, including considering the human rights of children to freedom of expression, and stronger enforcement against tech companies, before banning young people from social media.
“It is clear that children need to be protected. There is no issue there,” he told Computer Weekly. But he said, rather than default to social media bans, countries need to take a more considered approach, including doing a “better job of policing the platforms and making them responsible and liable for their own practices”.
Although the European Union (EU) has introduced laws such as the Digital Services Act, which allow countries to bring enforcement action against technology companies that fail to protect children, he argued that enforcement by member states has been inadequate.
“This is a good piece of legislation, but we need to see it vigorously enforced across each country where it applies. It’s not that we lack the law – we need to see the concerted, sustained delivery of the law that would make a difference,” he said. “We need to get all the member states up to the same level of vigorous enforcement.”
Algorithmic transparency
O’Flaherty said it was vital that big tech companies were transparent about how their algorithms work. Claims by tech companies that their algorithms are too complicated to understand are “unacceptable”, he said.
“We need algorithmic transparency so that what is dangerous can be detected. We need ongoing human rights compliance testing. We basically need an application of the tools that have been woven into the law,” he added.
Obligations on tech companies must be enforceable, subject to independent oversight, and supported by sanctions and liabilities that are effective deterrents, he added. “The source of harm is rooted in the design and incentives of the platforms.”
Germany became the latest country to back a social media ban for children on Saturday, when the country’s ruling party passed a motion to ban social media use by children under 14.
Spain announced plans to ban under-16s from social media in February, along with measures that will make company executives responsible for illegal or harmful content on their platforms. France backed similar measures in January. Denmark, Poland and Austria have also discussed social media bans for young people.
They follow Australia, which became the first country to introduce a social media ban for young people through its Online Safety Amendment Act, which came into force in December 2025. The act required platforms including TikTok, YouTube, X and Reddit to implement age-verification measures or face fines of millions of dollars.
Legal challenge to Australia
Reddit is challenging the ban in the Australian High Court, on the grounds that the law would damage people’s online privacy by forcing adults and minors to use intrusive and potentially insecure age verification services.
The UK’s prime minister, Keir Starmer, announced plans on 15 February to implement a minimum age for social media in a matter of months.
The proposals will restrict addictive features, such as endless scrolling or autoplay, for children on social media apps, and will limit children’s access to virtual private networks, which can be used to bypass age restrictions.
The European Commission has issued sanctions against large online platforms for breaches of the Digital Services Act.
In February, a preliminary ruling found that TikTok’s addictive design features, including infinite scroll and autoplay, could lead to “compulsive use” among its users, causing them to behave on autopilot.
If TikTok refuses to make the changes to its algorithms demanded by the European Commission, it could face a fine of up to 6% of its annual revenue, potentially amounting to more than £10bn.
O’Flaherty argued that, contrary to the claims of big tech companies, good regulation does not stifle innovation.
“Look at countries like Singapore, which are trying to put good regulation in place – they are among the most innovative places in the world when it comes to technology,” he added.
O’Flaherty said it was “preposterous” that social media companies have been allowed to self-regulate, unlike other areas of the economy, such as car safety, where state regulation is seen as essential.
Read more about online safety
The UK’s Online Safety Act explained: In this essential guide, Computer Weekly looks at the UK’s implementation of the Online Safety Act, including controversies around age verification measures and the threat it poses to end-to-end encryption.
UK online safety regime ineffective on misinformation, MPs say: A report from the Commons Science, Innovation and Technology Committee outlines how the Online Safety Act fails to deal with the algorithmic amplification of ‘legal but harmful’ misinformation.