Department of Health pushes online regulation agenda
Tech giants commit to initiative to remove harmful content from their websites but MPs call for further changes in algorithms
The Department of Health and Social Care (DHSC) is advancing its agenda within the government’s wider ambition to hold technology companies accountable for the impact of their online services.
Tech giants have agreed to contribute to a fund aimed at tackling harmful online content following a summit on social media in health on Monday 29 April.
The second such summit this year, on how social media firms can act to protect vulnerable people online, produced a commitment from companies including Facebook, Instagram, Twitter, Pinterest and Google to help fund a new initiative to tackle self-harm content on their sites.
Health secretary Matt Hancock said the companies will work with the Samaritans charity to speed up identification and removal of content related to suicide and self-harm and create greater protections online.
“This partnership marks, for the first time globally, a collective commitment to act, to build knowledge through research and insights, and to implement real changes that, ultimately, will save lives,” said Hancock in a statement in the Commons.
The group will report on progress in two months’ time, with companies expected to take further action and the Samaritans being able to identify harmful content more clearly, Hancock added.
Wikipedia failed to attend either meeting and was accused by Hancock of “shirking their social responsibilities”; he added that the organisation would be written to in “robust terms”.
Amazon was also cited during the session at the Commons, with Tory MP Rachel Maclean challenging Hancock over how the tech giant is selling books that “normalise” anorexia as a healthy lifestyle.
Hancock said he would investigate the matter and that Amazon “has a duty of care” to consumers who buy its physical goods “in the same way that a shop has a responsibility for what it sells”.
Tackling the spread of anti-vaccination information on social media platforms was another topic discussed in the summit, said Hancock. “The rise of social media now makes it easier to spread lies about vaccination, so there is a special responsibility on social media companies to act,” he said.
Since the first summit with social media companies in February, the health secretary noted that some advances have been made towards improving online safety, such as Instagram’s global policy of removing all graphic self-harm imagery.
“I am glad that some of the global algorithms and global terms and conditions have been changed as a result of action taken by the UK government,” said Hancock.
Read more about online regulation
- UK introduces world’s first online safety regulations.
- Internet Watch Foundation raises stakes in war against online child porn.
- UK government to introduce online porn age checks.
But shadow health secretary Jonathan Ashworth observed that his own Instagram searches had found hundreds of thousands of images and videos of graphic self-harm in addition to pro-anorexia posts and the normalisation of eating disorders.
Ashworth argued that “dangerous content should be blocked and taken down” and pressed Hancock to know when the proposed legislation on online harms will come before the Commons, as well as when the new regulator and duty of care will be enforced.
He also argued that Hancock should be instructing Public Health England to launch an online social media campaign to tackle anti-vaccination propaganda.
Hancock did not respond to Ashworth’s observations about the amount of harmful content still available online, but was further pressed on the matter by Yvette Cooper, Labour chair of the Home Affairs Committee, who said it was time to do much more about algorithms that “push people into more extreme behaviour”.
Hancock said: “Graphic self-harm imagery will be taken down from Instagram. Our challenge is to make sure that this is done properly, because ultimately, only if social media companies change their algorithms can we make this happen. That is why the new regulator is so important.”
In response to Ashworth’s demand for timescales, Hancock said the government is in the middle of a 12-week consultation on the online harms white paper and wants to consult widely before deciding how to regulate the internet.
“I have to say that the tone from the social media companies has changed in recent months and years, but they still need to do an awful lot,” said Hancock.