UK children are unwittingly giving away rights to their private data and putting their online security at risk, according to children’s commissioner for England, Anne Longfield.
A year-long study has revealed that youngsters are being “left to fend for themselves,” she said, with nearly 50% of eight to 11-year-olds agreeing to social media firms’ vague terms and conditions.
As a result, children are giving social media firms like Facebook control over their data without any accountability, according to the commissioner’s Growing Up Digital taskforce.
The taskforce found that none of the children in the focus group could fully understand the terms and conditions of photo-sharing service Instagram, which is used by more than half of 12 to 15-year-olds, and 48% of eight to 11-year-olds.
Only half of the eight to 11-year-olds could even read the terms, which ran to more than 5,000 words on 17 pages of text, the study found.
Longfield warned that “incomprehensible” terms and conditions of social networks mean children have little idea what they are signing up to, reports Sky News.
She said the internet was not developed with children in mind, but children are among its biggest users.
According to telecoms regulator Ofcom, three and four-year-olds spend eight and a quarter hours a week online, 12 to 15-year-olds spend more than 20 hours and 70% of them have a social media profile.
Read more about the GDPR
- With less than two years before the EU data protection rules come into force, there are 10 key areas businesses need to focus on to ensure they will be compliant.
- The European Parliament’s official publication of the General Data Protection Regulation means it will become enforceable on 25 May 2018.
- Companies that fail to start planning to deal with the EU’s data protection requirements are in for a real shock, warns the International Association of Information Technology Asset Managers.
- The GDPR is about enabling organisations to realise the benefits of the digital era, but it is serious about enforcement for those that do not play by the rules, says the UK information commissioner.
Longfield has called on the government to add “digital citizenship” to the school curriculum for children from the age of four, and to provide a specialist ombudsman to represent the rights of children to social media companies and act as a mediator to help remove sensitive content.
“It is vital that children understand what they agree to when joining social media platforms, that their privacy is better protected, and they can have content posted about them removed quickly should they wish to,” she said.
“The internet has given children and young people fantastic opportunities, but protecting them from risks they might face online or on their phones is vital,” said a government spokesperson.
“The UK is a world leader in internet safety, but there is more to do, and we will carefully consider this report as part of our ongoing work to make the internet a safer place for children.”
The new European Union General Data Protection Regulation (GDPR) that becomes enforceable from 25 May 2018 requires technology companies to spell out how they use personal data.
The UK government is expected to implement the regulation, despite the UK’s planned departure from the EU.
Pam Cowburn, Open Rights Group communications director, said that social media firms generally do not make users aware of how information can be shared and sold on to advertisers.
Young people giving away personal information
“The situation is serious, as young people are unwittingly giving away personal information with no real understanding of who is holding that information, where they are holding it and what they are going to do with it,” said Jenny Afia, partner and privacy lawyer at Schillings.
She believes that the terms and conditions of social media sites should be rewritten in language that children can understand, so that they can make more informed decisions.
Afia said the simplified terms would have the same, if not stronger, legal standing than the current phrasing of the terms. “At the moment, there is a very good case to say a child is not giving informed consent,” she told the Guardian. Informed consent is one of the key requirements of the GDPR.
Nicola Fulford, head of data protection and privacy at Kemp Little, said that in a digital era, banning children from using the internet would be a draconian and ineffective strategy.
“Educating both parents and children on the risks and what steps can be taken to avoid or minimise them is key to keeping children safe online. This may include having conversations about what the internet is and the risks and dangers associated with being online, using the internet together and agreeing certain rules and limits for use – particularly for younger children.
“Some devices have ‘parental control’ functionality, which can be used to keep children safe, for example, by filtering, monitoring and reporting content. The EU’s GDPR will impose robust new obligations on device manufacturers and service providers, in addition to stronger penalties for non-compliance, which should offer comfort to many parents,” she told Computer Weekly.
Sarah Champion, Labour shadow cabinet minister for women and equalities, said the report is yet more evidence that the government is not doing enough to equip children to stay safe online.
“We have to recognise children are growing up immersed in a digital world. We owe it to them to do all we can to educate them about, and support them against, the risks they face in the virtual world, just as we do in the real world.”
Responding to the report, Instagram’s head of policy for Europe, Michelle Napchan, said that because Instagram began as a mobile app, it has always prioritised giving people clear, easy-to-understand information about its safety and privacy policies, which can be accessed from mobile devices.
“We provide multiple ways for our community to find the information and resources they need. We recognise that in many cases, when people need help, they want it when they’re using the app.
“That is why we go beyond our terms and guidelines to offer in-app safety and privacy help – from reporting to industry-leading comment tools and self-help resources.
“We have also produced a guide for parents to help them talk to their teenagers about internet safety,” she said in a statement.
Instagram is aimed at users who are at least 13 years old and has provided an online form to enable people to report accounts that are being used by anyone younger than that.
The photo and video sharing service also provides in-app reporting facilities for anyone who sees content that makes them feel uncomfortable.