
ICO code sets out digital privacy standards for children

The Information Commissioner’s Office has published its Age Appropriate Design Code, a set of 15 standards that online platforms must meet to protect the privacy of younger users

Digital services will be forced to strengthen their privacy settings for users under the age of 18 under the Information Commissioner’s Office’s (ICO) Age Appropriate Design Code, which sets out new standards expected of social media platforms, online education, gaming and streaming services, makers of connected toys, smartphone app developers and others.

Described as a set of flexible standards that neither ban nor prescribe, the code is intended to ensure such services provide children with a built-in, enhanced level of data protection, with privacy settings set to high by default and “nudge” techniques that encourage children to weaken those settings strongly discouraged.

It also asks digital services to turn off location services by default, to minimise the collection and sharing of personal data, and to end profiling that can allow children to be served targeted advertising content.

“Personal data often drives the content that our children are exposed to – what they like, what they search for, when they log on and off, and even how they are feeling,” said the information commissioner, Elizabeth Denham.

“In an age when children learn how to use an iPad before they ride a bike, it is right that organisations designing and developing online services do so with the best interests of children in mind. Children’s privacy must not be traded in the chase for profit.”


All told, the code lays down 15 standards that digital services should meet, with the starting point that the best interests of children should be a primary consideration when designing and developing any online service. It also includes practical guidance on data protection safeguards to help ensure online services are age appropriate.

Developed alongside campaigners, trade bodies and industry and sector representatives, the code is rooted in the United Nations Convention on the Rights of the Child, and is the first of its kind to be introduced anywhere in the world, although it reflects a similar direction of travel in Europe and the US.

The standards are rooted in the General Data Protection Regulation (GDPR) and the code is introduced under the Data Protection Act 2018. The code itself was submitted to the government in November and will now move through a statutory process before being laid before Parliament. Under the UK’s continuing obligations as a European Union (EU) member, the UK must also notify the European Commission and observe a resultant three-month standstill period.

Subject to a 12-month window to give organisations time to update their practices and services, the code is expected to come into full effect towards the end of 2021. The ICO said it would be engaging with digital organisations to help them through the process.

“One in five internet users in the UK is a child, but they are using an internet that was not designed for them,” added Denham. “There are laws to protect children in the real world – film ratings, car seats, age restrictions on drinking and smoking. We need our laws to protect children in the digital world too.”


Tom Chivers, a digital privacy advocate at ProPrivacy, said the code was a long-overdue step in the right direction for digital privacy. “Government regulation is essential when it comes to the laws that bind the digital world. All too often, ‘big tech’ have proven that they are either unwilling or incapable of self-regulating. When it comes to the safety of children, we can’t afford to not take a firm stance on their digital rights,” he said.

“Ms Denham raises an excellent point when she says children in today’s society learn how to use an iPad before they ride a bike. To borrow from her example, ‘training wheels’ play a big part in how we learn to ride a bike, and I think this too applies to the internet.

“While these codes are a step in the right direction, these ‘training wheels’ should come in the form of education. We must teach children why it’s important to protect their data – much like we teach them to look both ways when crossing a road – as telling them something is dangerous is one thing, but explaining why and how to navigate this risk is much more effective,” said Chivers.
