
TikTok fined €345m under GDPR for failing to protect children’s privacy

Data protection regulators warn social media companies to take all necessary measures to protect children’s privacy

TikTok, the video-sharing social media platform, is to be fined €345m for failing to protect children’s privacy.

Ireland’s Data Protection Commissioner (DPC) announced the fine against the video-sharing app for breaches of the GDPR data protection law.

TikTok exposed children aged 13 to 17 to risks by making their accounts publicly available by default and allowing them to pair their accounts with adults who may not be family members, the regulator found.

The Chinese-owned company has been given three months to rectify the breaches following a reprimand from the regulator. The fine follows an investigation initiated by the Irish data protection regulator into TikTok Technology Limited’s handling of children’s data and age-verification procedures in September 2021.

The investigation assessed TikTok’s compliance with its obligations under GDPR for processing personal data relating to children between July 2020 and December 2020.

The DPC found in a decision published today (15 September 2023) that TikTok exposed children to risks by setting their accounts to public by default, which meant that anyone could view videos posted by children. The regulator also found that user comments and the platform’s ‘duet’ and ‘stitch’ features were publicly available by default, creating further risks.

TikTok, which is owned by Beijing-based ByteDance, also made it possible for children’s accounts to be paired with adult TikTok users without checking that the adult in question was the child’s parent or guardian. This meant that adults could send direct messages to children over the age of 16, which posed potentially serious risks to children using the platform.

TikTok, which opened an office in Dublin in 2020, imposes a minimum age limit for children of 13 years old, which is enforced by requiring users to enter their date of birth when registering. However, the DPC found that children under the age of 13 who managed to gain access to the platform were also exposed to potential risks because their accounts were made public by default.

The DPC did not find that TikTok’s age-verification process was in breach of GDPR.

TikTok disputes fine

TikTok said it disagreed with the fine and that it had fixed many of the issues in the complaint before the Irish DPC began its investigation.

“We respectfully disagree with the decision, particularly the level of the fine imposed,” a TikTok spokesperson said.

“The DPC’s criticisms are focused on features and settings that were in place three years ago, and that we made changes to well before the investigation even began, such as setting all under 16 accounts to private by default,” the spokesperson added.

Elaine Fox, head of privacy for Europe at TikTok, said in a statement that most of the DPC’s criticisms were no longer relevant, following measures introduced by the platform in early 2021.

These included making all accounts for users aged between 13 and 15 private by default, removing the option to allow anyone to make comments on videos posted by children, and making it easier for children to understand privacy settings.

The DPC filed a draft decision against TikTok with other data protection supervisors in Europe in September 2022. The case went to the European Data Protection Board (EDPB), an umbrella group for regulators, for adjudication, after data protection regulators in Italy and Berlin raised objections.

The Berlin regulator pushed for an additional infringement under the GDPR principle of fairness over TikTok’s use of “dark patterns” to nudge users to choose more privacy-intrusive options during the registration process and when posting videos.

Italy unsuccessfully called for the DPC to reverse its finding that TikTok’s age-verification process was compliant with GDPR, but the EDPB accepted the Berlin regulator’s complaint that TikTok had nudged users to select options with less privacy protection.

Anu Talus, EDPB chair, said that today’s decision made it clear that digital companies had to take all necessary measures to protect children’s data protection rights.

“Social media companies have a responsibility to avoid presenting choices to users, especially children, in an unfair manner – particularly if that presentation can nudge people into making decisions that violate their privacy interests,” she added.
