TikTok Receives Significant GDPR Fine for Mishandling Children's Data

Introduction

In a landmark decision, the Irish Data Protection Commission (DPC) issued a €345 million fine to TikTok Technology Limited (‘TikTok’) in September 2023. This hefty penalty was levied on the grounds that TikTok had breached GDPR requirements, in particular with respect to children’s data. The decision comes after TikTok was fined £12.7m in April 2023 by the UK’s Information Commissioner for unlawfully processing the personal data of around 1.4 million child users under the age of 13. We covered this fine in our May 2023 issue, which you can read here.

As for the DPC, the decision follows a rigorous two-year investigation into the company’s privacy policies and practices, covering the period from 31 July to 31 December 2020. The inquiry looked into the following areas:
  • Platform settings for child users, including the problematic 'Family Pairing' feature;
  • Age verification; and
  • Transparency information for child users.
Upon concluding its inquiry, the DPC submitted a draft decision to the supervisory authorities concerned, following which the European Data Protection Board (EDPB) delivered a 126-page binding decision.

Key Findings

The DPC made the following key findings:
  • By default, the profile settings for child user accounts were configured as public, enabling anyone, whether or not they were TikTok users, to access the content posted by child users. Although children had the option to make their accounts private during setup, they were also given the choice to ‘skip’ this step. This practice fell foul of the principles of data minimisation, data protection by design, and data protection by default. The breach was particularly relevant to children under the age of 13 who had gained access to the platform.
  • The ‘Family Pairing’ setting allowed a non-child user (who could not be verified as a parent or guardian) to pair their account with a child user’s account, which meant that the non-child user could send direct messages to the child. The DPC found that this “posed severe possible risks to child users”, and that the feature violated the security principle and data protection by design.
  • TikTok failed to provide sufficient transparency information to child users. One of the elements the regulator focused on in the decision was the language used to present privacy information in the privacy notice. The EDPB stated that references to terms such as “public”, “anyone” and “everyone” are ambiguous, in that they can be understood as referring to both registered and non-registered users, and that the wording did not allow that distinction to be made clearly.
  • TikTok implemented so-called “dark patterns” by nudging users towards more privacy-intrusive options during the registration process and when posting videos. For context, the EDPB defines “dark patterns” as interfaces and user experiences on social media platforms that nudge users into making unintended, unwilling and potentially harmful decisions with respect to their personal data (in other words, influencing their behaviour). This was found not to be in line with the principle of fairness under GDPR. Notably, the EDPB has issued extensive guidance on the subject of “dark patterns”, which can be found here.
In light of the above, the DPC ordered TikTok to bring its processing of child user information into compliance within three months and issued the fine. The DPC also issued a reprimand in respect of some of the findings.

A spokesperson for the company stated that they “respectfully disagree with the decision, particularly the level of the fine imposed.” They emphasised that the criticised features and settings dated back three years and had been addressed before the investigation began.

Why is this decision relevant?

The Irish DPC's unprecedented €345 million fine against TikTok underscores the growing importance of data privacy regulations, especially concerning children. In our view, the key takeaways of this decision are as follows:
  • Implementing the principles of privacy by design and by default is key to complying with GDPR. It means that data protection considerations must be baked into the systems and platforms that your organisation develops and uses to deliver services. It also requires that, by default, you offer individuals the highest degree of data protection settings; for example, profiles must be set to “private” rather than “public”. As the DPC’s findings suggest, merely having this setting may not be enough if you offer users an option to “skip” the default settings.
  • When providing transparency information (usually in the form of a privacy notice), you must make sure that you are as clear and succinct as possible about your processing activities. Your wording must leave no room for ambiguity or misunderstanding.
  • It’s important to make sure that you allow users to make decisions with respect to their personal data without improper influence, especially where consent is used as a lawful basis. GDPR requires organisations to comply with the principle of fairness, which, among other things, means you must not use unfair practices that run counter to individuals’ expectations. In particular, you must avoid so-called “dark patterns” that nudge users towards options that may be more intrusive to their privacy. Failing this, you may be found to be in breach of GDPR’s requirements.
  • When taking steps to comply with GDPR, you must be aware of the particular circumstances of your data subjects, which may require you to take additional measures to make your products and services more user-friendly. We recently discussed the ICO’s Children’s Code, which could be a helpful resource when designing products and services that involve the collection and processing of child user information.
If you have any queries or would like further information, please visit our data protection services section or contact Christopher Beveridge.
