Children’s Code

The Information Commissioner's Office (ICO) has published ‘Age Appropriate Design: A Code of Practice for Online Services’ (the Children’s Code) to help organisations safeguard children's personal data online. It applies to ‘information society services likely to be accessed by children’ in the UK and covers a wide variety of products and services, such as apps, connected toys and devices, search engines and social media platforms, whether or not these services are specifically directed at children. While the code is not a recent development, the current focus on children’s data in both the UK and the EU makes it important to remind organisations of its significance.

The Children's Code was established under the UK Data Protection Act 2018 with the goal of providing guidelines for data protection measures that meet the developmental needs of children and give them control over their personal data. It follows the principles of the United Nations Convention on the Rights of the Child (UNCRC). In fact, the Children’s Code adopts the convention’s definition of a child (i.e., a person under 18), expanding the range of data subjects warranting special protection beyond those under the age of 13.

The ultimate purpose of the Children’s Code is to ensure that online services likely to be accessed by children are suitable for their use and meet their developmental needs. It sets out 15 standards of age-appropriate design that follow a risk-based approach, with a view to providing children with the best possible access to online services while minimising data collection and use by default.

The Children's Code places a strong emphasis on the best interests of children, ensuring that only the minimum necessary personal information is collected and stored. Sharing children's data without a valid reason is not allowed. Moreover, the Children’s Code requires that privacy settings be set to the highest level by default unless there is a ‘compelling reason’ to do otherwise, and that children and their parents and/or guardians be given greater control over those settings. The Children’s Code also requires that profiling and geolocation options be disabled by default unless it is in the best interests of the child to enable them. At the same time, it prohibits nudge techniques that encourage children to provide more personal data or to turn off privacy protections (so-called ‘dark patterns’).
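
By way of illustration only, the sketch below shows how such high-privacy defaults might be expressed in a service’s settings logic. The names, types and functions are our own assumptions for the purpose of the example; the Children’s Code does not prescribe any particular implementation.

```typescript
// Illustrative sketch only: the names, types and defaults below are our own
// assumptions and are not prescribed by the Children's Code or the ICO.

interface ChildPrivacySettings {
  profileVisibility: "private" | "contacts" | "public";
  profilingEnabled: boolean;      // behavioural profiling / personalisation
  geolocationEnabled: boolean;    // location tracking or sharing
  shareWithThirdParties: boolean; // disclosure of data to third parties
}

// Highest-privacy defaults for users identified (or assumed) to be children:
// profiling, geolocation and third-party sharing are switched off unless a
// documented compelling reason or best-interests justification supports
// enabling them.
function defaultChildSettings(): ChildPrivacySettings {
  return {
    profileVisibility: "private",
    profilingEnabled: false,
    geolocationEnabled: false,
    shareWithThirdParties: false,
  };
}

// Any departure from the defaults should be evidenced so the decision can be
// demonstrated under the accountability principle.
function relaxDefault(
  settings: ChildPrivacySettings,
  change: Partial<ChildPrivacySettings>,
  documentedJustification: string
): ChildPrivacySettings {
  if (documentedJustification.trim().length === 0) {
    throw new Error("A documented justification is required before relaxing a child default.");
  }
  // In a real service the justification would be logged for audit purposes.
  return { ...settings, ...change };
}
```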

Naturally, the transparency principle remains in the limelight. More specifically, the code requires that information provided to children be clear, prominent, and written in language appropriate to their age.

Why is the Children’s Code significant and what does it mean for me?

As highlighted above, the Children’s Code has a very specific scope of application. The first step is to examine your current services to establish whether they fall within its ambit. If your products or services are indeed subject to the Children’s Code, you will need to ensure that you are aligned with the code’s standards. As a starting point, this could include promptly conducting a data protection impact assessment (DPIA) to assess the risks related to the processing.

Your online services will need to establish the age of each individual user with a level of certainty that is appropriate to the risks that arise from your data processing. Cognisant that each service is unique, the ICO avoids specifying what level of certainty different methods of age assurance provide. This means that the onus is ultimately on controllers to demonstrate accountability – i.e., their compliance with the data protection requirements.
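
As a purely illustrative sketch, the snippet below shows one way a service might map its own assessment of processing risk to an age assurance approach. The risk tiers and assurance methods are assumptions made for this example; the ICO does not prescribe any fixed mapping, and the appropriate method will depend on the service and its DPIA.

```typescript
// Illustrative sketch only: the risk tiers and assurance methods below are
// assumptions for this example; the ICO does not prescribe a fixed mapping.

type ProcessingRisk = "low" | "medium" | "high";

type AgeAssuranceMethod =
  | "self-declaration"   // lowest certainty about the user's age
  | "age-estimation"     // e.g. estimation techniques with moderate certainty
  | "hard-verification"; // e.g. verified identity checks with high certainty

// The greater the risk the processing poses to children, the greater the
// certainty about age the service should seek; the choice and its rationale
// should be recorded (for example in the DPIA) as accountability evidence.
function selectAgeAssurance(risk: ProcessingRisk): AgeAssuranceMethod {
  switch (risk) {
    case "low":
      return "self-declaration";
    case "medium":
      return "age-estimation";
    case "high":
      return "hard-verification";
  }
}
```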

Decisions related to children must be well reasoned, justified and explained. To help with this, the ICO has created a self-assessment risk tool that assists medium to large private, public and third-sector organisations in conducting their own risk assessment and determining the impact of the UK GDPR and the Children's Code on their digital service. The tool provides practical steps for organisations to adopt a risk-based approach to ensuring the protection and privacy of children. Additionally, the toolkit displays each standard alongside the relevant articles of the UK GDPR and the UNCRC, clearly highlighting the risks and harms to children. This allows organisations to tailor their assessments to address the specific risk areas identified for children.

We note that the ICO provides multiple guidelines and tools for organisations to complete a best interests of the child self-assessment at different stages of the assessment cycle. For example, the regulator has developed the Best interests of the child roadmap, Privacy moments maps and Best interests framework, which organisations can tailor to fit their needs.

Finally, remember that the ‘best interests of the child’ should be your guiding principle when dealing with children's personal information. To implement this standard, you should consider the needs of child users and determine how best to support those needs in your online service design while processing their personal data. Organisations must remember that adopting a child-centred approach to processing children's personal data extends beyond merely complying with data protection laws and focusing solely on ‘protection’. It entails balancing the interests of children and the online service provider while keeping the interests of children at the forefront.

Failure to comply with the Children’s Code and, more broadly, with data protection law as it relates to children may result in significant financial consequences for your organisation, as well as damage to its reputation. In this regard, we would like to draw your attention to the ICO’s recent enforcement action against TikTok for the misuse of children’s data.

If you have any queries or would like further information, please visit our data protection services section or contact Christopher Beveridge.
