Online services face scrutiny over use of children’s data
The importance of compliance with the UK Children’s Code
Social media platforms. Content streaming services. Online gaming. Apps. Any online service popular with children carries an inherent privacy risk.
Along with growing concerns over protecting them from harmful content, there’s an increasing focus on children’s privacy. Beyond the companies you’d expect to be impacted, it’s worth remembering these issues can affect a growing number of other organisations.
We know children are being exposed to inappropriate and harmful content. Some content is illegal, like child sexual abuse images or content promoting terrorism, but other material can still cause harm, such as content promoting eating disorders, content which is inappropriate for the age of children viewing it, or content which is overly influential.
The Information Commissioner’s Office (ICO) recently launched an investigation into TikTok, amid concerns over how the platform uses children’s data – specifically around how their data is used to deliver content into their feeds. The regulator is also investigating the image sharing website Imgur and the social media platform Reddit, in relation to their use of children’s data and their age verification practices.
These investigations form part of wider interventions into how social media and video sharing platforms use information gathered about children. The ICO says it’s determined to continue its drive to make sure companies change their approach to children’s online privacy, in line with the Children’s Code, which came into force in 2021.
This all serves as a timely reminder of the need to comply with this Code.
What is the Children’s Code?
The Children’s Code (aka the ‘Age-Appropriate Design Code’) is a statutory code of practice aimed at protecting children’s privacy online. It sets out how to approach age-appropriate design and gives fifteen standards organisations are expected to meet. These are not necessarily technical standards, but rather principles and required privacy features.
Who does the Children’s Code apply to?
A wide range of online services are within scope, including apps, search engines, online games, online marketplaces, connected toys and devices, news and educational sites, online messaging services and much more.
If children are ‘likely’ to access your online service(s), even if they are not your target audience, the code applies. For example, free services, small businesses, not-for-profits and educational sites are all in scope. Companies need to ask themselves – is a child likely to use our product or service online?
What ages does it apply to?
The code adopts the definition of a child under the UN Convention on the Rights of the Child (UNCRC), which is anyone under the age of 18. This means it applies to online services likely to be accessed by older children aged 16 and 17, not just young children. (This shouldn’t be confused with the age at which a child can consent to the use of their data by online services, which is 13 in the UK.)
Who does the Children’s Code not apply to?
Some public authority services are out of scope. For example, an online public service which is not provided on a commercial basis, or a police force with an online service which processes personal data for law enforcement purposes. Preventative or counselling services are also not in scope, such as websites or apps which specifically provide online counselling or other preventative services to children. However, more general health, fitness and wellbeing apps are in scope.
How do you assess ‘likely to be accessed’ by a child?
Crucially, the Code covers services which may not be specifically ‘aimed or targeted’ at children, but are ‘likely’ to be accessed by them. Each provider will need to assess this, and the Code provides some questions to help you:
■ Is the possibility of children using your service more probable than not?
■ Is the nature and content of the service appealing to children even if not intended for them? (Remember this includes older 16-17 year olds)
■ Do you have measures in place to prevent children gaining access to an adult only service?
This assessment may not always be clear cut, but it’s worth noting the Code states:
If you decide that your service is not likely to be accessed by children and that you are therefore not going to implement the code then you should document and support your reasons for your decision. You may wish to refer to market research, current evidence on user behaviour, the user base of similar or existing services and service types and testing of access restriction measures.
If you initially judge that the service is not likely to be accessed by children, but evidence later emerges that a significant number of children are in fact accessing your service, you will need to conform to the standards in this code or review your access restrictions if you do not think it is appropriate for children to use your service.
This means there’s a clear expectation that online services may need to hold evidence if they decide they do not need to conform with the Code.
The 15 standards of the Children’s Code
The Code is extremely detailed. Here’s a summary of the salient points:
1. Best interests of the child – consider the needs of children using your service and how best to support those needs. The best interests of the child should be a primary consideration when designing and developing an online service.
2. Data Protection Impact Assessments (DPIAs) – these should be carried out to identify and minimise any data protection risks to children.
3. Age-appropriate application – assess the age range of your audience. Remember, the needs of children of different ages should be central to design and development. Make sure children are given an appropriate level of protection over how their information is used.
4. Transparency – UK GDPR specifically states privacy information must be easy for children to find and easy for them to understand. The Code says you should consider bite-sized ‘just in time’ notices when collecting children’s data.
5. Detrimental use of data – don’t use children’s information in a way which would be detrimental to children’s physical or mental health and well-being.
6. Policies and community standards – if children provide you with their personal information when they join or access your service, you must uphold your own published T&Cs, policies and community standards. Children should be able to expect the service to operate in the way you say it will and for you to do what you say you are doing.
7. Default settings – privacy settings for children must be set to ‘high’ by default, unless you can demonstrate a compelling reason for taking a different stance. It’s not enough to allow children to activate high privacy settings, you need to provide these by default.
8. Data minimisation – only collect and keep the minimum amount of data about children that’s necessary for you to provide your service.
9. Data sharing – do not share children’s data unless you can demonstrate a compelling reason to do so. The word ‘compelling’ is significant here. It means the bar for justifying any sharing is set very high.
10. Geolocation – any geolocation privacy settings should be switched off by default. Children should have to actively change the default setting to allow their geolocation to be used (unless you can demonstrate a compelling reason for this to be switched on).
11. Parental controls – make it clear to children if parental controls are in place and if children are being tracked or monitored by their parents.
12. Profiling – options which use profiling should be switched off by default (again, unless you can demonstrate a compelling reason for these to be switched on). This doesn’t mean profiling is banned, but if it is conducted, measures should be in place to protect children from harmful effects. If profiling is part of the service you provide, you need to be sure this is completely necessary.
13. Nudge techniques – don’t use techniques which lead or encourage children to activate options which mean they give you more personal information, or turn off privacy protections.
14. Connected toys and devices – you need to conform with the code if you sell toys or devices (such as talking teddy bears, fitness bands or home-hub interactive speakers) which collect personal data and transmit via a network connection. If your electronic toys or devices do not connect to the internet and only store data in the device itself, the code does not apply.
15. Online tools – it must be easy for children to exercise their privacy rights and report concerns. The right to erasure is particularly relevant when it comes to children. Mechanisms used to help children exercise their rights should be age appropriate, prominent and easy to use.
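Several of the standards above (high-privacy defaults, geolocation off, profiling off, no sharing without a compelling reason) can be thought of as an age-keyed defaults policy in your service’s settings layer. The sketch below is purely illustrative – the class and function names are our own invention, not anything mandated by the Code, and a real implementation would depend entirely on your own stack and legal advice:

```python
from dataclasses import dataclass

@dataclass
class PrivacySettings:
    """Hypothetical per-user privacy settings for an online service."""
    high_privacy: bool
    geolocation_enabled: bool
    profiling_enabled: bool
    data_sharing_enabled: bool

def default_settings_for(age: int) -> PrivacySettings:
    """Return privacy defaults for a new account.

    Anyone under 18 counts as a child under the Code, so under-18s
    get high-privacy defaults reflecting standards 7, 9, 10 and 12.
    """
    if age < 18:
        return PrivacySettings(
            high_privacy=True,            # standard 7: high privacy by default
            geolocation_enabled=False,    # standard 10: geolocation off by default
            profiling_enabled=False,      # standard 12: profiling off by default
            data_sharing_enabled=False,   # standard 9: no sharing without compelling reason
        )
    # Adults may receive whatever defaults your service chooses.
    return PrivacySettings(
        high_privacy=False,
        geolocation_enabled=True,
        profiling_enabled=True,
        data_sharing_enabled=True,
    )
```

The point of keying defaults on age, rather than offering a child an opt-in, is that the Code expects the protective settings to be in place without any action by the child.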
The ICO investigations into TikTok, Imgur and Reddit signal a direction of travel for us all: they show how the regulator intends to treat compliance around children’s online privacy, welfare and safeguarding.
If your service uses children’s data, even tangentially, it’s worth revisiting the Code and considering how it might impact your business.