The Children’s Code: Protecting children’s privacy online
In case you missed it, the Children’s Code (formally known as the Age Appropriate Design Code) came into force on 2nd September and organisations have a twelve-month transition period to comply.
For some businesses, protecting children’s online privacy is clearly relevant and complying with the code will already be firmly on the agenda. For others, however, it may not be so obvious.
What is clear is that it’s up to you to assess whether your online services are ‘likely’ to be used by children. And if you think they aren’t, you should document why.
The scope of the code is pretty broad – it applies to relevant online services likely to be accessed by children. This means it’s worth checking the following:
- What’s meant by a child?
- What online services are covered?
- What does ‘likely’ to be accessed by a child mean?
Here’s an overview to help answer these questions. But first a bit about what the code is and what it aims to achieve…
What is the Children’s Code?
It’s a statutory code of practice aimed at protecting children’s privacy when they’re online. The code introduces 15 standards organisations need to meet.
These standards are aimed at making sure online services safeguard children’s personal data. They’re not technical standards per se, instead focusing on principles and privacy features. To comply, it’s likely different technical solutions will need to be adopted by different services.
This UK code is a first, but the Information Commissioner’s Office believes it reflects the direction of travel across the world. Announcing its arrival, Elizabeth Denham, the Information Commissioner, said:
“A generation from now we will all be astonished that there was ever a time when there wasn’t specific regulation to protect kids online. It will be as normal as putting on a seatbelt.
“This code makes clear that kids are not like adults online, and their data needs greater protections. We want children to be online, learning and playing and experiencing the world, but with the right protections in place.
“We do understand that companies, particularly small businesses, will need support to comply with the code and that’s why we have taken the decision to give businesses a year to prepare, and why we’re offering help and support.”
We all know (especially during lockdown) that children from a young age spend a lot of time online. There are real concerns that, every time they open an app or play a game, data may be gathered about them.
This data may then be used to encourage them to spend more time using a service or to tailor advertisements – this might be appropriate for adults but not for minors.
The ICO stresses the desire is not to restrict children from benefiting from the digital world, rather to make sure the amount of data collected about them and its use is minimised.
All businesses that fall under the code’s scope need to show they’ve taken the necessary measures. Those who don’t comply, but should, are warned they will have difficulty demonstrating compliance should the ICO come knocking, and regulatory action could be more likely.
How old is a child?
The code adopts the definition of a child under the UN Convention on the Rights of the Child (UNCRC), which is anyone under the age of 18. This means the code applies to online services likely to be accessed by older children aged 16 and 17, not just young children.
This shouldn’t be confused with the age of consent for a child, which for online services is 13 in the UK (but may vary across EU countries from 13 to 16).
What online services does the Children’s Code cover?
The code applies to what are termed ‘relevant information society services’. Put simply, Information Society Services (ISS) means online services. The code explains:
“Essentially this means that most online services are ISS, including apps, programs and many websites including search engines, social media platforms, online messaging or internet based voice telephony services, online marketplaces, content streaming services (e.g. video, music or gaming services), online games, news or educational websites, and any websites offering other goods or services to users over the internet. Electronic services for controlling connected toys and other connected devices are also ISS.”
It also doesn’t matter if your service is free. For example, a free online game or search engine funded by advertising falls under the definition of an ISS.
Also, not-for-profit apps and educational sites are in scope. Small businesses with websites selling products online or offering an online-only service via the website are also in scope.
The code does NOT apply to:
- Some public authority services – for example, an online public service which is not provided on a commercial basis, or a police force with an online service which processes personal data for law enforcement purposes.
- Information about your business – if you operate a website which just provides details about your real-world business with no ability to buy products online or access a specific online service. An online booking form for an in-person appointment is also not an ISS.
- General broadcast services – scheduled TV or radio programmes to a general audience broadcast over the internet that are not at the request of an individual. BUT on demand services are covered.
- Preventative or counselling services – for example websites or apps which specifically provide online counselling or other preventative services to children. BUT more general health, fitness and wellbeing apps are covered.
How do you determine ‘likely’ to be accessed by a child?
Clearly if your service is aimed at under 18s you’re fully in scope, but the code also covers services which may not be specifically ‘aimed or targeted’ at children, but are likely to be accessed by them.
The word ‘likely’ was deliberately chosen to make sure services children are in reality using aren’t excluded.
Organisations therefore need to judge whether their services are likely to be used by children. The code gives some pointers for assessing this:
- Is the possibility of this happening more probable than not?
- Is the nature and content of the service appealing to children even if not intended for them (remember this includes older 16 and 17 year olds)?
- Do you have measures in place to prevent children gaining access to an adult only service?
Organisations are faced with a risk-based decision about whether it would be proportionate to conform or not with the code.
What about services ‘unlikely’ to be accessed by children?
There’s an important point here. The code states:
“If you decide that your service is not likely to be accessed by children and that you are therefore not going to implement the code then you should document and support your reasons for your decision. You may wish to refer to market research, current evidence on user behaviour, the user base of similar or existing services and service types and testing of access restriction measures.
If you initially judge that the service is not likely to be accessed by children, but evidence later emerges that a significant number of children are in fact accessing your service, you will need to conform to the standards in this code or review your access restrictions if you do not think it is appropriate for children to use your service.”
It’s significant that there’s an expectation organisations will need to evidence a decision not to conform with the code.
What are the 15 standards of the code?
If you judge the code does apply to you, here is a top-level summary of the 15 standards:
- Best interest of the child – you need to consider the needs of children using your service and how best to support those needs. The best interests of the child should be your primary consideration when designing and developing your online service.
- Data Protection Impact Assessments (DPIAs) – these should be carried out to identify and minimise any data protection risks to children.
- Age appropriate application – you need to assess the age range of your audience. The needs of children at different ages should be central to your design and development. You also need to make sure children are given an appropriate level of protection about how their information is used.
- Transparency – GDPR specifically states privacy information must be easy for children to find and easy for them to understand. The code says you should consider bite-sized, ‘just in time’ notices, when you are collecting children’s data.
- Detrimental use of data – you must not use children’s information in a way which would be detrimental to children’s physical or mental health and well-being. You should not use the data in a manner which would go against industry codes or practice, other regulatory provisions or Government advice.
- Policies and community standards – if children provide you with their personal information when they join or access your service, you must uphold your own published T&Cs, policies and community standards. Children should be able to expect the service to operate in the way you say it will and for you to do what you say you are doing.
- Default settings – privacy settings for children must be set to ‘high’ by default, unless you can demonstrate a compelling reason for taking a different stance. It’s not enough to allow children to activate high privacy settings, you need to provide these by default.
- Data minimisation – you should only collect and keep the minimum amount of data about children that’s necessary to provide your service.
- Data sharing – you should not share children’s data unless you can demonstrate a compelling reason to do so.
- Geolocation – any geolocation privacy settings should be switched off by default. Children should have to actively change the default setting to allow their geolocation to be used (unless you can demonstrate a compelling reason for them to be switched on).
- Parental controls – you need to make it clear to children if parental controls are in place and if they are being tracked or monitored by their parents. You should also provide information to parents about the child’s right to privacy under UNCRC.
- Profiling – options which use profiling should be switched off by default (again, unless you can demonstrate a compelling reason for these to be switched on). This doesn’t mean profiling is banned, but if it is conducted you should make sure you have measures in place to protect children from harmful effects. If profiling is part of the service you are providing, you need to be sure this is completely necessary.
- Nudge techniques – you should not use techniques which lead or encourage children to activate options which mean they give you more of their personal information or turn off privacy protections.
- Connected toys and devices – you need to conform with the code if you sell toys or devices (such as talking teddy bears, fitness bands or home-hub interactive speakers) which collect personal data and transmit it via a network connection. If your electronic toys or devices do not connect to the internet and only store data in the device itself the code does not apply.
- Online tools – it must be easy for children to exercise their data protection rights and report any concerns. The right to erasure is particularly relevant when it comes to children. Mechanisms used to help children exercise their rights should be age appropriate, prominent and easy to use.
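To make the default-settings standards above more concrete, here is a minimal sketch (in Python, using entirely hypothetical names – the code itself mandates no particular implementation) of how a service might encode ‘high privacy by default’ for child accounts, with geolocation, profiling and data sharing switched off unless actively changed:

```python
from dataclasses import dataclass

@dataclass
class ChildAccountSettings:
    """Hypothetical defaults reflecting the code's privacy-by-default standards."""
    privacy_level: str = "high"                   # Default settings: high privacy by default
    geolocation_enabled: bool = False             # Geolocation: off unless the child actively opts in
    profiling_enabled: bool = False               # Profiling: off by default
    data_shared_with_third_parties: bool = False  # Data sharing: needs a compelling reason

# A new child account starts with the most protective settings;
# any relaxation requires an active, informed choice by the user.
settings = ChildAccountSettings()
```

The point of the sketch is simply that the protective position is the starting state: a child should never have to find and activate high privacy settings themselves.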
The Code is a significant step-change in online safeguarding – it impacts on many businesses who’ve never had to consider whether children were interacting with their online platforms.
As ever, proportionate but effective tools and policies will be required to show you’ve given due regard to the code, especially by those businesses likely to attract younger users. Risk-based decisions will be required, based on evidence, adding to existing impact assessments.
The full code can be found here: The Children’s Code
Philippa Donn, September 2020
Here at the DPN we’ll keep abreast of any developments in this area, standing ready to assist any organisation requiring further guidance or assistance. Get in touch for an informal chat.
The information provided and the opinions expressed in this document represent the views of the Data Protection Network. They do not constitute legal advice and cannot be construed as offering comprehensive guidance on the EU General Data Protection Regulation (GDPR) or other statutory measures referred to.