The Information Commissioner’s Office (ICO), an independent regulatory office that aims to “uphold information rights in the public interest”, has published a new Age Appropriate Design Code to protect children online.
The code, the draft of which was first reported in April 2019, is scheduled to come into force by autumn 2021.
It consists of 15 measures that it states will provide better protection for young people when they are spending time online, whether they are using apps, browsing social media platforms or playing online games.
Elizabeth Denham, information commissioner, told the PA News Agency that she believes the implementation of the code will be “transformational”.
“I think in a generation from now when my grandchildren have children they will be astonished to think that we ever didn’t protect kids online,” Ms Denham said.
“I think it will be as ordinary as keeping children safe by putting on a seat belt.”
Ms Denham added that while GDPR “requires special treatment of children”, the 15 standards outlined by the new code “will bring about greater consistency and a base level of protection in the design and implementation of games and apps and websites and social media”.
Andy Burrows, head of child safety online policy at the NSPCC, said the code will require tech companies to “assess their sites for sexual abuse risks” and prevent them from permitting “harmful self-harm and pro-suicide content” on their sites for the first time.
“It is now key that these measures are enforced in a proportionate and targeted way,” Mr Burrows stated.
Here are the 15 measures that are being put into place as part of the ICO’s Age Appropriate Design Code:
1. Best interests of the child
In accordance with the United Nations Convention on the Rights of the Child, the age appropriate code emphasises that the “best interests of the child should be a primary consideration”.
2. Data protection impact assessments
This measure outlines that data protection impact assessments must be undertaken by tech firms in order to “identify and minimise the data protection risks of your service – and in particular the specific risks to children who are likely to access your service which arise from your processing of their personal data”.
3. Age-appropriate application
The ICO states that online companies must take the age range of their users into account, explaining that assessing the individual needs of children at various stages of development “should be at the heart of how you design your service and apply this code”.
4. Transparency
Transparency, the code states, “is about being clear, open and honest with your users about what they can expect when they access your online service”.
The ICO adds that acting in a transparent manner is already outlined as part of GDPR, explaining that it is essential when processing people’s personal data.
In a tweet posted on 22 January 2020, the ICO (@ICOnews) announced: “Today we have published the Children’s Code. Elizabeth Denham, Information Commissioner comments on what this means and how it will help protect children’s privacy within the digital world.”
5. Detrimental use of data
According to the Age Appropriate Design Code, it is important for companies to refrain from using data “that is obviously detrimental to children’s physical or mental health and wellbeing or that goes against industry codes of practice”.
6. Policies and community standards
This measure states that when a tech firm has published community rules and conditions, it is vital that it adheres to them.
“Keeping to your own standards should also benefit you by giving children and their parents confidence that they can trust your online service with their personal data,” the ICO says.
7. Default settings
The code states that the default privacy settings implemented by tech companies should be “appropriate” for children.
8. Data minimisation
Data minimisation, the ICO explains, means “collecting the minimum amount of personal data that you need to deliver an individual element of your service”.
“It means you cannot collect more data than you need to provide the elements of a service the child actually wants to use,” the organisation adds.
9. Data sharing
The code outlines that taking data sharing into consideration is especially important when it comes to children, as sharing children’s personal data could put them at risk.
“The best interests of the child should be a primary consideration for you whenever you contemplate sharing children’s personal data,” it states.
10. Geolocation
The ICO stresses that the use of children’s geolocation data is “of a particular concern”, as having access to the location of a child could pose a threat to their “physical safety”.
“In short it can make children vulnerable to risks such as abduction, physical and mental abuse, sexual abuse and trafficking,” the office writes.
11. Parental controls
The ICO explains that if an online company utilises parental controls, then the child should be made aware of the controls that are in place to regulate their online activity.
12. Profiling
The code says that profiling – which is “any form of automated processing of personal data consisting of the use of personal data to evaluate certain aspects relating to a natural person” – should only be permitted if a company has enforced “appropriate measures” to protect child users.
Anne Longfield (@annelongfield) tweeted on 22 January 2020: “Kids have waited a long time to get the protection they need online & @ICOnews code is a game changer – forcing tech companies to build new safety standards into design or face serious fines. Promised online harms laws are now needed to make the digital space a safer one for kids.”
13. Nudge techniques
Nudge techniques, the ICO explains, are online cues which influence how a user may use a website, such as by encouraging them to click large, colourful buttons.
The organisation states that nudge techniques could be used to encourage children to “select less privacy-enhancing choices when personalising their privacy settings”, thus putting them and their personal data at greater risk.
14. Connected toys and devices
Some children’s toys and devices are designed to be able to connect to the internet, a feature that the ICO says raises “particular issues” due to “their scope for collecting and processing personal data”.
15. Online tools
Online tools are “mechanisms to help children exercise their rights simply and easily”, the code outlines.