Children and the GDPR: how will the new ICO Code protect children online?

The dark side of the internet has become all too apparent over the last decade. In a study conducted by Ofcom, around 1 in 10 children aged 8-11 said they had encountered distressing or worrying content online (Ofcom 2016). Law enforcement agencies and organisations such as the NSPCC have highlighted the growing problem of online vulnerability and the grooming of children. In the UK alone, it is estimated that nearly 80,000 residents pose a sexual threat to children online. Separate figures showed that between 2017 and 2018, police in England and Wales recorded 23 online child sexual offences every day, up from 15 a day in the previous 12 months.

The key concern is not just the vulnerability and safety of young children on the internet, but also their exposure to harmful or distressing content. With children now learning how to use a remote control or tablet before they can walk, stricter measures need to be in place to protect their rights.

The ICO and the new Code

The Information Commissioner’s Office (‘ICO’) is an independent regulatory body that aims to “uphold information rights in the public interest”. As well as being responsible for enforcing the GDPR in the UK, it has introduced a new Age Appropriate Design Code to protect children online. The Code consists of fifteen standards that will afford better protection for young people when they spend time online, whether on social media platforms, through online gaming or on the internet more generally.

While the GDPR already requires “special treatment” for children, the fifteen standards in the new Code are intended to provide greater consistency, clarity and a baseline level of protection in the design and implementation of games, apps, websites and, in particular, social media platforms.

The Secretary of State laid the new Code before Parliament under section 125(1)(b) of the Data Protection Act 2018. The ICO issued the Code in August and it came into force on 2 September 2020, with a twelve-month transition period. The transition period is intended to give businesses adequate time to update their practices and ensure their services conform to the new standards.

What are the fifteen standards?

The 15 standards of the Age Appropriate Design Code are as follows:

  1. Best interests of the child
  2. Data protection impact assessments
  3. Age appropriate application
  4. Transparency
  5. Detrimental use of data
  6. Policies and community standards
  7. Default settings
  8. Data minimisation
  9. Data sharing
  10. Geolocation
  11. Parental controls
  12. Profiling
  13. Nudge techniques
  14. Connected toys and devices
  15. Online tools

Essentially, the new Code requires organisations and companies to take the GDPR principles into account and apply them through the lens of ‘the best interests of the child’.

How will this affect businesses and organisations?

The ICO Code takes a risk-based approach, so not all organisations will be affected in the same way. Developers of online products and services that fail to comply within the 12-month transition period face serious enforcement action from the ICO, including fines. Social media platforms, education websites, online games, streaming services and apps are likely to be hit hardest, given how much more time children have spent on them during the pandemic. For those using online advertising, the Code may also affect their ability to share data with ad-tech companies.

The ICO’s enforcement powers include fines of up to £17.5m or 4% of annual worldwide turnover, whichever is higher, for breaches of the GDPR. In the case of breaches involving children’s data, the ICO has warned organisations that where it sees “harm or potential harm to children” it will “likely take more severe action against a company than would be the case for other types of personal data.”

How will the new measures protect children online?

The ICO argues that the new ‘design-led’ approach is a key aspect of the reform, particularly since children are spending more time online following the Covid-19 pandemic. The Code makes clear that children are not adults and that stricter protections therefore need to be put in place to ensure that, as technology progresses, children are not exposed to undue harm or risk.

As the standards include greater transparency, restrictions on data sharing and stricter policies and community standards, many games and social media developers will need to carry out risk and data protection impact assessments to ensure children are adequately protected. Over the next 12 months, the online landscape will change to better protect the rights of children.
