The ICO (Information Commissioner's Office, the UK's data regulator) has drafted proposals to help keep children safer online.
Under these proposals, unveiled on 21 January 2020, websites will be required to do more to protect the safety and privacy of minors online. Named the Age Appropriate Design Code, it is expected to come into effect by autumn 2021 if approved by Parliament. Any firm breaching the rules could be fined up to 4% of its global turnover.
The move has been welcomed by many, including children's charities; for some time, big firms have been warned about children being exposed to pornography, gambling, cyberbullying and self-harm content.
Andy Burrows, head of Child Safety Online Policy at NSPCC, has said the move will “force social networks to finally take online harm seriously and suffer tougher consequences if they fail to do so. For the first time, tech firms will be legally required to assess their sites for sexual abuse risks, and no longer serve up harmful self-harm and pro-suicide content. It is now key that these measures are enforced in a proportionate and targeted way”.
Information Commissioner Elizabeth Denham believes it will be transformational. She adds “I think in a generation from now when my grandchildren have children they will be astonished to think that we ever didn’t protect kids online. I think it will be as ordinary as keeping children safe by putting on a seat belt.”
Denham noted that the gaming industry and some other tech companies had expressed concerns about the impact on their business models, but said the move was broadly supported across the sector.
She added: “We have an existing law, GDPR, that requires special treatment of children and I think these 15 standards will bring about greater consistency and a base level of protection in the design and implementation of games and apps and websites and social media.”
The 15 standards include:
Best interests of the child: The best interests of the child should be the primary consideration when designing and developing any online service likely to be accessed by children.
Data protection impact assessments: Firms will be required to assess the risks to the "rights and freedoms" of children likely to access an online service.
Age-appropriate application: Firms should take a "risk-based approach to identifying the age of individual users".
Transparency: Any privacy information provided must be clear, transparent and presented in a way a child can understand.
Detrimental use of data: Children’s personal data must not be used in any way that will harm their well-being and must adhere to all regulations.
Policies and community standards: Firms must uphold their own published terms, policies and community standards.
Default settings must be set to “high privacy”.
Data minimisation: Any data collected must be kept to the minimum required to allow the child use of the service.
Data sharing: Children's data must not be disclosed unless a compelling reason to do so can be demonstrated.
Geolocation: Geolocation features must be switched off by default.
Parental controls: Children should be given age-appropriate information about any parental controls in use.
Profiling: Profiling options will be switched off by default, and permitted only where there is no risk of the child being harmed by the resulting content.
Nudge techniques: Nudge techniques must not be used to encourage a minor to disclose unnecessary personal information or to turn off privacy protection settings.
Connected toys and devices: Any connected toys and devices should include sufficient tools to ensure they comply with the code.
Online tools: Children will be given prominent, easy-to-use tools to exercise their data protection rights.
For further and more in-depth information, please visit: