The Age Appropriate Design Code, also known as the Children's Code, is a British internet safety and privacy code of practice created by the Information Commissioner's Office (ICO). The draft Code was published in April 2019,[1][2] as required by the Data Protection Act 2018 (DPA).[3] The final version was published on 27 January 2020 and took effect on 2 September 2020, with a one-year grace period before the beginning of enforcement.[4][5] The Children's Code is written to be consistent with the GDPR and the DPA, meaning that the Code is enforceable under the latter.
It applies to any internet-connected product or service that is likely to be accessed by a person under the age of 18. It requires online services to be designed in the "best interests" of children and their health, safety, and privacy, requiring that children be afforded the strongest privacy settings by default, that only data strictly necessary to deliver individual service elements be collected from them, and that their personal data not be disclosed to third parties, unless there is justification. It also requires privacy policies and controls to be presented in a manner that is clear and accessible to children, and prohibits the use of dark patterns.
Baroness Beeban Kidron sponsored the amendment to the DPA that mandated the development of the Code.[6] Upon the implementation of the Code in 2021, she explained that "[the Code] shows tech companies are not exempt. This exceptionalism that has defined the last decade, that they are different, just disappears in a puff of smoke when you say, 'actually, this is business.' And business has to be safe, equitable, run along rules that at a minimum protect vulnerable users."[7]
The Children's Code is a code of practice enforceable under the Data Protection Act 2018, and is consistent with the GDPR and the UN Convention on the Rights of the Child. It specifies design standards for information society services (ISS, which include websites, software and apps, and connected toys) that are likely to be accessed by a person under the age of 18 and are based in, or serve users within, the United Kingdom.
The Code requires that services be designed in "the best interests" of children, including their physical and mental health, protecting them from commercial or sexual exploitation, and acknowledging the role of parents and caregivers in protecting and supporting their child's best interests.
The Code specifies that, when used by a child, online services must apply their highest privacy settings by default, unless there is a compelling reason for a different setting, taking into account the best interests of the child. This includes not making a child's data visible to other users, not tracking their location, and not applying behavioural profiling (such as algorithmic curation and targeted advertising, or using data "in a way that incentivises children to stay engaged"). Services must minimise the amount of data collected from children, collecting only data that is strictly necessary to deliver the service elements a child is "actively and knowingly engaged" in. A service may not disclose a child's personal data to a third party without a compelling reason to do so.
Services must present their privacy policy, privacy options, and data export and erasure tools in a clear and age-appropriate manner. They must not use dark patterns to nudge children toward options that reduce their privacy. The Code recommends that privacy settings and tools be tailored to the needs of specific age groups. Under the GDPR as applied in the UK, a user must be at least 13 years old to consent to data processing themselves; for younger children, verifiable consent must be given by a parent or guardian.[8][9]
Social media platforms adjusted their services to comply with the Code: on Instagram, all accounts created by under-18s began to be set to private by default, and adults may not send direct messages to under-18s who do not follow them. TikTok stated that it would not send push notifications to children during evening and nighttime hours, while YouTube stated that it would treat all videos "made for kids" (a designation introduced in 2020 following a ruling and fine under the U.S. Children's Online Privacy Protection Act)[10][11] under the assumption that they were being viewed by a child, including disabling autoplay, personalisation, targeted advertising, and social features.[12][13]
In March 2023, a complaint was filed against YouTube alleging violations of the Code, as the service can track children via devices shared by multiple users.[14]
The Code was adapted by the U.S. state of California as AB 2273, the California Age-Appropriate Design Code Act, which passed in August 2022. Kidron's charity 5Rights Foundation was credited as a supporter and "co-source" of the bill. In September 2023, federal judge Beth Labson Freeman blocked enforcement of the Act, ruling that it likely violated the First Amendment.[15][16][17]