The ICO’s Age Appropriate Design Code: Transparency and Fairness
This post discusses Standards 3, 5 and 13 of the draft Age Appropriate Design: A Code of Practice for Online Services published by the UK’s Information Commissioner’s Office (ICO).
Standard 3 would require all privacy disclosures, as well as other published terms, policies and community standards, to be “concise, prominent and in clear language suited to the age of the child[ren]” who are likely to use the online service. In addition, just-in-time “bite-size” notices are strongly recommended. For connected toys and devices, Standard 13 of the Code would require “clear information indicating that the product processes personal data at the point of sale and prior to device set-up,” including on the packaging of the physical product and in the product leaflet or instruction booklet.
Robust, detailed information would always be required for parents. The Code, however, would also require age-appropriate disclosures to children, for which the ICO provides specific guidelines broken down by age group. (As discussed in a prior post, “children” includes all users under 18 years old.) For example, for users ages 6 through 9, the Code would require cartoons, videos or audio materials to sit alongside the disclosures meant for parents; it also addresses the content of those materials, the resources that should be provided to parents, and how a service should handle a child’s attempt to change a default high-privacy setting. These guidelines differ for each age group.
Perhaps most importantly, Standard 3 (along with Standard 5) demonstrates the ICO’s intent to reach beyond privacy in the name of “fairness.” Specifically, in Standard 3, the ICO states that “If you aren’t clear, open and honest about the services that you provide and the rules that govern that service, then your original collection and ongoing use of the child’s personal data is unlikely to be fair.” The ICO fleshes this out further in Standard 5, where it utilizes the “fairness” requirement in Article 5(1) of the GDPR, given effect in the UK by the Data Protection Act 2018 (DPA 2018), to shoehorn in non-privacy requirements. The Code states that if a provider does not actively enforce and uphold its own published terms, policies and community standards (including, but not limited to, privacy policies, age restrictions, behavioral rules, and content policies), then the collection of personal data could be deemed “unfair” and, therefore, a violation of Article 5(1).
By way of example, the Code states that if a provider says it actively monitors user behavior, or offers real-time, automated, or human moderation of chat functions, then it would be required to do so. Conversely, if the controller relies on “back-end” processes, such as user reporting, to identify behavior that breaches its policies, then it would be required to make that clear in its disclosures. Moreover, the Code states that “If the risks are high then ‘light touch’ or ‘back end only’ processes to uphold your standards are unlikely to be sufficient.” For example, “If you say that you will not tolerate bullying then you need to have adequate mechanisms in place to swiftly and effectively deal with bullying incidents.”