A New Season for Kids’ Privacy: Court Enjoins California’s Landmark Youth Privacy Law — Protecting Children Online Remains a Prime Concern
Summer is definitely over. With the autumnal equinox just days away (Saturday, September 23, to be exact), there’s been a definite shift in the air – and in the children’s privacy world. Just as the fastest sunsets and sunrises of the year happen at the equinoxes, kids’ privacy developments are piling on rapidly right now.
Since the beginning of September, we’ve seen the Irish Data Protection Commission issue a huge, €345 million ($367 million) fine against TikTok for using unfair design practices that violate kids’ privacy. Delaware’s governor just signed a new privacy law that bans profiling and targeted advertising for users under the age of 18 unless they opt-in. And the Dutch data protection authority, just this week, announced an investigation into businesses’ use of generative AI in apps directed at young children.
As I was catching up with these matters yesterday, news broke that a federal district court judge in California had granted a preliminary injunction (“PI”) prohibiting the landmark California Age Appropriate Design Code Act (“CAADCA”) from going into effect on July 1, 2024. The judge ruled that the law violates the First Amendment’s free speech guarantees.
As ESRB Privacy Certified blog readers might recall, in September 2022, California enacted the CAADCA, establishing a far-reaching privacy framework that requires businesses to prioritize the “best interests of the child” when designing, developing, and providing online services. At the time, I wrote that the California law had the “potential to transform data privacy protections for children and teens in the United States.”
In particular, I pointed to the law’s coverage of children under the age of 18, its applicability to all online services “likely to be accessed by a minor,” and its requirement that businesses set default privacy settings that offer a “high level” of privacy protection (e.g., turning off geolocation and app tracking settings) unless the business can present a “compelling reason” that different settings are in the best interests of children. I also noted the Act’s provisions on age estimation/verification, data protection impact assessments (“DPIAs”), and data minimization as significant features.
In December 2022, tech industry organization NetChoice filed a lawsuit challenging the CAADCA on a wide range of constitutional and other grounds. In addition to a cluster of First Amendment arguments, NetChoice asserted that the Children’s Online Privacy Protection Act (“COPPA”), which is enforced primarily by the Federal Trade Commission (“FTC”), preempts the California law. The State of California, represented by the Office of the Attorney General, defended the law, arguing that the “Act operates well within constitutional parameters.”
Yesterday’s PI shifts the “atmospherics” of the kids’ privacy landscape dramatically. But the injunction doesn’t mean that businesses and privacy practitioners can ignore the underlying reasons for the CAADCA (which was passed overwhelmingly by the California legislature) or the practices and provisions it contains. Here’s a very rough analysis of the decision and some tips about what it might mean for your kids’ privacy program.
The Court’s Holding: In her 45-page written opinion, Judge Beth Labson Freeman held that “NetChoice has shown that it is likely to succeed on the merits of its argument that the provisions of the CAADCA intended to achieve [the purpose of protecting children when they are online] likely violates the First Amendment.” The Court held that the CAADCA is a regulation of protected expression, and not simply a regulation of non-expressive conduct, i.e., activity without a significant expressive element. Because it viewed the statute as implicating “commercial speech,” the Court analyzed the CAADCA under an “intermediate scrutiny standard of review.”
The Relevant Test: Under that standard (often referred to as the Central Hudson test, after the name of the Supreme Court case that formulated it), if the challenged regulation concerns lawful activity and speech that is not misleading, the government bears the burden of proving that (i) it has a “substantial interest” advanced by the regulation, (ii) the regulation directly and materially advances that substantial interest, and (iii) the regulation is “narrowly tailored” to achieve that interest.
The Court recognized that California would likely succeed in establishing a substantial interest in protecting minors from harms to their physical and psychological well-being caused by lax data and privacy protections online. Reviewing the CAADCA’s specific provisions, however, it found that many of the provisions challenged by NetChoice did not meet the remaining prongs of the intermediate scrutiny test.
The Court’s Central Hudson Analysis: The Court made findings on each of the specific provisions challenged by NetChoice keyed to the Central Hudson factors. I highlight a few here:
- Data Protection Impact Assessments (DPIAs): The Court held that California did not meet its burden to demonstrate that the requirement for businesses to assess their practices in DPIAs would alleviate, to a material degree, any harms from the design of digital products, services, and features.
- Age Estimation: Judge Freeman also found that the statutory requirement to estimate the age of child users with a “reasonable level of certainty” would likely fail the Central Hudson test: “[T]he CAADCA’s age estimation provision appears not only unlikely to materially alleviate the harm of insufficient data and privacy protections for children, but actually likely to exacerbate the problem by inducing covered businesses to require consumers, including children, to divulge additional personal information.”
- The Court also found that the age estimation provision would likely fail to meet the Central Hudson test because the effect of a business choosing not to estimate age, but instead to apply privacy and data protections broadly, would impermissibly shield adults from content they have a right to access. In reaching this conclusion, Judge Freeman rejected California’s argument that the “CAADCA does not prevent any specific content from being displayed to a consumer, even if the consumer is a minor; it only prohibits a business from profiling a minor and using that information to provide targeted content.”
- Notably, later in the decision, Judge Freeman held that the age estimation provision is the “linchpin” of most of the CAADCA’s provisions and therefore determined it is not “functionally severable” from the remainder of the statute.
- High Default Privacy Settings: The Court found that the CAADCA’s requirement for “high default privacy settings” would be likely to cause at least some businesses to prohibit children from accessing their services and products altogether.
- Profiling by Default: Here, Judge Freeman held that the provision banning profiling of children by default could discard the “beneficial aspects” of targeted information for certain categories of children, e.g., pregnant teenagers.
- Dark Patterns: The Judge held that California did not meet its burden to establish that prohibitions on the use of dark patterns to lead or encourage children to provide unnecessary personal information would ameliorate a causally connected harm.
COPPA Preemption: Although the Court granted the injunction based on First Amendment considerations alone, it did, briefly, address NetChoice’s argument that COPPA preempts the CAADCA. The Court rejected this argument at the PI stage, explaining: “In the Court’s view, it is not clear that the cited provisions of the CAADCA contradict, rather than supplement, those of COPPA. Nor is it clear that the cited provisions of the CAADCA would stand as an obstacle to enforcement of COPPA. An online provider might well be able to comply with the provisions of both the CAADCA and COPPA . . . .”
- N.B. Judge Freeman’s decision to act cautiously on this claim makes sense. Recently, the Ninth Circuit Court of Appeals, in Jones v. Google, overturned her decision that COPPA preempted state law claims asserted in a class action alleging that Google/YouTube used persistent identifiers to collect data and track children’s online behavior surreptitiously and without their consent – conduct that also violates COPPA. Interestingly, in that case, the Ninth Circuit invited the FTC, which enforces COPPA, to express its views on the preemption issue. The FTC accepted, stating that “Congress did not intend to wholly foreclose state protection of children’s online privacy, and the panel properly rejected an interpretation of COPPA that would achieve that outcome.”
Takeaways: The CAADCA litigation is far from over, and it is likely that the California Attorney General will seek an immediate interlocutory appeal. It is clear, though, that the district court’s decision will have consequences in the short term for state privacy laws that are scheduled to come into effect soon as well as for efforts underway in Congress on child-related online privacy and safety legislation. Here are a few takeaways:
- Privacy Laws Can Still Pack a Punch: Regardless of whether the Court ultimately strikes down the CAADCA or not, many of the concepts in the design code are already embedded in other privacy laws that apply to game and toy companies’ activities, both within and outside the United States. On the U.S. front, there are newly enacted child privacy provisions in state laws that should be able to withstand constitutional challenge. Plus, the NetChoice ruling might loosen the California Congressional delegation’s resistance to bipartisan federal legislation. Although some may view the Court’s ruling as a reprieve, companies still need to meet other legal obligations.
- For example, Connecticut recently passed child privacy amendments (scheduled to go into effect on October 1, 2024) to its privacy law that skirt some of the elements Judge Freeman found provisionally unconstitutional. Unlike the CAADCA, the Connecticut law does not require that companies estimate the age of their users; it applies only to companies that have “actual knowledge” of or “willfully disregard” the presence of minor users, and it does not regulate “potentially harmful” (as opposed to illegal) content. Instead of using the CAADCA “best interest of the child” standard, the Connecticut law establishes a duty to avoid a “heightened risk of harm” to minors and delineates potential harms.
- DPIAs are still a “Must Do”: Most of the new state privacy laws passed in the last year contain requirements for data protection impact assessments, similar to those already required by the European Union’s General Data Protection Regulation (GDPR). At the beginning of September, the California Privacy Protection Agency published draft regulations that contain practical examples of how DPIAs should work under California’s comprehensive privacy law. Regardless of what happens with the CAADCA, statutory requirements for more focused DPIAs such as those in the California Consumer Privacy Act will likely remain.
- Judge Freeman’s skepticism about the CAADCA’s DPIA provision aside, DPIAs can be a useful accountability tool for identifying privacy risks, working out when, where, and how likely they are to occur, and assessing the impact of such risks on your customers and business.
- COPPA Continues to Be Relevant: It will probably take years for the court battle over the CAADCA to play out. In the meantime, if you know that children — or teenagers — are using your products, expect the FTC to enforce COPPA and other privacy protections aggressively. (For a quick review of the FTC’s recent COPPA cases, see my previous blog post COPPA Battlegrounds: The Quest to Uncover the Secrets of the FTC’s Kids’ Privacy Actions.)
- Indeed, it’s likely the FTC will use both the substantive provisions of COPPA and the “unfairness” and “deception” prongs of Section 5 of the FTC Act to set requirements for child-friendly privacy disclosures, mandates for high privacy default settings, and prohibitions against manipulative dark patterns through its child-focused investigations and enforcement actions.
- The NetChoice ruling – coupled with Congressional inaction – could also spur the FTC to complete its now-four-years-old COPPA Rule review and act on (at least parts of) last year’s privacy rulemaking proposal.
While this all unfolds, ESRB Privacy Certified will continue to help its program members comply with existing laws and adopt and implement best practices for children’s privacy. As privacy protections for kids and teens continue to evolve, we’ll be following closely and providing guidance to our program members on all of the moving parts of the complex children’s privacy landscape. To learn more about ESRB Privacy Certified’s compliance and certification program, please visit our website, find us on LinkedIn, or contact us at [email protected].
• • •
As senior vice president of ESRB Privacy Certified (EPC), Stacy Feuer ensures that member companies in the video game and toy industries adopt and maintain lawful, transparent, and responsible data collection and privacy policies and practices for their websites, mobile apps, and online services. She oversees compliance with ESRB’s privacy certifications, including its “Kids Certified” seal, which is an approved Safe Harbor program under the Federal Trade Commission’s Children’s Online Privacy Protection Act (COPPA) Rule.