Last month, the United States District Court for the Northern District of California granted NetChoice’s motion for a preliminary injunction enjoining the enforcement of AB 2273, the California Age-Appropriate Design Code Act (“CA AADC” or “Act”). The court, after reviewing ten of the Act’s provisions, held that all ten likely violated the First Amendment. CCIA filed an amicus brief in support of this motion for a preliminary injunction, as did a number of other amici, several of which were cited by the court.
The ruling comes at an important time in the U.S. youth online privacy debate, with several states proposing similar legislation loosely aimed at protecting kids and teens online. Notably, the CA AADC and similar proposals draw heavily on the United Kingdom’s Age-Appropriate Design Code (“UK AADC”). Despite the similarity in names, however, the two codes differ substantially, even more so when considering the inherent differences between the UK and U.S. legal systems, especially regarding the freedom of speech. The UK AADC builds upon an existing regulatory framework and provides guidance and tools for businesses that process the personal data of UK children. Conversely, the CA AADC fails to provide any such guidance on how to operationalize its vague standards. The district court highlighted the shortcomings of the Act throughout the opinion, many of which were initially raised by NetChoice’s counsel.
While the litigation is still ongoing, the NetChoice v. Bonta ruling is significant — reminding lawmakers that the robust protections of the First Amendment still apply in the digital age. This post describes the CA AADC’s key provisions, summarizes the district court’s ruling, and concludes with some takeaways.
The California AADC applies to any business that “provides an online service, product, or feature likely to be accessed by children.” It adopts the same definition of “business” used by the California Consumer Privacy Act and defines “children” as all consumers under the age of 18.
The Act lists several factors for determining whether a business should expect that its online service, product, or feature would be “likely accessed by children.” However, unlike the federal Children’s Online Privacy Protection Act (COPPA), whose restrictions are largely tailored to operators of websites and services directed to children (defined as those under 13), the CA AADC offers no such limitations. The aforementioned factors are so vaguely defined that the Act’s scope could encompass a majority of businesses operating online — including news outlets, video games, educational or credential programs, and even personal forums discussing everyday topics like testing anxiety and cooking.
The CA AADC contains a number of broad prohibitions for covered organizations — restricting businesses from collecting, selling, sharing, or retaining any personal information for most purposes. For example, the CA AADC forbids the use of a child’s personal information in a way the business knows, or “has reason to know, is materially detrimental” to the child’s mental health, physical health, or “well-being,” but fails to define these vague standards.
The Act contains several obligations for covered businesses, including a requirement to complete and document a Data Protection Impact Assessment (DPIA) before new online services, products, or features that are “likely to be accessed by children” are offered to the public. The assessment must identify the purpose of the online service, product, or feature, how it uses children’s personal information, and the risks of “material detriment” to children that arise from the data management practices of the business. Businesses are then required to create a “timed plan to mitigate or eliminate” the documented risks.
In addition to the DPIA requirement, the CA AADC outlines nine other provisions for businesses to follow:
- Age Estimation. Businesses must either estimate the age of child users or apply the privacy and data protections afforded to children to all consumers.
- Default Privacy Settings. Businesses must configure the default privacy settings “provided to children” to their highest level of privacy unless the business can demonstrate a compelling reason that a different setting is in the best interests of the children.
- Enforcement of Terms and Community Standards. Businesses must enforce the terms, policies, and standards published by them.
- The “Knowingly Harmful Use of Children’s Data” Standard. Businesses must not use the personal information of children in any way that the business knows, or should know, is “materially detrimental” to their health or well-being.
- Profiling. Businesses are prohibited from profiling a child unless the business can “demonstrate it has appropriate safeguards to protect children” and either the profiling is necessary to provide the service or product, or there is a compelling reason why profiling is in the best interests of children.
- Restriction on Children’s Data. Businesses must not collect, sell, share, or retain any personal information that is not necessary to provide an online service, product, or feature unless the business can demonstrate a compelling reason that doing so is in the best interests of children likely to access the product, service or feature.
- Unauthorized Use of Children’s Personal Information. Businesses may not use collected personal information for any purpose other than the purpose for which it was collected, unless the business can demonstrate a compelling reason that the additional use is in the best interests of children.
- Dark Patterns. Businesses must not use design techniques that “lead or encourage children to provide personal information” beyond what is reasonably necessary, “to forego privacy protections, or to take any action that the business knows, or has reason to know, is materially detrimental to the child’s health or well-being.”
Covered businesses would have needed to complete the required DPIA reports and comply with the other provisions by July 1, 2024, when the Act was set to take effect. The Act authorizes the state Attorney General to bring a civil enforcement action against non-compliant businesses. Violators are subject to civil penalties of $2,500 per affected child for each negligent violation or $7,500 per affected child for each intentional violation, and the Act places no cap on total penalties.
District Court Ruling
Last December, NetChoice filed suit challenging the CA AADC as facially unconstitutional. The complaint alleged that the Act (1) violates the First and Fourteenth Amendments to the Constitution, (2) violates the Fourth Amendment, (3) is void for vagueness under the First Amendment, (4) violates the dormant Commerce Clause, and is preempted by (5) COPPA and (6) Section 230 of the Communications Act. NetChoice also sought a preliminary injunction, arguing that the CA AADC violates the First Amendment because it is an unlawful prior restraint on protected speech, is unconstitutionally overbroad, and regulates protected expression in a manner that fails to survive judicial review.
On September 18, 2023, the district court enjoined California Attorney General Rob Bonta from enforcing the CA AADC, which had been signed into law roughly one year earlier. The court found that NetChoice had demonstrated a likelihood of success on its First Amendment claim and did not address the other claims in the complaint.
The court’s assessment of whether the CA AADC regulates protected expression proceeded in two inquiries, reviewing the Act’s prohibitions and then its obligations to determine whether each regulated conduct or speech. First, the court found that the Act’s prohibitions on the use of personal information regulate speech, describing how the Act limits the “availability and use” of information by specific speakers and for certain purposes. Second, the court found that the Act’s mandates, including those regarding DPIAs and age assurance, regulate speech. The court explained that several provisions of the Act require businesses to affirmatively disclose information to users and the government, and it rejected as unpersuasive the state’s argument that requiring businesses to consider design features is unrelated to speech.
After finding that the Act likely regulates speech, the court applied intermediate scrutiny for commercial speech, which requires the state to show that it has a substantial interest, that the regulation directly advances that interest, and that the regulation is no more extensive than necessary to achieve it. Even under this lesser standard, the court found that all ten of the challenged provisions, including the DPIA requirement, likely violate the First Amendment. Lastly, the court concluded that because it could not sever the “likely invalid portions of the statute,” it would enjoin the entire Act. The preliminary injunction took effect immediately upon the district court’s ruling.
The district court’s ruling is certain to have an impact beyond California.
First, while state lawmakers may have a substantial interest in protecting the health and well-being of minors, such regulations must be appropriately tailored to achieve that interest. For example, the state alleged that the CA AADC advances its substantial interest by protecting children from being harmed by lax data and privacy protections. However, the court explained that the DPIA requirement failed to achieve this aim: “because the DPIA report provisions do not require businesses to assess the potential harm of the design of digital products, services, and features, and also do not require actual mitigation of any identified risks, the State has not shown that these provisions will ‘in fact alleviate [the identified harms] to a material degree.’” This is especially important considering the growing number of legislative approaches that include age estimation or verification and similar provisions.
Second, this ruling may call into question other online safety and privacy laws in the U.S. Several of the provisions reviewed by the district court, such as dark pattern prohibitions and non-profit exemptions, are standard elements of such laws. For instance, the court questioned the CA AADC’s purpose limitation, which bars using a child’s personal information for any purpose other than the one for which it was collected, finding no evidence that the use of personal information for multiple purposes harms children’s well-being.
Lastly, this ruling may impact other technology-adjacent legislative efforts, even those that differ from the CA AADC in scope and other details. Currently, several states are attempting to regulate the use of artificial intelligence and related tools, including algorithms and automated decision-making systems. Such proposals similarly require organizations to conduct risk assessments but fail to sufficiently define key terms, creating vague standards that are difficult to operationalize.
The district court’s ruling can help inform future legislative efforts so that they avoid both harming children and infringing upon constitutionally protected speech. Creating a safer online environment for all is a worthwhile goal, but it should not come at the expense of disempowering families, conditioning internet access upon facial scans, or worse.