Perspective

The Ninth Circuit Provides a Potential Roadmap for Future Child Safety Laws

Ariel Fox Johnson / May 15, 2026

Ariel Fox Johnson is an attorney and senior advisor on privacy policy for Common Sense Media, a national non-profit based in San Francisco. She filed an amicus brief on behalf of Common Sense Media in support of California’s Age Appropriate Design Code in Netchoice v. Bonta.

Over the past year, the Ninth Circuit has largely rejected attempts by the tech industry trade association NetChoice to invalidate two of California's landmark child privacy and safety laws. In September 2025, the court upheld California's Protecting Our Kids from Social Media Addiction Act (SB 976). And with its March 2026 decision, it has now twice vacated most of the lower court's preliminary injunction against the California Age-Appropriate Design Code (CAADC), which was signed into law in 2022.

Together, these decisions mark an important shift in the legal landscape surrounding online child safety. While the rulings do not guarantee the success of legislation, they could provide a roadmap for advocates and lawmakers seeking to enact and protect online child privacy and safety laws.

The court’s ruling in NetChoice v. Bonta (CAADCII)

In its March 12 ruling in NetChoice v. Bonta, the court held that NetChoice failed to demonstrate that the law's coverage definition—the fact that the law applies to businesses "likely to be accessed by children"—rendered the law content-based and therefore subject to strict scrutiny. The court also found that the record did not demonstrate that the age estimation provision "facially violates the First Amendment at all, much less in a substantial majority of applications," and that NetChoice had not demonstrated that the law was not volitionally severable—that is, that the rest of the law could not remain in effect after certain parts were struck down.

In contrast, the Ninth Circuit affirmed the lower court's findings that requirements governing "data use" that were in the "best interests of children" and prohibiting "material detriment," as well as related design requirements, were unconstitutionally vague. The Ninth Circuit found that, especially at the edges, what was materially detrimental to a child was unclear—such as "sleep loss, distraction, or hurt feelings"—and that there was insufficient guidance for businesses seeking to comply with such a requirement. Similarly, the court found there was no well-understood definition of "best interests of children" in the data privacy context. It rejected California's attempt to analogize the concept to family law proceedings, where the best interests of a child are typically determined case by case, given the specific situation of a unique child.

The Ninth Circuit also reiterated the heavy burden facing NetChoice and others when bringing facial challenges—efforts to get an entire law declared unconstitutional in all applications—after Moody v. NetChoice. The precedent does not mean all legislation can be successfully defended, but it does raise the bar for those challenging such laws.

The case has now been sent back to the district court, where the parties will either continue to litigate or come to an agreement. With respect to this law in particular, both sides can, and did, claim some victory. But overall, it is those supporting, passing, and defending thoughtful children's privacy and online safety laws who have reason for optimism. The ruling shows that, going forward, online safety and the First Amendment can coexist.

The Ninth Circuit’s roadmap for future child safety laws

The Ninth Circuit’s rulings offer important guidance not only for defending existing laws, but also for drafting future legislation that is more likely to survive judicial scrutiny while meaningfully addressing platform design harms.

That roadmap has four parts. First, laws defining covered businesses, including those “likely to be accessed by children,” should do so by referencing their audience composition and not their content, or by referencing design features. Second, if a law requires age estimation, it should ensure the process is privacy-protective and provides options for businesses and individuals. Third, lawmakers should distinguish between the use of age estimation for purposes of providing enhanced privacy protections or protective design features, versus for restricting access to content. And fourth, laws should either avoid vague terms like “best interests of children” in the digital context or define them clearly.

The importance of feature-based rules and a focus on the design of digital products is underscored by other decisions, such as the jury verdict in K.G.M. v. Meta et al., which found Meta and YouTube liable for negligence in their design of addictive platform features.

Thus, rather than closing the debate over platform regulation, the Ninth Circuit begins to define the potential legal contours within which child-protection laws can operate. These include:

Laws focused on sites “likely to be accessed by children” are not necessarily content-based. The decision makes clear that just because laws apply to businesses likely to be accessed by children, that does not render them content-based and subject to strict scrutiny. As the Ninth Circuit noted, laws that target speech based on its content are presumptively unconstitutional and must satisfy strict scrutiny. However, content-neutral laws need only meet intermediate scrutiny. The court found that whether a site is “likely to be accessed by children” may be determined by statutory factors that have nothing to do with the nature of posts or other materials on the site—such as audience composition, which, as the Ninth Circuit noted, “requires looking to data received by the business, not content the business publishes.”

Age estimation can be constitutional and may not trigger First Amendment scrutiny. In this case, the Ninth Circuit acknowledged that the purpose of age estimation affects the constitutional inquiry. Here, the purpose of age estimation was not to provide child-appropriate content, but rather to provide privacy and data protection. Further, as the Ninth Circuit noted, “The provision is clear that businesses that do not wish to conduct age estimation may publish any content they would like, as long as they default to data and privacy protections for all users.” The Ninth Circuit also pointed out that in Free Speech Coalition v. Paxton, the Supreme Court “said nothing about the effect of age estimation on First Amendment burdens generally, especially where age estimation is not required as a precondition to access content.” In fact, the Supreme Court “observed that ‘adults have no First Amendment right to avoid age verification.’”

The Ninth Circuit also noted the lack of a detailed factual record regarding age estimation, “including applications that do not prevent access to content or require data collection for compliance,” and found that “the district court erred by assuming that the only way covered businesses could comply with the provision is by ‘collecting privacy information that users may not wish to share’.” Instead, the Ninth Circuit recognized the State had put forth evidence of methods that “would require no additional data collection.”

Child privacy and safety laws vary widely and merit different constitutional consideration. In determining that NetChoice had not demonstrated the coverage definition was content-based, the Ninth Circuit explicitly distinguished the CAADC from laws that specifically target social media platforms. Many such laws also use age estimation to restrict access to content on social media. Those laws raise different constitutional questions than the Age Appropriate Design Code and other privacy and design feature laws. Regulation targeting design features, and treating these tech platforms as products, has taken on renewed importance following two trials that held social media companies accountable—the March 24 New Mexico Attorney General's successful suit against Meta for misleading consumers about its products' safety and endangering kids, and the March 25 social media addiction lawsuit jury verdict in California.

Terms like “material detriment” and “best interests of children” may benefit from further explanation. Though the Ninth Circuit did not make this point, the US has yet to ratify the United Nations Convention on the Rights of the Child. As such, while the UN has fleshed out these definitions in the digital context, that doesn’t give them clear meaning in US law. Additionally, the concept of what is in a child’s best interest, or what may be of material detriment to them, may be subject to significant debate. Indeed, laws passed since the AADC, including those in Vermont, South Carolina, and Nebraska, have already tightened up their language.

Once a law is passed, facial challenges "must clear a high bar." Lastly, the Ninth Circuit noted that the Supreme Court "has been clear" and has "since emphasized" the high bar for facial challenges. It reiterated that in a facial challenge, a trade group like NetChoice cannot focus only on a law's effects on its members' platforms. This decision reiterates that the circuit will be exacting in developing a factual record, and serves as another warning to challengers to fully develop their case.

Ultimately, the Ninth Circuit's CAADC II decision bodes well for the future of thoughtful privacy and safety laws that can make the internet a safer and healthier place for all kids.

