Meta Verdict Sharpens EdTech Litigation Against Google’s Chromebooks
Danai Nhando / Apr 16, 2026
New Mexico Attorney General Raúl Torrez speaks during a news conference to call on Congress to pass legislation to protect kids online on Capitol Hill in Washington, DC on Tuesday, April 14. (AP Photo/Jose Luis Magana)
For nearly three decades, technology companies have operated under a reliable legal premise: that Section 230 immunity, combined with the sheer complexity of how platforms are built, made it effectively impossible for any plaintiff to hold them accountable at trial for harming children. Last month, a Santa Fe jury dismantled that premise, awarding $375 million in civil penalties against Meta for failing to protect children from exploitation.
The verdict against Meta was returned on both claims the state of New Mexico brought under its Unfair Practices Act: that Meta made false or misleading statements about the safety of its platforms, and that it engaged in unconscionable trade practices that exploited the vulnerability and inexperience of children. It is the first time any state enforcer has prevailed at trial against a major technology company over such claims.
The theory that won at trial and its link to education technology
To understand why that verdict matters for child safety rights in EdTech, one has to understand precisely what it established and what it did not. The Santa Fe jury was not asked whether Meta bears responsibility for content that users posted on its platforms. That theory would have died on Section 230 grounds before it reached a courtroom. Instead, state Attorney General Raúl Torrez built the case entirely around Meta's own first-party conduct: its design choices, its internal safety overrides and its public misrepresentations about platform safety.
Section 230 immunizes platforms for what third parties say on them. It should not immunize platforms for what they themselves say, nor for the design decisions they themselves make. That gap, combined with New Mexico's Unfair Practices Act, was the legal aperture through which a jury drove the $375 million judgment.
The evidence now in the public record shows how Meta's own engineers and executives repeatedly warned about child sexual abuse material proliferating on its platforms and about algorithms amplifying harmful content to drive engagement, as well as how the company overrode those warnings for commercial reasons. It reveals a persistent pattern in which documented risk was subordinated to growth metrics at the expense of child safety.
A day after the New Mexico verdict, a Los Angeles jury found Meta and Google-owned YouTube negligent in platform design and awarded $6 million in total damages in a bellwether personal injury case involving youth social media addiction.
Two juries in two jurisdictions accepted the same underlying argument, that harm flowed from the companies' own design choices rather than from user content, and delivered verdicts with far-reaching consequences.
No company should be reading those verdicts more carefully than Google. An estimated 93% of United States school districts planned to purchase Chromebooks in 2025, and the devices already command 60% of the global education device market. Two active cases filed by the EdTech Law Center against Google LLC tell a sharply different but structurally resonant story.
Z.G. v. Google LLC, filed in California state court in June, centers on a ten-year-old plaintiff who was sexually victimized by an “anonymous predator” through Discord, a platform she accessed via a school-issued Chromebook. The complaint alleges that Google shipped devices to schools with “virtually unrestricted” internet access by default, effectively giving an elementary-school child unsupervised exposure to harm the company could have prevented.
M.C. v. Google LLC, filed by a Utah family anonymously on behalf of a minor in California federal court in October 2025, alleges that the child developed a pornography addiction after accessing pornographic content through a school Chromebook, and that Google's deliberate design choices, driven by its data-monetization model, made that outcome foreseeable and preventable. Both complaints allege that Google marketed these devices to school districts as safe educational tools while deliberately designing them without meaningful content restrictions.
Those cases likely just got materially stronger because of the Meta verdict, and the reasons why expose the structural deception at the heart of the EdTech industry's relationship with children.
The legal architecture of those complaints is comparable to the framework that just won at trial in Santa Fe: design-defect liability premised on the defendant's own choices and misrepresentations, structured to avoid Section 230. The difference is not the theory, but the population of children it concerns — and that difference makes the Google cases compelling.
The children in the Chromebook cases were students, required by their schools to use the products as a condition of participation in their own education. In my view, children constitute Big Tech's most structurally captive population: legally compelled to attend school, institutionally required to adopt whatever technologies their institutions procure and developmentally incapable of meaningfully consenting to the behavioral data extraction those platforms perform.
Google's school-device program works commercially precisely because that captivity is compulsory. Schools mandate the product, students have no meaningful alternative and Google gains access to a captive population generating data across every school day and into the evening hours.
In federal court last October, EdTech Law Center attorney Julie Liddell argued: "If teachers were doing what Google is doing, namely surveilling children all day while they are at school, even when they're at home on devices, and logging every single thing they're doing … I don't think we would have a problem saying that's a problem."
The coming reckoning
The Santa Fe verdict did not end with the jury's $375 million award. New Mexico's Torrez has said he intends to seek court-mandated changes to Meta's platforms, specifically to implement real age verification, algorithmic changes, an independent monitor and a fundamental restructuring of how Meta does business with minors in New Mexico.
That ambition points toward structural injunctive relief, the approach that ultimately proved more consequential than monetary damages in past litigation against the tobacco industry, where the real legacy was not the settlement checks but the forced disclosure of internal research, the advertising restrictions and the mandated redesign of how an industry was permitted to reach children. A damages award compensates the children who were already harmed. An injunction protects the children who are sitting in classrooms right now.
The EdTech Law Center's demands in the Google cases could follow that model. The most concrete form would be an injunction requiring that school-issued Chromebooks default to a locked, minor-protective configuration that blocks behavioral advertising and unauthorized data collection.
What stands between that outcome and the present moment is discovery. The question now is whether the courts evaluating the Chromebook cases will permit the factual investigation that would allow a California jury to get answers, under oath and on the record, on what Google knew about the risks its school devices posed to children, when it knew it and what it chose to do instead. The Meta trials established that internal documents exist at these companies, including engineering memos, safety team warnings and executive overrides, that tell a story the public-facing product never did. Google's internal record on Chromebook safety architecture has not yet been fully compelled into a courtroom.
The structural irony at the center of both Google cases is precisely the kind of contradiction that juries, it turns out, are very capable of recognizing. A company that markets Chromebooks as simple, powerful, and secure for US schoolchildren may have built that technology with child safety left out of the design equation as a calculation, not an oversight.
The students in school districts using Chromebooks have had no say in the devices they use, no ability to evaluate the architecture of the device placed in their hands and no exit option. What the New Mexico verdict demonstrated is that corporate language about safety and the reality of how products are actually built are two different things, and children have been the unwitting pawns of that difference long enough.