Trials Probe Tech Companies' Responsibility for Sexual Assaults and Abuse
Madeline Batt / Mar 10, 2026

Madeline Batt is the Legal Fellow for the Tech Justice Law Project.

New Mexico Attorney General Raúl Torrez speaks during a rally to protect kids online on Capitol Hill in Washington, Wednesday, Jan. 31, 2024. (AP Photo/Jose Luis Magana)
The Tech Litigation Roundup spotlights notable lawsuits and court decisions across a variety of tech-and-law issues.
The first bellwether social media addiction trial in Los Angeles drew significant attention in February. But two other major trials in cases against Uber and Meta could have similarly important implications for tech accountability, particularly for survivors of sexual violence.
On Feb. 5, an Arizona jury reached a verdict for the plaintiff in the first bellwether trial of a multi-district litigation against Uber related to passenger sexual assaults. Uber argued that it was not responsible for sexual assaults committed by its drivers, whom it classifies as independent contractors. The jury disagreed, awarding $8.5 million in compensatory damages to Jaylynn Dean, who said she was raped by her Uber driver in November 2023. The jury’s decision, and the theories of liability it rejected, will shape future litigation against rideshare apps and inform broader debates about liability in tech litigation involving third-party harm.
Meanwhile, in New Mexico, the first standalone trial by state prosecutors against Meta is underway. The lawsuit by the New Mexico Attorney General echoes the LA social media addiction case by foregrounding the youth mental health harms allegedly caused by platforms’ addictive design. But the New Mexico case goes further, alleging that Meta’s apps created a “breeding ground” for child sexual exploitation while publicly misrepresenting its platforms as safe.
Uber verdict is a win for survivors, despite result on negligence
Uber has argued that, because it categorizes its drivers as independent contractors, it cannot be held liable when those drivers sexually assault or otherwise harm passengers who book rides through the company’s app. In the first jury trial of federal multi-district litigation brought by approximately 3,000 passengers who allege they were sexually assaulted by Uber drivers, the jury decided against the company.
The jurors concluded that Uber was vicariously liable for a rape committed by its driver on a theory of “apparent agency.” Because the lead plaintiff, Jaylynn Dean, interfaced with Uber throughout her ride and reasonably believed the driver was representing Uber, the company was liable for the driver’s wrongdoing. A prior passenger sexual assault trial against Uber, which was litigated in state court, reached the opposite conclusion.
The new verdict impacts the thousands of other cases pending in the multi-district litigation against Uber, where the company now faces a credible risk of tens of billions in losses at trial. The viability of an apparent agency theory of liability also poses a threat beyond Uber to other technology companies that have developed interfaces for engaging with the gig economy. The jury’s decision indicates that gig work platforms’ business model, which often insulates companies from costs associated with the protections workers would receive if they were classified as employees, will not necessarily insulate them from liability for harm to customers.
The jury, however, did not find Uber liable under any direct negligence theories. The jurors did not think Uber’s safety policies were negligent, nor that the app’s safety features were defectively designed. This is notable in light of the evidence presented at trial about Uber’s knowledge of sexual assault risks, both generally and to Dean in particular. Internal documents and executive testimony showed that Uber was aware women riding alone at night faced a heightened risk of sexual assault, but that the company did not publicly disclose these dangers.
Trial evidence also showed that Uber’s internal safety assessment algorithm rated Dean’s ride a 0.81 out of 1, indicating an elevated risk for sexual assault or another serious safety incident, but the company took no action to notify Dean or assign a different driver. The finding contrasts with the state court trial, where the jury did find Uber negligent, but nonetheless concluded that the company was not responsible because the negligence was not a substantial factor leading to the plaintiff’s assault.
Other lawsuits involving a range of tech products have raised negligence claims that similarly rely on evidence of corporate knowledge of their products’ risks to users’ safety. The LA social media addiction case and others in the coordinated proceeding, for example, have produced internal documents showing social media companies were aware of mental health harms associated with their platforms. Plaintiffs are using these documents to argue the companies were negligent.
Because negligence claims are highly fact-dependent, the jury’s finding that Uber was not negligent in Dean’s case is not necessarily predictive of how juries will respond in other cases. Factors such as third-party criminal conduct, the prevalence of harm, and the feasibility of preventive measures could all influence how jurors assess whether a company was negligent. (As the prior state jury’s negligence finding indicates, different juries can also reach different decisions on similar facts.)
The bellwether social media addiction trial now underway may offer additional insight into how juries evaluate a tech company’s duty of care when it knows its product is associated with serious risks to users.
New Mexico trial targets online child sexual abuse through consumer protection and nuisance laws
As survivors secured a major victory against Uber in Arizona, another jury trial began in New Mexico for a case involving Meta’s alleged role in child sexual exploitation on its platforms. The lawsuit is one of many brought by state Attorneys General against social media companies for harm to youth, but it is the first standalone state-led case to go to trial against Meta. Following a months-long undercover investigation by the New Mexico Attorney General’s office, the case also involves extensive allegations about how trafficking and other forms of child sexual abuse flourish on Facebook and Instagram.
The complaint faults Meta both for failing to implement safety measures and for affirmatively facilitating exploitation via its recommendation algorithms. Some of the safety measures that the Attorney General suggests Meta should have implemented––such as eliminating end-to-end encryption from its products, including WhatsApp––implicate countervailing privacy concerns. But the complaint also contains allegations about how Meta’s recommendation algorithm itself promotes trafficking-related content, connects children to abusers, and assists users searching for child sexual abuse material in circumventing blocked content.
Recent reporting from The Atlantic indicates that Meta staff internally acknowledged their algorithms’ role in worsening sexual abuse on its platforms. The complaint links these harms to Meta’s alleged targeting of child users through addictive design features. It ultimately argues that Meta has created a public nuisance and engaged in “unfair and/or unconscionable trade practices,” because its actions are “offensive to public policy” against human trafficking and child sexual exploitation.
In another cause of action, the Attorney General leverages consumer protection law to target the company’s misrepresentations around the safety of its platform. The complaint cites statements by executives, as well as Meta’s use of an allegedly misleading metric it calls “prevalence” to publicly downplay the rate of unsafe or offensive content on its platform. According to the complaint, the prevalence metric only covers reported content that is explicitly found to violate a Meta rule, even as Meta allegedly knew that the vast majority of abusive content goes unreported, failed to maintain adequate staff to determine whether reported content violated its rules, and––in the unusual case where content was reported and reviewed––repeatedly concluded that sexually explicit content involving children did not violate its standards.
With the misrepresentation cause of action, New Mexico could win even if the jury is unwilling to hold Meta responsible for facilitating child sexual abuse on its platforms, because jurors could still reach the separate conclusion that the company has actively misrepresented how much abuse is occurring.
If the Attorney General is successful, Meta could face not only significant financial penalties but also injunctive relief requiring significant changes to its platform and policies. The recommendation algorithms implicated in promoting child sexual abuse content, as well as in the youth mental health harms the complaint also discusses, are a core part of Meta’s current business model.
Sexual violence cases involving tech companies are increasingly common, but trials have so far been rare. Several high-profile actions by survivors against dating apps have been dismissed under Section 230, a statute that immunizes online platforms against lawsuits for third-party content. Nevertheless, litigators are continuing to refine theories of liability and file cases. With both Uber and Meta on trial, the circumstances in which tech companies will be held responsible for sexual violence and other harms are continuing to take shape.
Other tech litigation news
- K.G.M. Bellwether Trial: The first bellwether trial of the California coordinated proceeding on social media addiction was the biggest story in tech litigation this month. Meta CEO Mark Zuckerberg was cross-examined at trial.
- DHS Sued for Abuse of Surveillance Tech: Legal observers in Maine filed a class action lawsuit against the Department of Homeland Security, alleging the agency used surveillance technology to intimidate and punish them for exercising their First Amendment right to observe immigration agents.
- Utah’s Age Verification Law Challenged: The Computer and Communications Industry Association sued to invalidate Utah’s age verification law as a violation of the First Amendment and Commerce Clause.
- Radio Host Sues Over Alleged AI Impersonation: NPR radio host David Greene sued Google for using an artificially generated voice on NotebookLM, its podcast-generating AI tool, that he alleges clearly resembles his own. In statements to the press, Greene emphasized that the harm of hearing his voice used without his consent––including for the spread of disinformation––felt deeper and more personal than a loss of financial opportunity. (TJLP’s policy work on digital likeness seeks to expand legal protections against AI impersonation to reflect these broader, non-commercial harms.)
- NAACP Threatens Lawsuit Over xAI Data Center: The NAACP sent xAI a Notice of Intent to Sue, alleging that xAI is violating the Clean Air Act by operating methane gas turbines without permits to power its Colossus 2 data center in Mississippi. The notice letter is a requirement prior to initiating suit under the Clean Air Act.
- Texas Sues Snap: Texas Attorney General Ken Paxton sued Snap, the developer of the app Snapchat, alleging that the company violated state consumer protection laws via addictive design features and failure to warn users and parents about inappropriate content.
- West Virginia Sues Apple: West Virginia Attorney General John McCuskey sued Apple for failing to detect and report child sexual abuse material stored and shared on iCloud.