Juries Are Delivering Justice and Holding Big Tech Accountable. Will Congress?
Jennie DeSerio, Lori Schott / Apr 17, 2026
Attorney Mark Lanier speaks during a news conference after the verdict in a landmark trial over whether social media platforms deliberately addict and harm children at Los Angeles Superior Court, Wednesday, March 25, 2026, in Los Angeles. (AP Photo/William Liang)
Before the trials and the headlines, there were two children: Annalee, a beautiful farm girl from Colorado, and Mason, a much-loved athlete from Arkansas. Annalee loved animals and rodeo life; she dreamed of working for the Humane Society. She organized community blood drives, starred in school plays, and quietly showed up for people when it mattered. Mason was the boy who ran late to class because a friend needed him. He sat with a terrified kindergartner during a tornado warning, holding her hand and telling jokes until she felt safe. Empathy and kindness were these children’s hallmarks — not the causes of their deaths, but the traits that made them more vulnerable to systems engineered to amplify emotion, reward obsessive engagement, and monetize attention.
With recent jury verdicts in California and New Mexico, courts sent a powerful message to Silicon Valley: tech platforms can be held accountable for how their design features and business practices harm users, particularly children. For our children, that message came too late.
After our children died, we tried everything we could to put one foot in front of the other and navigate our new normal. The landmark trials put us through it all again. We listened to Big Tech and their lawyers tell the world that we and our children were to blame — a profound act of cruelty that weaponizes the already unbearable agony we know all too well. Juries in two separate courts looked at the evidence and rejected that narrative.
A New Mexico jury ordered Meta to pay $375 million for misleading users about platform safety and enabling child sexual exploitation. The very next day, a Los Angeles jury found Meta and YouTube negligent, concluding that their apps were deliberately designed to be addictive, and that executives knew their platforms were harming young people and failed to protect them.
Internal documents unsealed before and during the California trial showed that tech executives designed their platforms to be addictive and that they were aware of the negative effects these platforms were having on children. In one example, YouTube's parent company compared certain product features to "slot machines," while Meta employees said the company's tactics reminded them of tobacco companies. Internal memos from Meta revealed that 11-year-olds were four times as likely to keep returning to Instagram as to competing apps – even though the platform is officially restricted to users aged 13 and older. Another document showed executives discussing the company's goal to "win big with teens." Jurors saw all of it, and they believed it.
Not only do we have to live with the trauma of losing our children, but we also have to fight back against vicious attacks on our characters, our parenting, and even our love for our children. This cruel narrative couldn't be further from the truth. We were present for our children — we noticed changes, asked questions, and, like so many other parents, we were told these platforms were safe. In fact, unsealed documents show that Meta's own internal research found that parents were powerless to stop the addiction.
These verdicts are not the end of the road for Meta, or for the industry. More trials are coming, more families are stepping forward, and the pressure on these companies is only growing. As these cases move forward, courts will continue to show that these companies refused to make changes on their own.
Courts have exposed the extent of this wrongdoing. Juries have addressed individual cases and will continue to do so. Now Congress needs to act to prevent more harm to our children.
Instead, the US House of Representatives is advancing the Kids Internet and Digital Safety Act (KIDS Act), a piece of legislation that does more to shield tech companies than to safeguard children. It would explicitly state that social media platforms owe no duty of care to their users, absolving Big Tech of responsibility when children are put at risk. The KIDS Act would allow tech companies to write their own rules, deciding which policies and practices to follow, and shielding them from current and future state laws that enforce a higher standard. By preempting state laws, the bill would weaken existing online safety protections and undermine parents’ and state governments’ ability to hold social media companies accountable in court.
Our roles as parents didn't end when we lost our children. We'll continue to fight for our kids and work tirelessly to make sure that their stories are heard, tech is held accountable, and no other family has to suffer through the same pain.
Families have spent years asking lawmakers to address children's online privacy and safety; instead, the House advanced the KIDS Act, which does the opposite. We are asking Congress to pass legislation that requires platforms to protect children by design and that preserves the legal rights of families like ours. We can prevent the deaths of more children.