Santa Fe’s Sun, a Jury’s Hammer, and a $375 Million Question for Big Tech
On a bright morning in Santa Fe, where adobe walls warm under a desert sky and courthouse steps have seen protests and prayers alike, the jury returned a verdict that will echo beyond New Mexico’s borders.
After less than a day of deliberation, 12 citizens found that Meta Platforms, the company behind Facebook, Instagram and WhatsApp, broke state consumer protection law. The jury tallied 75,000 individual violations at $5,000 apiece, producing a civil penalty of $375 million.
It was not just the size of the number that mattered. It was the symbolism. For the first time, a jury in the United States has concluded that a major social media company knowingly misled users about the safety of its services and, in doing so, created openings exploited by predators. For a city known for its artists and storytellers, the courtroom became a stage for a story about children, technology and accountability.
The verdict and what led to it
The trial—six weeks of testimony, documents and sometimes harrowing detail—was spearheaded by New Mexico Attorney General Raúl Torrez, a Democrat and former prosecutor who framed the case as protecting the state’s children from corporate indifference. The state had urged the jury to award more than $2 billion in damages; the jury settled on the $375 million number after finding that Meta engaged in unfair or deceptive trade practices and acted unconscionably toward New Mexico residents.
“This is a historic victory for every child and family who has paid the price for Meta’s choice to put profits over kids’ safety,” Torrez said after the verdict, striking a tone part triumph, part admonition. Meta quickly said it disagreed and would appeal. “We respectfully disagree with the verdict and will appeal,” a company spokesperson said, reiterating that Meta believes it works to keep people safe and that identifying bad actors is difficult.
At trial the state presented a covert operation as its opening salvo. In 2023, investigators created accounts on Facebook and Instagram posing as children under 14. The accounts were promptly hit with sexually explicit material and messages from adults seeking contact, evidence, prosecutors said, that predators had virtually unfettered access. Those interactions led to criminal charges against several individuals.
Prosecutors also leaned on internal company documents, including emails, memos and research summaries, which they argued showed Meta understood the risks its products posed to minors. Jurors heard how features like infinite scroll and auto-play videos were engineered to maximize engagement, keeping young users glued to screens and susceptible to harm. Meta's lawyers countered that the company provides robust disclosures and safety tools and cannot be held liable for third-party content that flows across its platforms.
Voices from the courthouse and the community
Outside the courthouse, reactions were raw and varied. “I brought my daughter here today because this felt like a decision about the kind of world she’ll grow up in,” said Elena Ruiz, a high school teacher from Albuquerque. “We’ve all seen how fast things can spiral online. This felt like holding someone responsible.”
Not everyone cheered. Jacob Meyers, who works in tech support and worries about overreach, told me, “I get that predators exist, and we need better tools. But I also worry about a lawsuit culture that stifles innovation and blames systems when the problem is human behavior.”
Experts who followed the case say the New Mexico decision may spark a shift in how courts think about platform responsibility. “This isn’t just a local ruling,” said Dr. Rebecca Lin, a psychologist who studies adolescent behavior and tech. “It signals to other state AGs and plaintiffs that platform design—how algorithms nudge attention—can be scrutinized under consumer protection laws.”
Where this fits in a larger pattern
The Santa Fe ruling arrives amid a tidal wave of litigation and scrutiny. Meta faces thousands of lawsuits nationwide alleging the company deliberately designed apps to be addictive to youths, contributing to anxiety, depression and self-harm. Some lawsuits seek damages in the tens of billions, according to Meta’s own regulatory filings. In California, another jury began weighing related addiction claims, and congressional hearings—sparked in part by a 2021 whistleblower disclosure—have painted a picture of internal research showing potential harms to teens.
Legal battlegrounds are also wrestling with constitutional and statutory shields. Meta has argued that the First Amendment and Section 230 of the Communications Decency Act protect the company from liability for user-generated content. In New Mexico, a judge rejected the Section 230 defense, clearing the path for a jury to consider the claims. Still, appeals are all but certain.
What happens next
For now, the jury verdict is only one chapter. In May, Judge Bryan Biedscheid will hear a bench trial—no jury—on the state’s separate request to declare Meta a public nuisance and to compel changes to platform design. Attorney General Torrez has said he will ask the court for injunctive remedies: effective age verification, better tools to remove predators, and other structural fixes aimed at protecting children statewide.
“Financial penalties are important, but we want long-term changes,” Torrez told reporters. “We want the design choices that make children vulnerable to be unmade.”
- Verdict: Meta found to have violated New Mexico consumer protection law
- Penalty: 75,000 violations @ $5,000 = $375 million
- Next steps: Appeal expected; May bench trial on public-nuisance and remedies
Why people beyond New Mexico should care
The courtroom in Santa Fe was small, but the issues are global. Every app that measures success in minutes spent and clicks generated faces the same tension between engagement and safety. Countries from the UK to India are wrestling with platform regulation; parents everywhere are asking: who protects our children when screens are their classrooms, marketplaces and social commons?
And then there is the cultural dimension. In New Mexico, a state with deep family networks and multigenerational households, concerns about online predators are not abstract. “We teach our kids to be careful crossing the street,” said Maria Gomez, a grandmother who watches her grandchildren after school. “But the danger online feels unstoppable sometimes. We need someone to build a fence.”
That image, a fence, a safety net, is powerful. It invites a broader question: What kinds of fences do we want technology companies to build? Regulations? Transparent algorithm audits? Robust age verification? Or, perhaps, a cultural shift toward design less driven by the attention economy?
The answer will shape the lives of millions of young people. It will determine whether platforms continue to be treated primarily as neutral conduits of third-party speech, or whether the architecture of those platforms will be judged for the risks it creates.
Closing thoughts
Back on the Santa Fe plaza the day after the verdict, an artist painted banners near the courthouse: images of children whose smartphones turned into stars. People walked by, some nodding, some hurried. The case is far from over. Appeals will wind through courts. Trials will continue. And whether you live in a small New Mexico town or a megacity thousands of miles away, you are now part of an unfolding public conversation about safety, corporate responsibility and the kind of digital world we want to inhabit.
So ask yourself: What do you want the next generation to inherit from the internet? A playground with a fence and adult supervision, or a wild park with hidden traps? The answer may determine not only the next legal battle, but the future of childhood itself.