Thursday, February 19, 2026

Zuckerberg: Meta is moving away from maximizing user screen time

Meta may have to pay damages if it loses the case, and the verdict could erode Big Tech's longstanding legal defense against claims of user harm.

In the Courtroom’s Bright Light: Zuckerberg, Instagram and a Generation on Trial

Los Angeles, mid-morning: sunlight slants through the high glass of the courthouse and paints the mahogany benches with a band of hot gold. Cameras click. Lawyers shuffle papers. And, at the center of it all, Mark Zuckerberg sits under oath—one of the most recognizable figures of the internet age, answering questions about the very apps that shape how young people see themselves.

This is not Washington. This is a jury trial with a plaintiff whose childhood, she says, was reshaped by social media. It’s a case that could ripple into boardrooms and classrooms around the globe. It’s part legal dispute, part morality play, and part public reckoning with the attention economy that has driven tech companies for the past decade.

The Moment on the Stand

When asked about his 2024 testimony to Congress—where he told lawmakers that Meta did not instruct teams to maximize users’ time—Zuckerberg was direct.

“If you are trying to say my testimony was not accurate, I strongly disagree with that,” he said in court, according to observers in the room. He didn’t simply deny. He sought to explain that company priorities have evolved, that internal goals from years past do not define today’s approach.

But the plaintiff’s lawyer, Mark Lanier, produced emails from 2014 and 2015 in which Zuckerberg appears to lay out ambitions for lifting user time on the platform by “double-digit percentage points.” The contrast between past directives and present testimony created an electric tension in the courtroom—one the jurors must now resolve as they weigh intent against outcomes.

Human Faces Behind the Headline

The woman at the center of the lawsuit says she began using Instagram as a child. She alleges that design choices and company priorities accelerated a slide into anxiety, depression and suicidal thoughts. Her legal team argues the companies profited by keeping young people engaged while knowing — or ignoring — the harms that could follow.

Outside the courtroom, conversations with parents and teens give texture to the legal argument. “My daughter would scroll for hours and then cry about herself,” said Maria Alvarez, a mother of two in Echo Park. “I don’t know if the app made it worse, but I know it changed our nights.”

“It’s engineered,” offered a former product designer who left a major social platform. “Features are optimized to trigger emotion, and emotion is sticky. That’s how engagement metrics rise.”

Not everyone sees social apps as purely harmful. “Instagram was how I found my voice in high school,” said Jonah, 22, who grew up in suburban Ohio. “It was also where I learned to edit, to create. The platforms are complicated tools.”

Evidence, Internal Research, and the Public Record

Investigative reports over recent years have revealed internal documents from Meta showing staff awareness of risks—particularly around teens and body image. One finding that reverberated last October suggested that teens who reported Instagram made them feel worse about their bodies were more exposed to “eating disorder–adjacent content” than peers who did not express that distress.

Meta counters that it has implemented safety features and points to independent findings—citing a panel from the U.S. National Academies that said research has yet to establish a definitive causal link between social media use and changes in children’s mental health. The company also notes that many young people report positive experiences online: community, identity, creative outlets.

These competing truths—documented harm and documented benefit—make the courtroom a difficult place for simple answers. Jurors are asked to parse intentions, product roadmaps from a decade ago, and the messy intersections of childhood, technology and mental health.

Why This Case Matters Beyond One Plaintiff

What’s at stake is not only potential damages for this plaintiff but the future contours of tech accountability. If juries begin finding platforms liable for youth harms tied to product design, the legal landscape that has protected social platforms for years could shift.

Already, governments are moving. Australia has limited access for users under 16 on some platforms; in the U.S., Florida has put restrictions on under-14 access that tech trade groups are challenging in court. Across Europe, countries such as Ireland, France and Spain have debated tighter rules. Families, school districts and states in the U.S. have filed thousands of lawsuits alleging that tech companies contributed to a youth mental health crisis.

Consider the scale: Pew Research Center reported in 2018 that 95% of U.S. teens had access to a smartphone and 45% said they were online “almost constantly.” The World Health Organization has long flagged mental health among adolescents as a global priority, with suicide among the leading causes of death in young people. Against that backdrop, questions about design, addiction, and regulation are not merely legal—they are social and ethical.

Possible Outcomes and Broader Ripples

  • A legal victory for the plaintiff could prompt sweeping design changes and open the door to more negligence claims.

  • A ruling for Meta could reinforce the company’s defense and leave regulation to legislators rather than juries.

  • Either way, the trial amplifies a global conversation about how societies balance innovation with safety for children.

Reading Between the Lines: A Cultural Frontline

This trial lands in a wider cultural moment. Tech companies have built empires on attention, and attention can be both currency and casualty. Teenagers today are growing up with mirrors held up by algorithms—mirrors that can distort and magnify every insecurity. At the same time, social apps can connect a bullied teen with a supportive community across town or across the globe.

“We are asking a jury to decide how to weigh a company’s internal incentives against a person’s life,” said a legal scholar observing the case. “That’s heavy civic work.”

So, what should we make of all this? Is the answer stricter laws, smarter design, parental guidance, digital literacy in schools, or some combination? Perhaps the trial’s real contribution will be to sharpen the public debate, forcing designers, policymakers and families to confront uncomfortable trade-offs.

Closing Questions

As the case continues, I find myself asking: can platforms be engineered to protect without stifling creativity or connection? Can we create incentives that prize wellbeing alongside engagement? And what responsibility do we, as citizens and parents, bear in the design of our children’s digital worlds?

In the courthouse hallway, a teacher from South L.A. paused on her way out and said, “We can’t rewind the last ten years. But we can decide what the next ten look like.” That sentence—simple, stubborn, true—lingers longer than any headline. It is, perhaps, the point of this trial: not just to assign blame, but to illuminate a path forward.