Meta Was Finally Held Accountable For Harming Teens. Now What?

Meta lost two landmark lawsuits for harming teens. Here's what the court rulings mean for social media, child safety, and the future of Big Tech.
Matilda


For the first time in history, Meta has been held legally responsible in court for endangering the safety and mental health of children and teenagers. Two back-to-back court losses in a single week have cracked open a legal door that could reshape how social media companies build their products — and face consequences for the damage those products cause.

Credit: NurPhoto / Getty Images

Two Landmark Rulings That Changed Everything

The first blow came when a jury found Meta liable for violating New Mexico's Unfair Practices Act after a six-week trial. The verdict imposed the maximum civil penalty of $5,000 per violation, totaling $375 million. That alone was historic.

The very next day, a Los Angeles jury delivered a second verdict — finding Meta 70% liable and another platform 30% liable for the emotional and psychological distress suffered by a 20-year-old plaintiff identified only as K.G.M. The combined damages award in that case came to $6 million. Two other platforms had already settled before the trial even began.

What made both cases groundbreaking was not just the verdicts themselves. It was what they were actually about.

It Was Never About the Content — It Was About the Design

Social media platforms have long been shielded from lawsuits over what users post on their platforms — a protection rooted in Section 230 of the Communications Decency Act, which has insulated Big Tech from accountability for decades. But these cases took a completely different approach.

Instead of targeting posts or comments, the legal teams went after the platforms themselves — specifically, the design features engineered to keep users, including children, hooked. Features like endless scrolling, round-the-clock push notifications, and algorithmic nudges designed to maximize time spent on the app were at the center of both trials.

Attorneys drew a direct comparison to the legal strategy once used against tobacco companies. Rather than arguing about what content appeared on these platforms, they argued that the architecture of the apps was itself the harm. That argument worked.

Internal Documents Revealed a Disturbing Pattern

As the litigation unfolded, a wave of internal company documents became public — and what they showed was damning.

A 2019 internal study, based on 24 one-on-one interviews with users whose behavior had been flagged as problematic, reached a stark conclusion. The report stated plainly that the platform's impact on user well-being was negative. It estimated that roughly 12.5% of users exhibited the kind of problematic use it examined — a significant slice of the total user base.

Other documents revealed that senior leadership, including the company's CEO and the head of Instagram, had discussed prioritizing teen engagement metrics. One employee email referenced the goal of encouraging teens to sneak looks at their phones during chemistry class — complete with a smiley face. Another internal email from a senior product executive acknowledged that no teenager wakes up wanting to open Instagram compulsively, but admitted that was precisely what the product was designed to achieve.

Perhaps most startling was an internal note suggesting that for a live video feature to succeed with teens, the strategy would need to focus on keeping parents and teachers in the dark.

A Former Insider Speaks Out

Not everyone inside the company stayed silent. A former Director of Product Marketing who spent 15 years at the company — and who is currently in a separate legal dispute with Meta over alleged workplace discrimination — said the newly surfaced evidence matched exactly what she witnessed during her time there.

She had personally led the rollout of a virtual reality social platform to teenagers. She said she raised alarms internally about the absence of effective content moderation tools in that environment, but her concerns went unaddressed.

Her firsthand account adds a human face to what the documents describe in cold, corporate language.

What Meta Is Saying Now

Meta has pushed back on both verdicts and announced plans to appeal. In a statement, the company argued that reducing the complex issue of teen mental health to a single cause ignores the broader challenges young people face today and dismisses the real value that digital communities provide in terms of connection and belonging.

A company spokesperson also pointed to Instagram Teen Accounts, a suite of safety features introduced in 2024. Under this system, teen accounts default to private, limit who can interact with them, and receive reminders to leave the app after 60 minutes. For users under 16, changing that time limit requires parental permission.

The spokesperson also noted that many of the internal documents being cited are nearly a decade old and that the company no longer sets internal goals around teen time spent on the platform.

The Floodgates Are Now Open

The legal fallout from these two cases is likely just beginning. Thousands of similar lawsuits are currently pending. Attorneys general from 40 states have already filed cases against Meta modeled closely on the New Mexico lawsuit.

The financial stakes are significant. While $6 million is a small number for a company of Meta's size, that figure multiplied across thousands of individual cases could represent enormous exposure. Legal experts have noted that the precedent set by these verdicts gives future plaintiffs a roadmap — and that roadmap has now been proven to work.

Washington Is Paying Attention — But the Picture Is Complicated

The issue of children's online safety has been building in Washington for years. A major turning point came in 2021 when a whistleblower leaked internal documents showing the company was aware that its photo-sharing platform was negatively affecting the mental health of teenage girls. That moment accelerated congressional interest significantly.

Since then, lawmakers have introduced multiple pieces of legislation aimed at protecting minors online. However, not all of those efforts have earned uniform praise — even from people who have personally lobbied for stronger protections.

The former Meta executive who raised internal concerns about teen safety later became an advocate for one of the most prominent children's online safety bills moving through Congress. But as that legislation evolved, she changed her position. She now urges lawmakers to vote against the current version, specifically because of language in the bill that would override state-level regulations and effectively close courthouse doors to families, school districts, and states attempting to hold platforms accountable. That is precisely the kind of legal action that just produced historic results in New Mexico and Los Angeles.

Privacy advocates have raised their own concerns, arguing that some legislative proposals, while framed around protecting children, would in practice expand surveillance of adults and restrict protected speech far beyond what any legitimate child safety goal would require.

Where Do We Go From Here?

The conversation about what to do next is just beginning — and the people closest to the issue say the answer will not be simple.

The former insider turned advocate put it plainly: the solutions that actually protect children will need to be complex, nuanced, and honest about competing priorities. Platitudes and political theater, she argued, have dominated the conversation for too long. What is needed now is a genuine reckoning — from the industry, from lawmakers, and from society — about what it means to build products aimed at children without regard for the consequences.

These two court verdicts will not fix everything. But they have, for the first time, established that a social media company can be found legally responsible not for what people say on its platform, but for how the platform itself was built. That changes the ground beneath the entire industry.

The next moves — in courtrooms, in Congress, and inside tech companies — will determine whether this moment becomes a turning point or just another chapter in a story that has been unfolding for years.
