Recent court rulings in California and New Mexico mark a significant shift in how tech companies are held responsible for the well-being of young users. Juries have found Meta (formerly Facebook) and YouTube liable for knowingly contributing to mental health issues and exposing children to predators, resulting in judgments totaling hundreds of millions of dollars. This is not just about content; it’s about the design of these platforms.
A Turning Point in Tech Accountability
For years, advocates and former employees have alleged that Meta prioritized profit over safety. The legal system is now validating those claims. A California jury awarded $6 million to a young woman whose depression and anxiety were linked to compulsive social media use, finding that Meta and YouTube acted with malicious intent. In a separate case, a New Mexico jury ordered Meta to pay $375 million for failing to protect users from child predators on Instagram and Facebook, citing violations of consumer protection laws.
These verdicts matter because they establish precedent. Until now, tech companies have largely operated with impunity, shielded by liability protections such as Section 230 of the Communications Decency Act and by a lack of concrete accountability. The courts are now recognizing that platforms designed for addiction cause harm, and that companies bear responsibility for that harm.
Inside Meta’s Knowledge of Harm
The cases rely heavily on internal Meta documents that reveal a calculated approach to exploiting vulnerable users. One internal analysis explicitly identified tweens as the most valuable demographic because of their long-term engagement: children who join early stay longer. A 13-year-old user was internally valued at $270 in lifetime advertising revenue.
Even Meta’s own executives acknowledged the addictive nature of the platforms. Adam Mosseri, the head of Instagram, compared the experience to binge-watching Netflix; his reported compensation of $10 million per year was tied to growth metrics. Whistleblowers like Brian Boland testified that products were launched without safety testing and that concerns were ignored. Former employees say they were instructed to avoid creating discoverable records of Meta’s awareness of harm to children.
The Verdict: Design as the Problem
The lawsuits aren’t about specific posts or content but about the algorithmic loops that maximize engagement at any cost. Infinite scrolling, personalized feeds, and push notifications are designed to keep users hooked, regardless of the psychological impact. This design is particularly dangerous for children, whose brains are still developing.
The juries found Meta negligent in designing platforms that exploit this vulnerability; the California jury assigned 70% of the responsibility for the young woman’s harm to the company. This is a direct rebuke of Meta’s public claims, including statements made under oath before Congress, that social media doesn’t cause harm.
What’s Next?
These verdicts are likely just the beginning. Thousands of similar lawsuits are pending, and these bellwether verdicts will influence their outcomes. Meta has responded with superficial “safety features” that independent evaluations have found largely ineffective. The company’s approach mirrors past corporate tactics: create the illusion of change while preserving profits.
The core problem remains: tech companies’ business models depend on addiction, making self-regulation impossible. The court rulings confirm that external accountability is necessary. The juries have spoken; the question now is whether regulators and lawmakers will follow suit.
