🔴 Breaking · Law & Technology

Meta Hit with Double Blow in Historic Child Safety Verdicts

Two juries in two states find Meta liable on the same day — a $375 million child exploitation penalty in New Mexico and a $3 million addictive design verdict in Los Angeles — the first time U.S. juries have successfully pierced social media companies' legal shields over harm to minors.

March 25, 2026 · 6 min read

Historic First: Wednesday, March 25, 2026 marks the first time U.S. juries have successfully held a social media company liable for harming children — with two separate courts delivering verdicts against Meta on the same day. New Mexico: $375M civil penalty. Los Angeles: $3M compensatory damages with a punitive phase pending.

A Watershed Day for Tech Accountability

Wednesday, March 25, 2026 will be remembered as a turning point for the legal status of America's largest social media platforms. Two separate juries, operating independently in different states and on different legal theories, delivered liability verdicts against Meta Platforms on the same afternoon — representing the first time in U.S. history that juries have successfully pierced the legal shields that have historically protected social media companies from accountability for harm to minors.

The twin defeats expose Meta to immediate financial penalties totaling $378 million, with additional punitive phases in both cases that could dwarf those initial figures. More consequentially, both verdicts may establish precedent that other plaintiffs and state attorneys general can deploy in the hundreds of similar cases currently working their way through courts across the country.

Verdict 1 — New Mexico: $375 Million for Child Exploitation

A Santa Fe jury ordered Meta to pay $375 million in civil penalties on Wednesday afternoon, concluding a six-week trial led by New Mexico Attorney General Raúl Torrez.

The case centered on Operation MetaPhile, a state-run undercover investigation in which New Mexico law enforcement agents posed as minors on Facebook and Instagram. Over the course of the operation, agents documented thousands of instances where Meta's recommendation algorithms actively connected the "child" accounts with adult predators — in some cases suggesting predator accounts to child accounts unprompted as "People You May Know."

Jurors found Meta liable for violating New Mexico's Unfair Practices Act by systematically misleading parents and regulators about the safety of its platforms for minors. The $375 million figure reflects the maximum civil penalty of $5,000 per documented violation, multiplied across the thousands of algorithmic connections the state was able to prove.
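The penalty arithmetic can be run in reverse to gauge the scale of the state's evidence. A minimal sketch, under the assumption (not stated in the reporting) that the $5,000 statutory maximum was applied uniformly to every violation:

```python
# Penalty arithmetic implied by the New Mexico verdict.
# Assumption: the statutory maximum of $5,000 was applied to every
# documented violation; the reporting gives only the $375M total.
total_penalty = 375_000_000   # total civil penalty awarded, in dollars
max_per_violation = 5_000     # Unfair Practices Act cap per violation

implied_violations = total_penalty // max_per_violation
print(implied_violations)     # → 75000 documented violations
```

Under that assumption, the verdict implies the state documented on the order of 75,000 individual algorithmic connections between child accounts and predators.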

"Meta was not just negligent," Attorney General Torrez said after the verdict. "Meta's systems were actively working to connect children with predators, and the company knew it. This verdict says that is not acceptable and it has a price."

New Mexico — What Comes Next: A second trial phase begins May 4, 2026, during which a judge (not the jury) will determine additional financial penalties and potentially mandate specific platform design changes — including compulsory age verification systems on Facebook and Instagram.

Verdict 2 — Los Angeles: $3 Million for Addictive Design

Just hours after the New Mexico verdict, a Los Angeles jury broke a nine-day deliberation deadlock to find both Meta and YouTube (Google) liable for the mental health struggles of a 20-year-old woman identified in court records as Kaley G.

The Los Angeles case represented a distinct legal theory: rather than alleging specific acts of exploitation, plaintiffs argued that Meta and YouTube's core product features — infinite scroll, autoplay video, and push notification systems — were defective designs intentionally engineered to maximize engagement time at the expense of user wellbeing. The suit argued these features were specifically effective at creating compulsive use patterns in adolescent brains, causing documented psychological harm.

The jury awarded $3 million in compensatory damages, apportioned across the two defendants: Meta is responsible for $2.1 million (70%) and YouTube for $900,000 (30%). The significantly higher Meta share reflects the jury's finding that Instagram's design — and specifically its recommendation and notification architecture — bore greater responsibility for Kaley G.'s documented harm than YouTube's autoplay system.

Los Angeles Trial: Key Findings

The Allegation: Infinite scroll, autoplay, and push notifications are "defective" designs intended to hook children and adolescents.
Total Award: $3 million in compensatory damages.
Liability Split: Meta $2.1M (70%); YouTube/Google $900K (30%).
Finding of Malice: The jury found both companies acted with malice, oppression, and fraud, triggering a separate punitive damages phase.
What's Next: A new trial phase will determine punitive damages, which in California can multiply compensatory awards many times over.

Why These Verdicts Are Different

Plaintiffs and state attorneys general have filed hundreds of cases against Meta over child safety in recent years. Most have failed at the pleading stage, dismissed on the grounds that Section 230 of the Communications Decency Act immunizes platforms from liability for third-party content. The companies have also successfully argued — until now — that product design decisions like algorithmic recommendations are protected editorial choices, not actionable defects.

Both Wednesday verdicts found ways around those defenses. The New Mexico case focused on Meta's own algorithmic conduct — the active recommendation of predator accounts to child accounts — rather than on any content the predators posted, sidestepping the Section 230 shield. The Los Angeles case relied on product liability law: arguing that a design feature can be defective and harmful regardless of the content it surfaces, in the same way a defective seatbelt can cause injury regardless of the car crash that triggered it.

Legal scholars watching both trials say the two theories are potentially more dangerous to Meta's long-term legal exposure than any single large verdict, because they provide roadmaps that other plaintiffs can now follow in jurisdictions across the country.

Meta's Response

Meta released a statement Wednesday evening saying it "strongly disagreed" with both verdicts and intended to appeal. The company emphasized its existing child safety investments — including age verification testing, parental supervision tools, and content restrictions for users under 18 — as evidence that its platforms are "not designed to harm children."

Meta's legal team is expected to mount aggressive appeals in both cases, arguing in New Mexico that the Unfair Practices Act does not apply to algorithmic recommendation systems, and in Los Angeles that the product liability defect theory is preempted by federal communications law.

Wall Street reacted immediately. Meta shares fell approximately 4% in after-hours trading following news of the dual verdicts, erasing roughly $50 billion in market capitalization on concern that both the financial penalties and the legal templates they establish could compound significantly through 2026 and 2027.

The Bigger Picture

Wednesday's verdicts arrive as Meta is simultaneously managing the fallout from a separate round of layoffs — reorienting its capital away from Reality Labs and toward AI infrastructure — and facing increasing regulatory scrutiny in the European Union, the United Kingdom, and several U.S. states.

For the broader social media industry, the day represents a potential inflection point. Google's YouTube was found liable alongside Meta in Los Angeles, suggesting that the addictive design theory is not specific to Meta's architecture. If punitive damages in that case are substantial, every major platform with recommendation and autoplay systems faces related exposure.

The Senate Judiciary Committee has already announced it will hold a hearing on the verdicts and their implications for federal child safety legislation — including whether Congress should explicitly carve child safety claims out of Section 230 protections.

Tags

#Meta · #Child Safety · #Lawsuit · #Instagram · #Facebook · #YouTube · #New Mexico · #Los Angeles · #Tech Law · #Social Media Liability


Written by

Alfansa

Legal & Technology Reporter

Part of ObjectWire coverage