The Verdict
A New Mexico state court jury found Meta liable for nearly $400 million in civil damages. The finding: Meta failed to protect children from predators on Facebook and Instagram. The state attorney general proved that Meta knew its platforms were being used to target minors and did not act.
This is not a fine from a regulator. This is a jury of citizens saying a $1.5 trillion company failed at the most basic responsibility — keeping kids safe.
What Was Proven
The trial centered on allegations that Meta violated New Mexico consumer protection laws and actively misled residents about the safety of its apps. The evidence showed that Meta had internal data about predatory behavior on its platforms and chose engagement metrics over child safety.
The algorithm that keeps you scrolling is the same algorithm that connects predators to children. Meta knew this. The jury agreed.
Why This Matters Beyond the Fine
$400 million is a rounding error for Meta. They made $134 billion in revenue last year. But the precedent is not about the dollar amount. It is about liability.
Every other state attorney general is now looking at this verdict and asking: can we do the same thing? If New Mexico won on consumer protection grounds, California, Texas, New York, and the other 46 states have the same laws. This is the beginning of a wave, not an isolated case.
For $META shareholders, the risk is not this verdict. It is 50 more of them.
The Regulatory Trajectory
Congress has been talking about children's online safety for years and doing nothing. Juries are doing what Congress will not. KOSA, the Kids Online Safety Act, has stalled repeatedly. State-level litigation is filling the vacuum.
Meta, TikTok, Snap, and X all face similar exposure. But Meta is the biggest target because it has the most users, the most data, and now the most damning verdict on record.
