
The addiction lawsuit that could break social media

A court just ruled social media is designed to addict your kids. Here's what that means for the internet.

Lars Becker · /terminally online · 2 hours ago · 3 mins reading time
Image source: Tim Mossholder/Unsplash

For years, tech companies had one get-out-of-jail-free card. A law called Section 230 meant they couldn't be sued for what people posted on their platforms. Every lawsuit from every parent of every damaged kid died in court. The platforms were untouchable.

That changed on March 25.

A California jury ordered Meta and Google to pay $6 million to a woman who became addicted to Instagram and YouTube as a child. The payout itself means nothing to companies that size. What matters is how the case was won.

The case that changed everything

Kaley G. M. started using social media at six years old. By the time she was a teenager she was spending multiple hours a day on Instagram and YouTube. That eventually led to body dysmorphia and thoughts of self-harm.

Lawyers who previously tried to hold platforms accountable always went after the content: harmful posts, dangerous videos, toxic communities. That approach kept failing because Section 230 protected the platforms from being held responsible for what users uploaded.

Kaley's lawyers did something different. They went after the design itself. Autoplay videos that start the next piece of content before you've even decided you want it. Endless scroll that removes any natural stopping point. Personalized recommendation algorithms built to serve you exactly what keeps you watching longest. None of that is user-generated content. All of it is a deliberate product decision made by engineers and executives.

Then came the internal documents. Emails and reports shown to the jury demonstrated that leaders at both companies knew their products were harming young users and pushed forward anyway. That shifted the entire framing of the case. This wasn't a platform passively hosting bad content. This was a company knowingly building something damaging and putting it in front of children.

The jury agreed. TikTok and Snap were also named in the lawsuit but settled before it reached trial.

What it means for social media

One verdict doesn't rewrite the law overnight. But it gives every similar lawsuit in America a working blueprint, and there are thousands of them sitting in courts right now. Lawyers are already comparing this moment to the tobacco litigation of the 1990s, which didn't just cost cigarette companies money. It forced an entire industry to fundamentally change how it operated and what it was allowed to say.

If that comparison holds, the features that make social media so profitable are exactly what's now under threat. Autoplay, infinite scroll, and algorithmic recommendation feeds are the engine behind the engagement numbers that advertisers pay for. Restricting them would mean less time on app, fewer ads served, and significantly lower revenue.

The pressure isn't only coming from courtrooms either. The EU is threatening TikTok with fines worth 6% of ByteDance's global revenue over its addictive design features. Australia banned under-16s from social media entirely in December. The UK and Malaysia are considering similar moves. A survey across 30 countries found majorities everywhere support banning under-14s from these platforms.

Meta and Google are appealing the verdict. But the legal wall that protected the industry for nearly 30 years just developed its first serious crack.