This is the property of the Daily Journal Corporation and fully protected by copyright. It is made available only to Daily Journal subscribers for personal or collaborative purposes and may not be distributed, reproduced, modified, stored or transferred without written permission.

Technology,
Constitutional Law

Feb. 6, 2026

Section 230 and the litigation system it quietly built

Section 230 remains a decisive litigation gatekeeper, but as platforms algorithmically shape and generate content, courts are being forced to redraw lines between user speech and platform conduct.

Shon Morgan

Partner and Chair of the National Class Action and Mass Arbitration Group
Quinn Emanuel Urquhart & Sullivan, LLP

865 S Figueroa St, 10th Floor
Los Angeles, CA 90017

Phone: (213) 443-3252

Fax: (213) 443-3100

Email: shonmorgan@quinnemanuel.com

Harvard Law School, Cambridge, MA



In courtrooms around the country, lawsuits over online content often rise or fall on a question that sounds deceptively simple: Who is the speaker? The user who posted the content, or the platform that hosted, organized or amplified it? The answer frequently determines whether a case ends at the pleading stage or proceeds into years of discovery and potential classwide exposure.

That line-drawing exercise is the day-to-day reality of Section 230 of the Communications Decency Act. From a litigator's perspective, the statute is not just a policy statement about the internet. It is one of the most powerful procedural filters in modern civil litigation, shaping which cases can be filed, how they are pleaded and whether they are resolved early or proceed to full-scale discovery and possible trial.

That gatekeeping function matters because litigation is not a neutral policy tool. Even weak claims can carry significant settlement pressure once discovery costs, reputational risk and the aggregate exposure of a class action enter the picture. Section 230 prevents many claims based on third-party content from moving forward, sparing platforms from the leverage that comes with protracted litigation. Just as importantly, it prevents 50 different state tort regimes from effectively setting speech rules through jury verdicts that would inevitably conflict and sow confusion.

That structure still serves an essential role. Without it, platforms would face strong incentives to remove lawful but controversial speech simply because it creates legal risk. Moderation decisions would be driven less by community standards and more by what might look safest to a jury. The result would be a more cautious, more legally sanitized internet--one shaped heavily by litigation avoidance rather than by open participation.

But the cases being filed today look very different from those of the dial-up era. Platforms no longer simply host user posts. Many now rank, recommend, auto-play, suggest connections and deploy algorithms that determine what content travels furthest and fastest. Increasingly, they also use artificial intelligence tools that summarize, generate or transform content in ways that blur traditional lines between user speech and platform output. Courts must work with statutory language and legal precedents developed in an earlier technological era as they assess new forms of digital conduct that do not fit neatly into "publisher" or "product" boxes.

For plaintiffs' lawyers and victims' advocates, Section 230 can feel less like a speech safeguard and more like a rule that leaves certain claims without a clear path forward. When harms are tied not just to what users say, but to how platforms structure or present information, plaintiffs' lawyers are unsurprisingly asking that those platform choices be treated differently from traditional publishing decisions.

Courts have responded with increasingly nuanced--and sometimes inconsistent--approaches as they try to apply existing legal categories to new technologies. Many cases now turn on whether a claim truly targets third-party content or instead challenges a platform's own conduct, such as product design, warnings or recommendation systems. The doctrinal fight is no longer just "publisher versus speaker." It is whether digital tools that sort, prioritize and surface information should be treated as editorial judgments, product features or something in between.

That evolution also helps explain why Section 230 disputes now look less like mechanical immunity questions and more like careful exercises in line-drawing. Although Section 230 does not purport to alter First Amendment doctrine, it often functions as a kind of procedural fast lane to resolve disputes that sit close to First Amendment concerns. By defining when a platform is treated as the speaker or publisher of third-party content, the statute allows courts to address threshold questions about responsibility for speech without wading into more complex constitutional analysis. Because these issues are adjacent to the First Amendment--and First Amendment inquiries themselves are often highly context dependent--it is neither surprising nor particularly troubling that careful line-drawing now sits at the center of Section 230 litigation.

None of that means the statute is either beyond critique or ready for wholesale repeal. Removing Section 230 would not simply produce more accountability; it would likely produce more aggressive content removal, more cautious platform features and fewer opportunities for ordinary users to speak and share online without heavy moderation or exclusion.

The most productive path forward is unlikely to be found in slogans about immunity or censorship. It lies in the more difficult task of determining when claims are truly about third-party content and when they are directed at a platform's own conduct as recognized under established law. That is a subtle line, and it is being drawn incrementally in courtrooms every day.

In practice, Section 230 is neither a relic nor a cure-all. It is a blunt rule performing delicate work--preserving space for open participation while courts and lawmakers work to define how far legal responsibility should extend when digital architecture intersects with real-world consequences.


