
For years, litigation over 47 U.S. Code § 230 — known better as Section 230 — acquired a Groundhog Day–esque quality. Despite the law’s plain language, which absolves online platforms of most civil liability for the speech of third-party users, myriad plaintiffs have brought suits seeking to contort its meaning. Judges have almost exclusively turned away these efforts and maintained Section 230’s integrity. The pattern went thus: frivolous lawfare would beget sensible, textualist rebuffs, and the cycle would continue.

The cycle broke at the U.S. Court of Appeals for the Third Circuit, where a panel ruled late last month that Section 230 afforded TikTok no protection for content proffered to users by its “For You Page.” According to the panel’s reasoning, the act of algorithmically suggesting content constitutes “TikTok’s own expressive activity” and, therefore, has no claim to statutory protections concerning third-party content. The opinion — should it stand on further appeal — would render Section 230 a husk of empty statutory language, robbed of its substance, force, and original meaning.

A fundamental analytical error runs through the Third Circuit’s opinion. The panel assumed that the respective protections of the First Amendment and Section 230 cannot shield the same speech. If a platform enjoys the one, the reasoning goes, it cannot enjoy the other. This badly mangles the law. Congress meant for Section 230 to overlap with constitutional protections, allowing platforms to moderate content and facilitate online discourse without threat of crippling civil suits over stray pieces of user-generated content. That double insulation was precisely the point.

The Third Circuit’s reasoning rested heavily on a tortured reading of the Supreme Court’s recent decision in Moody v. NetChoice, which — far from creating new law, as the panel seemed to think — merely restated what durable precedent had long made obvious. The Court’s majority affirmed that online platforms’ content-moderation policies — like any act of editorial discretion — enjoy the ordinary speech protections guaranteed by the First Amendment. Nowhere did the majority suggest that Section 230 operates only where the Bill of Rights does not.

If followed to its conclusion, the court’s premise — that if the First Amendment encompasses content moderation, then Section 230 does not — functionally voids the law. This judge-made carveout, invented ex nihilo, would grow large enough to swallow the statute whole, in flagrant disregard of congressional intent.

The panel seemed to realize as much and fell over itself backpedaling. Scrambling to avoid the necessary consequences of its own reasoning, it suggested that a search function, unlike the For You Page, might warrant Section 230 relief. Of course, search functions, like discovery pages, involve content curation, promotion and demotion, and editorial decisions about which content rises to the top of the results. The reality, which the panel failed to see, is glaring and simple: practically any content a user accesses on a mainstream social-media platform arrives through some kind of content-moderating algorithm (specific search queries excepted), and such algorithms, per NetChoice, enjoy First Amendment protection.

The logic of the panel’s opinion reaches to the core of the form and function of social media. As Corbin Barthold, internet policy counsel at TechFreedom, noted, “a platform’s decision to host a third party’s speech at all is also First Amendment-protected expression. By the Third Circuit’s logic, then, such hosting decisions, too, are a platform’s ‘own first-party speech’ unprotected by Section 230.” The panel avoided making this reductio ad absurdum manifest only by concocting arbitrary divides between similar platform features, or by simply neglecting to consider the clear endpoint toward which its logic points.

Section 230 provided the legal soil whence the modern free internet grew. It facilitated platform-mediated fora, which, in turn, produced an unprecedented proliferation of communication and information. Such mediation brings order from chaos. Without it, users would face an unnavigable morass of disordered content, dominated by unsavory characters.

Courts that insist on bulldozing carefully crafted regulatory regimes, despite knowing little about technology or the law that governs it, will produce little besides a shrunken, impoverished online ecosystem. Their decisions, in application, will quash speech — not promote it. Put up your swords, judges. You know not what you do.

David B. McGarry is a policy analyst at the Taxpayers Protection Alliance and a social mobility fellow at Young Voices. His work has appeared in publications including The Hill, Reason, National Review, and the American Institute for Economic Research. @davidbmcgarry

