
It's surprising that, more than a quarter century after its adoption, the notorious Section 230, enacted in 1996 as part of the Communications Decency Act and codified in the Communications Act of 1934, has never been reviewed by the Supreme Court. That's about to change, and if you've been reading the essays in my "Thinking Clearly and Speaking Freely" series, you'll know that I welcome the review. 

As construed by the lower courts that have had occasion to consider it, Section 230 provides Big Tech platforms like Twitter, Facebook, and YouTube, and other "interactive computer services," with near universal immunity from liability for posts by users of their platforms. 

On October 3, the Supreme Court granted certiorari in Gonzalez v. Google LLC. Plaintiffs are family members of victims who died in terrorist attacks for which ISIS claimed responsibility. They sued Google under the Anti-Terrorism Act, which allows victims to recover for injuries suffered "by reason of an act of international terrorism." In short, they contend Google is secondarily liable because its YouTube platform allowed ISIS to post videos and other content communicating the terrorist group's message, to radicalize new recruits, and generally to further its mission. More specifically, the plaintiffs allege that YouTube's computer algorithms, by suggesting content to users based on their viewing history, aid ISIS in spreading its jihadist messages. 

The Ninth Circuit Court of Appeals affirmed a District Court decision determining that Google, by virtue of Section 230's immunity grant, could not be held liable for plaintiffs' claims filed under the Anti-Terrorism Act. Section 230(c)(1) states that "[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider." 

Here is the way the Supreme Court frames the question presented in Gonzalez:

"Does section 230(c)(1) immunize interactive computer services when they make targeted recommendations of information provided by another information content provider, or only limit the liability of interactive computer services when they engage in traditional educational functions (such as deciding whether to display or withdraw) with regard to such information?" 

Significantly, just a week before the Supreme Court announced it would hear the Gonzalez appeal, Part 12 of this "Speaking Freely" series, "Shining a Spotlight on Big Tech's Section 230 Immunity," was published. There, I examined the Fifth Circuit Court of Appeals' recent NetChoice, L.L.C. v. Paxton decision rejecting a First Amendment challenge to the Texas law prohibiting social media platforms from censoring user-generated posts based on viewpoint. Aside from all else, the Fifth Circuit's opinion highlights the dissonance between the platforms' oft-repeated claim, on the one hand, that for Section 230 purposes they are mere conduits for the speech of others, and their claim, on the other hand, that for First Amendment purposes they must be treated as speakers and publishers with regard to their moderation actions. 

NetChoice already has indicated it will file a petition seeking Supreme Court review of the Fifth Circuit's decision. So along with the Court's review of Gonzalez, Section 230 is in the judicial crosshairs. 

In future parts of this series, I want to focus on the legal issues front and center in the Supreme Court's review of Gonzalez, and NetChoice too if review is granted. But here I want to begin thinking about, from a policy perspective, what a proper legal framework to replace the present Section 230 should look like. 

Of course, one option is outright repeal. This would leave the platforms in the same position as all other entities, including newspapers, broadcasters, cable operators, and such, subject to whatever judge-made immunities from liability may be created, case by case, in traditional common law fashion. In other words, Twitter, YouTube, Facebook, and the other social media websites would be subject to the same rules as others in our civil justice system. 

Short of outright repeal, Congress could revise Section 230 in ways that narrow the broad immunity currently enjoyed by the platforms. The present virtually unlimited immunity is exemplified by the Ninth Circuit's Gonzalez holding rejecting claims for relief under the Anti-Terrorism Act, as well as by many other lower court cases rejecting claims for redress. These claims run the gamut from postings that allegedly caused or facilitated sex trafficking, illegal drug sales, housing discrimination, and fraudulent financial schemes to other kinds of illegal or tortious conduct. 

In place of the present immunity, a revised Section 230 could substitute a reasonable duty of care standard under which a court, taking into account ongoing judicial interpretations and the relevant facts of the particular case, would assess the reasonableness of a platform's actions. These considerations could include the size of a platform's user base, the financial resources available for moderating postings, the moderation mechanisms employed by a platform, a platform's compliance with its own procedures, a platform's knowledge regarding deficiencies discovered in its procedures, and the efforts taken to cure known deficiencies. 

In a thoughtful paper, Who Moderates the Moderators? A Law & Economics Approach to Holding Online Platforms Accountable Without Destroying the Internet, Geoffrey Manne, Kristian Stout, and Ben Perry reject the major platforms' repeated assertion that any lessening of their Section 230 immunity necessarily will radically alter their present business models, or "destroy the Internet." As they put it: "Counting the cost of defending meritorious lawsuits as an avoidable and unfortunate expense is tantamount to wishing away our civil-justice system. That is unlikely to be a defensible position in any regard, but it is certainly not defensible solely in the context of online platforms." As they say: "The current Section 230 doesn't just reduce the liability risk of intermediaries for user-generated content; it removes it virtually entirely." 

Messrs. Manne, Stout, and Perry, in a way that builds on similar work by media and copyright lawyer Neil Fried, employ a "law and economics" cost-benefit approach in supporting adoption of a negligence-like rule based on a "reasonableness" standard of care. In their view, this would allow imposing some degree of intermediary liability on platforms, without opening the floodgates to unmeritorious litigation. They make clear that their proposal doesn't contemplate suits against the platforms for the underlying illegal or tortious conduct of users, but rather requires that the platforms take "reasonable steps to curb such conduct." Significantly, they highlight an exception to the general reasonableness rule for so-called communications torts like libel. Like offline publishers subject to the judge-made liability rule in New York Times v. Sullivan, online providers would not be liable for communications torts arising out of user-generated posts unless they knew, or should have known, the content was defamatory. 

Adoption of a reasonable duty of care standard in place of Section 230's present virtually unlimited immunity could take place either in a common law fashion under ordinary civil justice jurisprudence if Section 230 were repealed, or, alternatively, by virtue of Congress revising the provision to incorporate the reasonableness standard.

I'm not convinced the recommendations in the Who Moderates the Moderators paper, with its caveats, go far enough in the reform direction. But they are a good starting point for considering, apart from whatever the Supreme Court might do in Gonzalez, a proper framework for meaningfully reducing the platforms' current immunity and making them more accountable for their moderation actions. 

Stay tuned! I'll be returning to both the legal and policy considerations relevant to fixing Section 230 in future parts of this series.   

Randolph May is President of the Free State Foundation, a free market-oriented think tank in Rockville, MD.

