Justice Oliver Wendell Holmes, Jr. famously said that “hard cases make bad law.” Few cases are harder than one in which the plaintiff is a grieving parent seeking to hold someone accountable for the death of a child. Consider the lawsuit brought by Norma Nazario against TikTok’s parent company ByteDance and Instagram’s parent company Meta over the death of her 15-year-old son Zachery. Zachery died while “subway surfing,” a term for riding on the outside of a moving subway train. The lawsuit alleges that Zachery attempted the stunt because he “became addicted” to watching subway surfing videos on Instagram and TikTok.
Mrs. Nazario’s lawsuit is the latest challenge to Section 230 of the Communications Decency Act of 1996. Section 230 shields tech companies from liability for material posted on their sites, so long as the content is created by third parties rather than by the companies themselves. Section 230 made the rise of social media possible, which is why the law is a target both of those seeking to impose new regulations on platforms like Instagram and TikTok and of lawyers and judges searching for ways to limit its protections. Mrs. Nazario’s attorneys argue that Section 230 should not apply in this case because TikTok and Instagram used recommendation algorithms that suggested Zachery watch videos featuring dangerous behavior such as subway surfing. The algorithms recommended these videos not because Zachery had watched or searched for them or anything similar, but because they were popular with other users his age. The suit alleges that the algorithms promoted these violent and disturbing videos because such content statistically keeps young people engaged with the site for longer periods of time. And the longer they stay on the site, the greater the ad revenue.
Algorithms also make it possible for social media companies to recommend content they think will interest users based on their history, age, education level, and other characteristics. Without algorithms, companies would have a harder time personalizing recommendations, which would lessen the value individuals get from social media. Mrs. Nazario’s attorneys argue that the use of algorithms means the content is “actively” displayed. They claim that Section 230 was meant to protect only “passive” displays, in which companies make no effort to recommend posts to people whose browsing history suggests they might enjoy them. This view would create an exception to Section 230 that would swallow the entire law, leaving tech companies vulnerable to legal action whenever someone committed a crime and blamed it on videos the company had suggested.
The attorneys also argue that Section 230 does not apply because the companies are being sued for offering a flawed product. But the distinction between a social media platform protected by Section 230 and a product that uses algorithms to recommend content is a matter of semantics. It does not change the key question: whether the platform created the content or is simply hosting material created by others. As Santa Clara University law professor Eric Goldman put it: “so long as the content is third-party content, it doesn't matter whether the service ‘passively’ displayed it or ‘actively’ highlighted it; either choice is an editorial decision fully protected by Section 230. As always, I ask: what is the product, and what is the warning about? If the answer to both questions is ‘third-party content,’ Section 230 should apply.”
One final argument against the plaintiff’s position is that even if social media companies use algorithms to direct young people to dangerous videos, the companies are still not responsible when someone chooses to copy the dangerous stunts they see online. As Reason magazine senior editor Elizabeth Nolan Brown put it: “the fact that a particular dangerous or reckless thing might be showcased on social media platforms doesn't mean social media platforms caused or should be held liable for their [the viewer's] death. We don't blame bookstores, or movie theaters, or streaming platforms if someone dies doing something they read about in a book or witnessed in a movie or TV show.”
The way to prevent future tragedies like Zachery Nazario’s is for more parents to use one of the many parental-control tools available to protect their children from the dangers of social media. These tools offer an effective way to keep children safe while they learn to use social media responsibly, and they do so without weakening the protection that Section 230 and the First Amendment provide to online speech.