A California jury recently found Meta (parent company of Facebook, Instagram, and WhatsApp) and Alphabet (parent company of Google and YouTube) liable for as much as $6 million for causing emotional harm to a 20-year-old woman known as Kaley G.M. (or K.G.M.) when she was a minor. The harm was allegedly caused by K.G.M.'s social media addiction, which was supposedly the result of algorithms designed to keep users focused on their screens.
K.G.M. claims her social media addiction led to low self-esteem, body image issues, depression, and suicidal thoughts. The lawsuit claims the tech companies were negligent in failing to mitigate the harms that could be caused by excessive use of their products. K.G.M. is the lead plaintiff—but the case is a consolidation of many of the thousands of cases that have been filed against social media companies.
If the jury's verdict survives the inevitable appeals, it will open the floodgates to more lawsuits, including suits by ambitious state attorneys general seeking a high-profile target to boost their public standing. The result will be a social media universe in which startup companies find it difficult to attract capital, while established tech companies increasingly restrict who can use their platforms and what they can post. It will start with major restrictions on minors' ability to use social media, even with parental approval. It will then spread to restrictions on posts in the name of "protecting" vulnerable users from being harmed by posts, memes, and comment-section trolls.
This is the future Congress was trying to avoid when it passed Section 230 of the Communications Decency Act of 1996, the provision that shields platforms from being held liable for content their users post. K.G.M. and the other plaintiffs sought a loophole in Section 230 by arguing that the harm they suffered stems from the platforms' design, not from their content. So, the argument goes, Section 230 should not apply in this case.
The judge and jury disregarded facts showing that Section 230 should apply to this case. For example, K.G.M. stated that she "has gotten a lot of content promoting body checking, posts [of] what I eat in a day—just a cucumber—making people feel bad if they don't eat like that." She thus admits her problems were caused by content, which means Section 230 applies.
As Reason senior editor Elizabeth Nolan Brown has pointed out, endless scroll and autoplay have no power to keep people staring at their screens unless the users are captivated by the content. Brown further observes that clothing retailers regularly text or email customers with recommendations and notices about sales. These messages no doubt help create and enable shopaholics. Yet although shopaholics can struggle with credit card debt, no one is suing Macy's for failing to warn its customers about the dangers of excessive shopping.
The focus on the alleged dangers of social media addiction is rooted in the popular idea that technology is making people, especially young people, replace in-person social interaction with a virtual social life, leaving them vulnerable to online trolls. A University of Florida study debunks this idea: it found that children with smartphones are more likely to use FaceTime, and thus interact more with their friends than children without smartphones. The study did find that teenagers who regularly post on social media are more likely to suffer from depression and sleep deprivation. But the solution to problems caused by excessive social media use is not litigation or legislation.
Parents must use the many tools available to help them protect their children from inappropriate content. They should set strict limits on screen time, including a time each night when phones, iPads, and other electronic devices must be turned off. Cases like those of K.G.M. and the other plaintiffs are tragic. But solving them through litigation will drain resources away from one of the most innovative sectors of the U.S. economy, harming workers in the tech industry and the vast majority of social media users who use the internet responsibly. Ensuring that children can use social media responsibly and avoid its hazards requires parents to work with big tech companies, not turn big tech into the new big tobacco.