The Harms of the Overbroad Interpretation of Section 230

Lauren Newton

Professor Abah

JOUR 301

20 November 2020

The Harms of the Overbroad Interpretation of Section 230

I. Introduction

Over 20 years ago, the landscape of the internet changed profoundly with the implementation of Section 230 of the Communications Decency Act of 1996. In 1996, the internet was in its infancy and its eventual power was not yet known; at the time, the legislation's priority was ensuring the free growth of the internet for users and service providers alike. Congress created Section 230 to facilitate the future digital world while accommodating competing values: protection for children, regulation of harmful content, and encouragement of Good Samaritan actions online (Dickinson, 864). However, the language of Section 230 is often found overbroad, and the courts have interpreted the statute to confer far broader immunity than its authors intended. Countless websites providing helpful tools to users have flourished as a result of Section 230, but "bad actors" hosting malicious content have prospered as well. Although the world of the internet has evolved drastically over the past few decades, Section 230 has not.

This paper will evaluate the destructive consequences of the overbroad interpretation of Section 230. The statute provides sweeping immunity to internet service providers (ISPs), even when varying levels of liability are undoubtedly warranted. Platforms have been given immunity even when they actively induce malicious content, encourage users to post illegal content, or knowingly host harmful content yet make no attempt to remove it (Citron, 460). I hypothesize that if Section 230 is amended to hold ISPs accountable for the inducement of malicious content and to reflect the difference between "good" and "bad" actors online, the internet will evolve into a safer, less oppressive environment for users across the country. To prove this hypothesis, the paper will analyze the initial intent of Section 230, past court decisions, and opinions from scholarly articles and relevant judicial figures.

II. Origins of Section 230

For over two decades, courts have struggled to match their interpretations of the language of Section 230 with the statute's original intent. §230 was implemented with two primary objectives: first, Congress sought to promote and encourage freedom on the Internet; second, Congress desired to protect children from obscenities and harmful content online (Dickinson, 864-865). However, judicial precedent has placed a far greater emphasis on §230's first goal and allowed ISPs very broad immunity.

Section 230 was enacted primarily in response to the New York Supreme Court's perplexing 1995 decision in Stratton Oakmont, Inc. v. Prodigy Services Co. In the case, Stratton Oakmont, an investment-banking firm, sued Prodigy, an online platform, for libel over comments posted by users on its online bulletin board. Prodigy had exercised editorial control over some postings it considered inappropriate or harmful (Leary, 560). The court found that because Prodigy had an active role in screening objectionable content online, the website would be considered a publisher rather than a mere distributor of third-party content. Having been labeled a publisher, Prodigy was thus responsible for the material it published, and the court ruled in favor of Stratton Oakmont (561).

This decision established that any level of editorial control over content opened ISPs to liability, implying that ISPs who do not regulate content at all have greater immunity. The interpretation thus disincentivized ISPs from regulating content online, even when they were monitoring harmful and objectionable content in good faith (Bolson, 5). To alleviate these concerns and provide protection for "good actors," just one year later Congress drafted and adopted the Communications Decency Act.

III. Language and Interpretation of Section 230

Section 230 functions straightforwardly: ISPs and websites are given immunity from defamation liability for content created by third parties in order to encourage free speech and protect children online. The text of Section 230 is brief, and its heading reads "Protection for private blocking and screening of offensive material." Sections 230(a) and (b) outline Congress's preliminary findings, including the importance and rapid development of the internet and five policy aims targeting the removal of disincentives to monitoring content and the encouragement of technological advancement (Quist, 282-283).

Section 230(c) is arguably the most important and most frequently referenced text of the legislation, and it is divided into two parts. Section 230(c)(1), titled "Treatment of publisher or speaker," provides that "[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider." Section 230(c)(2), titled "Civil liability," explains that "no provider or user of an interactive computer service shall be held liable on account of" their "Good Samaritan" attempts to monitor and restrict objectionable or obscene content online (Quist, 283).

The language of Section 230 directly reversed the ruling of Stratton Oakmont, allowing ISPs to regulate content online without fear of liability. Internet companies would now be able to flourish without the fear of crippling regulation (Bolson, 10). Although the goals to incentivize ISPs to police their websites and to limit access to obscene material require balancing, it is apparent from both the text and the legislative history of Section 230 that the intention was never to provide absolute immunity for any and every action taken by ISPs (Leary, 564).

IV. Application and Implications of Section 230

Courts have taken a broad view of Section 230 immunity since the beginning of its application. The U.S. Court of Appeals for the Fourth Circuit was the first court to interpret the statute, in Zeran v. America Online, Inc., and its decision began a string of broad interpretations. In the case, an anonymous user created a heinous Internet hoax against a man named Kenneth Zeran on America Online's "bulletin board." The user impersonated Zeran online and provided his telephone number in connection with advertisements for shirts that glorified a bombing in Oklahoma City (Bolson, 9-10). Zeran argued that AOL was aware of the malicious content but failed to remove it in a timely manner, and was therefore liable under traditional notions of distributor liability. The Fourth Circuit sided with AOL, concluding that Section 230 blocked all causes of action that would hold "service providers liable for information originating with a third-party user" (Leary, 574). In other words, the court held that so long as the ISP is not the direct author of content published on its platform, it cannot be held liable, regardless of the motivation behind or level of its editorial control.

With the court ruling in favor of AOL, Mr. Zeran was left with no recourse for the harm inflicted upon him. He could not track down the poster of the libelous content, could not prevent other users from posting similar defamatory content, and could not sue AOL. The precedent set in Zeran has left countless victims in identical, helpless situations and prioritized freedom of internet expression over all other goals, even the goal of protecting children. In the case of Doe v. America Online, the plaintiff accused AOL of knowingly permitting and distributing advertisements for child pornography. Despite the fact that the case involved serious allegations of child harm and illegal activity, the court rejected the plaintiff's argument, quoted Zeran extensively, and perpetuated the broad interpretation of Section 230 (Leary, 575).

Websites created for the sole purpose of defaming and harming others receive the same light scrutiny as websites that act in good faith. The site "Dirty.com," for example, was created solely to spread harmful rumors and gossip. The site's founder, Nik Richie, encourages users to submit their best "dirt" and chooses his favorite submissions to post online. These posts, which describe topics including sexually transmitted diseases, mental illnesses, and financial problems, have led to a torrent of abuse. Richie is clearly a bad actor, and has even admitted to "ruin[ing] people sometimes out of fun" (Citron, 454).

Litigation against websites like Dirty.com has rarely resulted in justice for victims of online harassment. Sarah Jones, a former high school teacher, sued Dirty.com after the website refused to remove defamatory content regarding her personal and professional life. The posts falsely stated that her husband had a sexually transmitted disease, that she had slept with an entire football team, and that she had had sex at the school where she taught (Jones v. Dirty World Entertainment Recordings). Despite the fact that Jones repeatedly begged site editor Richie to remove the content, and despite the fact that Richie intentionally encouraged the objectionable content, the Sixth Circuit Court of Appeals ruled in favor of Dirty.com. The court in Jones v. Dirty World Entertainment Recordings, LLC held that selecting which posts to publish and refusing to remove content do not qualify as material contributions (Jones v. Dirty World Entertainment Recordings). Jones had no other legal options to fight these incredibly harmful and libelous postings, despite the fact that they could irreparably damage her reputation.

V. Judicial Movement toward Limited Immunity

Although courts remain constrained by the Zeran ruling, there is a growing judicial movement to limit Section 230's immunity. The first ruling to offer hope for harassment victims and consider a more limited immunity for ISPs occurred in 2008 in the case of Fair Housing Council of San Fernando Valley v. Roommates.com. The plaintiffs alleged that Roommates.com was responsible for violating housing discrimination regulations by requiring its users to detail discriminatory information about themselves through a drop-down question feature. The plaintiffs argued that because the drop-down questions were required and directly prompted discriminatory answers, Roommates.com should be considered a content provider. The court agreed, and the notion that an internet service provider could simultaneously be a content provider was born (Bolson, 13-14).

Roommates was one of the few cases to find potential liability for a website. The court acknowledged that Roommates.com's drop-down questions were the direct cause of illegal material being displayed, so the website should not have blanket immunity. The ruling created an alternative path for interpreting Section 230, a path more in line with the original text and purpose of the statute (Leary, 576). However, this decision does not reach all Section 230 circumstances, namely when websites are "passive transmitters of information," and websites hosting harmful content continue to receive sweeping immunity (Bolson, 14).

VI. Overarching Problems with Section 230

The internet has evolved into a ubiquitous part of society, a place that should welcome the marketplace of ideas and encourage passionate discussion and community growth. However, it cannot function as an open, accepting forum so long as so many users are subjected to unthinkable harassment and victimization. According to a Pew Research Center report, around 73% of adult internet users have witnessed online harassment of others and 40% have personally experienced harassment (Bolson, 11). The open and free environment of the internet has been overshadowed by its increasingly oppressive nature, discouraging users from participating online altogether. Online abuse inevitably chills the speech of users unwilling to subject themselves to further victimization, and cyber harassment is "profoundly damaging to the free speech and privacy rights of the people targeted" (Citron, 472).

The internet is no longer in its infancy, and it no longer needs to be coddled and protected to ensure growth. The immunity granted in Section 230 is massively inconsistent with nearly all other statutes involving distributor or publisher liability; for example, a magazine whose sole purpose is to publish user-submitted malicious content about nonpublic figures would be subjected to a storm of lawsuits (Citron, 455).

The courts have interpreted Section 230 with immunity far broader than the authors of the statute intended. In Justice Thomas's recent statement suggesting that the immunities granted to ISPs could be narrowed in future court proceedings, he emphasized that "both provisions in §230(c) most naturally read to protect companies when they unknowingly decline to exercise editorial functions to edit or remove third-party content… and when they decide to exercise those editorial functions in good faith" (Malwarebytes v. Enigma Software Group). However, companies are not only knowingly declining to exercise editorial control over malicious content; they are also encouraging and facilitating the dissemination of that content. The "Good Samaritan" provision of §230(c)(2) has been virtually negated by the courts' granting of sweeping immunity, allowing "bad actors" to thrive; lower courts have ironically interpreted §230, which bears the title "protection for private blocking and screening of offensive material," to protect sites designed for the sole purpose of spreading offensive material (Citron, 455).

This overbroad application of §230 has allowed platforms a "free pass" to ignore harmful activities, to induce and encourage unlawful activities, and to deliberately repost illegal material (Citron, 465). Bad actors like Nik Richie continue encouraging destructive material on their platforms, knowing full well they cannot be sued for their roles in abuse. Sweeping immunity incentivizes injurious behavior, including the creation of sites for the sole purpose of imposing severe humiliation, destruction, and emotional distress on others.

VII. Recommendation

I propose a statutory amendment to §230 that would deny "bad actors" sweeping immunity and create a distinction between Good Samaritans and Bad Samaritans. Platforms that deliberately host harmful material and encourage users to post harmful material should not be granted immunity. Provisions must be in place to incentivize good behavior and to hold actors accountable for irresponsible behavior.

The current test derived from §230 is three-pronged and determines whether the CDA provides an ISP defendant immunity. The test is as follows. First, the defendant must be a provider or user of an interactive computer service. Second, the plaintiff's cause of action must treat the defendant as the "publisher" or "speaker" of a harmful statement. Third, the harmful information must have been provided by another information content provider, other than the defendant. I propose that three additional prongs be added to the test, as follows. First, whether the primary purpose of the website is constructive and practical. Second, whether a reasonable person would be highly offended by the content and efforts put forth by the platform. Third, whether the ISP had an active or passive role in receiving and dispersing third-party information.

I believe assessing the primary purpose of a website would provide far greater accountability for websites such as Dirty.com, whose sole purpose is to harm and defame others. This prong, in addition to the second, which addresses the degree of offensiveness of a platform, would effectively distinguish between Good and Bad Samaritans and would incentivize good behavior online, as the statute originally intended. Additionally, in accordance with the third new prong, statutorily emphasizing the difference between an active and a passive role in online distribution would provide a legal framework for continuing rulings similar to that of Fair Housing Council v. Roommates.com: an ISP can simultaneously be a content provider. Assessing the level of involvement the ISP displayed is crucial to determining the level of liability that is just.

This statutory change would have no effect on websites that operate in good faith. Similarly, it would not create liability for platforms that unknowingly host malicious content or that are logistically unable to monitor all content online. The sole actors this change would affect are those with bad intentions who encourage and facilitate the distribution of malicious, harmful, illegal, and destructive content.

As Justice Thomas said in his statement, "paring back the sweeping immunity courts have read into §230 would not necessarily render defendants liable for online misconduct. It would simply give plaintiffs a chance to raise their claims in the first place" (Malwarebytes v. Enigma Software Group). Victims of online harassment and abuse deserve recourse and options to pursue justice. The amendments I am proposing would give victims a platform in court, because currently, so many cases are discarded due to §230 immunity before they are even tried.

Section 230 has allowed expression and innovation far beyond the imagination of online operators in 1996. But the judicial interpretation has left victims of online abuse with no leverage against sites whose models are built on abuse and destruction. The internet operates as a zone of public discourse, and all members of the community should feel safe to participate rather than fearful of being subjected to abuse.

Works Cited

Bolson, Andrew P. "Flawed but Fixable: Section 230 of the Communications Decency Act at 20." Rutgers Computer & Technology Law Journal, vol. 42, no. 1, June 2016, pp. 1–18. EBSCOhost, search.ebscohost.com.

Citron, Danielle K. "The Problem Isn't Just Backpage: Revising Section 230 Immunity." Georgetown Law Technology Review, vol. 2, no. 2, 2018, pp. 453–473. scholarship.law.bu.edu.

Dickinson, Gregory M. "An Interpretive Framework for Narrower Immunity under Section 230 of the Communications Decency Act." Harvard Journal of Law & Public Policy, vol. 33, no. 2, Spring 2010, pp. 863–883. EBSCOhost, search.ebscohost.com.

Leary, Mary Graw. "The Indecency and Injustice of Section 230 of the Communications Decency Act." Harvard Journal of Law & Public Policy, vol. 41, no. 2, Spring 2018, pp. 553–622. EBSCOhost, search.ebscohost.com.

New York Supreme Court. Stratton Oakmont, Inc. v. Prodigy Services Co. 24 May 1995.

Supreme Court of the United States. Malwarebytes, Inc. v. Enigma Software Group USA, LLC. 13 Oct. 2020.

Quist, Mark D. "'Plumbing the Depths' of the CDA: Weighing the Competing Fourth and Seventh Circuit Standards of ISP Immunity under Section 230 of the Communications Decency Act." George Mason Law Review, vol. 20, no. 1, Fall 2012, pp. 275–309. EBSCOhost, search.ebscohost.com.

U.S. Court of Appeals for the Fourth Circuit. Zeran v. America Online, Inc. 12 Nov. 1997.

U.S. Sixth Circuit Court of Appeals. Jones v. Dirty World Entertainment Recordings, LLC. 16 June 2014.
