Section 230 Explained: What is it? What are its implications? How might it change?

What is Section 230?

Section 230 of the United States Communications Decency Act is one of the most influential pieces of internet-related legislation and the keystone of the internet as we know it today. In fact, it has been referred to as the “internet’s Magna Carta” and the “26 words that created the internet.” The 26 words of Section 230 are as follows:

“No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”

What does that mean? In simple terms, it means that any company providing an interactive service on the internet (or any user of it) cannot be held liable for the content shared by a third party on that service (e.g., if you post something libelous on a social media platform, the social media platform and its other users aren’t liable). 

In addition, Section 230 also has a “Good Samaritan” clause which provides immunity to service providers and users for “any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected”. In other words, they can’t be held liable for moderating content on their platforms, as long as it is done in good faith.

That being said, the immunity protections aren’t limitless. The following exemptions apply to Section 230 immunity:

  • Federal criminal liabilities
  • Electronic privacy violations
  • Intellectual property claims

This is one of the reasons (along with the DMCA, the Digital Millennium Copyright Act) why companies are so quick to take down copyright-infringing content: they can be sued over it.

What are the implications of Section 230?

Signed into law in 1996, this immunity from litigation over third-party content has shaped the internet as we know it today. These protections are what enable customer reviews and comment sections on websites, blog and content hosting services, virtually all of social media, wikis, and much more to thrive on the internet without fear of litigation over content their users share. Many of the most popular websites and services (e.g., Wikipedia, Google, Facebook, Amazon, Twitter, Substack, Squarespace, Webflow, AWS, Cloudflare) benefit from these protections.

However, as the internet has grown in scope and influence beyond what anyone likely imagined in 1996 when the protections were established, there has been a growing call to reform Section 230 to better protect user and societal interests. It is said that “with great power comes great responsibility,” yet internet giants have amassed great power while shunning much of the responsibility, thanks to the broad scope of immunity granted by Section 230. This is especially true for social media platforms, given their massive impact on societal health and civic discourse.

How might Section 230 be adjusted for social media companies?

First off, adjustments have already been made to Section 230 in recent years. Namely, FOSTA-SESTA (the Allow States and Victims to Fight Online Sex Trafficking Act and the Stop Enabling Sex Traffickers Act), which removed Section 230 immunities for service providers in cases pertaining to sex trafficking. Like any legislation, this adjustment to Section 230 had its share of supporters and critics, but more importantly, it set a precedent for Section 230 being adjusted.

So the question is: how should Section 230 be adjusted for social media platforms, given the issues that have come to light, such as the spread of misinformation/fake news, propaganda, hate speech, and harassment on these platforms?

A variety of ideas have been introduced into the public arena, ranging from a full repeal of Section 230 to more nuanced refinements. This range reflects the differing concerns of politically left-leaning and right-leaning groups in the US. The left argues that Section 230 allows tech platforms to host harmful content with no major repercussions, while the right argues it allows tech platforms to unfairly censor right-leaning perspectives.

The ideas proposed by legislators and others fall loosely into three major categories, discussed below along with one piece of proposed legislation (arbitrarily chosen as an illustrative example) in each category:

Repeal Section 230:

Intention: Scrap Section 230 and replace it with new legislation which redefines how tech platforms can operate.

Proposed legislation: 21st Century Foundation for the Right to Express and Engage in Speech Act (Sponsor: Sen. Bill Hagerty (R-TN))

  • Remove Section 230 and replace it with the rules outlined in this bill
  • Tech platforms would be treated as “common carriers” and be required to provide service to everyone as long as basic requirements are met (e.g., like a utility such as mobile phone service) - This only applies to services with at least 100M global monthly active users
  • Companies deemed “common carriers” would be required to publicly disclose their policies regarding content moderation, promotion, curation, and account suspension
  • This new legislation would include a Good Samaritan clause, but those protections would only apply in accordance with the company’s posted moderation policies
  • Section 230-like protections would still exist, but not apply if a platform manipulates the visibility of content (e.g., uses engagement-based algorithms, recommends content, or restricts access to content)

Limit the scope of Section 230:

Intention: Restrict the scope of how and when Section 230 immunity can be invoked

Proposed legislation: Protecting Americans from Dangerous Algorithms Act (Sponsors: Rep. Tom Malinowski (D-NJ), Rep. Anna Eshoo (D-CA))

  • Prevents platforms from using Section 230 as a defense in civil rights and terrorism related cases if the company uses algorithms to distribute or amplify the content in question

The following are exempt from this provision:

  • Algorithms that are “obvious, understandable, and transparent,” such as chronological order, alphabetical order, average user rating, or sorting by review count
  • If a user searches for specific information and an algorithm is used to return relevant results
  • Providers of internet infrastructure services, including web hosting, domain registration, content delivery networks, caching, data storage, and cybersecurity
  • Small businesses with fewer than 10 million unique monthly users in three of the past 12 months

Impose new requirements for invoking Section 230:

Intention: Impose additional requirements on companies that want to invoke Section 230

Proposed legislation: Platform Accountability and Consumer Transparency (PACT) Act (Sponsors: Sen. Brian Schatz (D-HI), Sen. John Thune (R-SD))

  • To receive Section 230 immunity, platforms would be required to publish an acceptable use policy that details the types of content the platform allows, explains how policies will be enforced, and describes how users can report policy-violating or illegal content
  • Platforms would be required to establish call centers, staffed by a live representative eight hours per day, five days per week, to assist users with filing good-faith complaints; provide an email address to which users can submit complaints; and create an easy-to-use complaint-filing system that allows users to file and track complaints and appeals. Platforms would be permitted to filter out spam, trolling, and abusive complaints, since complaints would be required to be made in good faith.
  • Platforms would be required to review and remove illegal and/or policy-violating content in a timely manner to receive Section 230 protections. 
  • Platforms would then be required to notify the posting user that the content had been removed, give an explanation, and allow the user the opportunity to appeal the decision.
  • Providers must also issue biannual transparency reports, which would include the number of content-related complaints filed by users, the number of times the provider acted upon those complaints and the method of enforcement, and the number of appeals filed by users.

The following would be exempt / have reduced requirements:

  • Small businesses with fewer than 1M unique monthly visitors and accrued revenue of $50M or less would not need a live call center and would have softer time constraints for processing complaints
  • Independent bloggers with fewer than 100K unique monthly visitors and accrued revenue of $1M or less would have minimal requirements under this act: they would only need to give users a way to alert them about content on their site and remove it in a timely fashion
  • Internet infrastructure companies are fully exempt

Clearly, there are many possible directions for revamping or replacing Section 230. In our opinion, limiting the scope of Section 230’s protections, and potentially introducing additional requirements to benefit from them, is the way to go. Section 230 has enabled many aspects of the internet, and those should continue to be protected while the legislation is adjusted to mitigate new issues as they arise. What do you think is the best path forward? Feel free to share your thoughts with us via email, LinkedIn, or Twitter!
