Future Tense

All the Ways Congress Wants to Change Section 230

Republicans and Democrats alike want to change Section 230 of the Communications Decency Act. Here’s a comprehensive list of the proposed legislation so far.

Photo illustration by Slate. Photo by Getty Images Plus.

In partnership with Lawfare and the Center on Technology Policy at UNC-Chapel Hill, Future Tense is tracking all of the proposals to reform Section 230, a law that is both a bedrock of the modern internet and a constant source of criticism from both the left and the right. This Section 230 Reform Hub includes information on each bill that has been introduced in Congress to reform Section 230 since 2020. The legislative summaries include the date the legislation was introduced, co-sponsors, status, a short overview of the substance of the legislation, a description of the type of reform that is proposed, and a link to the full text. It includes all legislation in the last Congress, as well as ongoing tracking of legislation introduced in the current one. For more information, read this introductory essay by Matt Perault from when the project first launched in March 2021.

Categories:

• Repeal: Bills that repeal Section 230 in whole
• Limiting the Scope: Bills that restrict the types of activities protected by Section 230. These bills would prohibit companies from using Section 230 as a defense under certain conditions, such as in cases of child sexual exploitation or civil rights violations.
• Imposing New Obligations: Bills that impose new obligations (such as a duty of care or quid pro quo requirements) on companies that wish to use Section 230 as a defense
• Good Samaritan: Bills that alter the “Good Samaritan” portion of Section 230. This includes bills that attempt to address perceived political bias or censorship.

All bills introduced in the 116th and 117th Congresses are no longer pending. As the entries below show, some of those bills have been reintroduced for consideration in the 118th Congress (2023-2024).

This list was updated on Sep. 19, 2023.

Bill name: H.R. 4910 - Deplatform Drug Dealers Act
Sponsor: Rep. Brett Guthrie (R-Kentucky)
Date introduced: July 26, 2023
Status: Referred to the House Committee on Energy and Commerce
Category: Limiting the Scope
Summary: This bill would amend Section 230 to remove the immunity—with the exception of Section 230(c)(2)(A)—for civil suits arising from advertisements or offers to sell, distribute, deliver, or dispense controlled substances or drugs. As a result, platforms could not invoke Section 230 as a defense against civil actions arising from violations of the Controlled Substances Act (21 U.S.C. § 802) or the Federal Food, Drug, and Cosmetic Act (21 U.S.C. § 321).

Bill name: H.R. 4887 - Online Consumer Protection Act
Sponsor: Rep. Janice Schakowsky (D-Illinois)
Date introduced: July 25, 2023
Status: Referred to the Committee on Energy and Commerce
Category: Imposing New Obligations
Summary: This bill would remove Section 230 protections for platforms that fail to implement a series of consumer protection measures required by the law. For instance, the law imposes new obligations on platforms to establish terms of service that convey their specific content moderation policies, adopt content moderation policies consistent with their terms of service, and institute processes for users to appeal content moderation decisions. This bill would also require that platforms with more than 10,000 monthly users submit annual filings to the Federal Trade Commission to demonstrate their compliance with these requirements. Platforms in violation of this law would be subject to potential enforcement action from the FTC under the agency’s authority to regulate unfair or deceptive acts or practices, and the bill would remove any Section 230 barrier to these enforcement actions. The bill would also create a private right of action for users to sue over violations of the law, as well as a right of action for state attorneys general.

Bill name: S.2264 - Fentanyl Trafficking Prevention Act
Sponsor: Sen. Jon Ossoff (D-Georgia)
Date introduced: July 12, 2023
Status: Referred to the Committee on the Judiciary
Category: Limiting the Scope
Summary: This bill creates a new civil cause of action against covered providers under the Controlled Substances Act (21 U.S.C. § 841) and removes Section 230(c)(1) protections for such lawsuits. Platforms would be subject to civil suits by the federal government for the knowing, intentional, or reckless facilitation of a Controlled Substances Act violation. The bill expressly does not change liability protection under Section 230(c)(2), does not impose any duty for a provider to monitor users or subscribers, and does not impose any duty for providers to screen for violations of the Controlled Substances Act. A provider’s use of end-to-end encryption would not inherently give rise to liability, even where that provider does not have the capability to decrypt a communication between its users.

The bill defines “covered providers” as any interactive computer service with more than 50 million active monthly users in the United States or more than 100 million active monthly users worldwide.

Bill name: S.1993 - A Bill to Waive Immunity Under Section 230 of the Communications Act of 1934 for Claims and Charges Related to Generative Artificial Intelligence
Sponsor: Sen. Josh Hawley (R-Missouri)
Co-sponsors: Sen. Richard Blumenthal (D-Connecticut)
Date introduced: June 14, 2023
Status: Read twice and referred to the Committee on Commerce, Science, and Transportation
Category: Limiting the Scope
Summary: This bill would amend Section 230 to clarify that online platforms will not receive immunity under Section 230(c)(1) for civil claims and criminal charges under both federal and state law that involve the use or provision of generative artificial intelligence. Platforms using generative A.I. products would retain immunity under Section 230(c)(2)(A), the “Good Samaritan” provision of the statute. The bill defines generative artificial intelligence as an A.I. system that is “capable of generating novel text, video, images, audio, and other media based on prompts or other forms of data provided by a person.”

Bill name: Curtailing Online Limitations That Lead Unconstitutionally to Democracy’s Erosion (COLLUDE) Act
Sponsor: Sen. Eric Schmitt (R-Missouri)
Co-sponsors: Sen. Mike Braun (R-Indiana)
Date introduced: May 10, 2023
Status: Referred to the Committee on Commerce, Science, and Transportation
Category: Limiting the Scope
Summary: This bill would amend Section 230 by removing protection for online platforms that perform certain types of moderation actions on behalf of a government entity or its agents. Online platforms would not have Section 230 protections if they “reasonably appear to express, promote, limit the visibility of, or suppress legitimate political speech, including a discernable viewpoint.” The only exceptions would be if such speech was moderated for a legitimate law enforcement or national security purpose.

This bill would also mandate that online platforms invoking Section 230 in criminal and civil actions use the statute as an affirmative defense, rather than at the motion-to-dismiss stage—essentially, shifting the burden to the platforms to prove that they are not information content providers.

Bill name: Strengthening Transparency and Obligations to Protect Children Suffering from Abuse and Mistreatment (STOP CSAM) Act of 2023
Sponsor: Sen. Richard Durbin (D-Illinois)
Co-sponsors: Sens. Josh Hawley (R-Missouri), Ted Cruz (R-Texas), Chuck Grassley (R-Iowa)
Date introduced: April 19, 2023
Status: Placed on Senate legislative calendar
Category: Imposing New Obligations
Summary: This bill aims to combat child sexual abuse material, or CSAM, by imposing new reporting requirements and obligations for interactive computer services, with the broader goal of “promoting accountability and transparency by the tech industry.”

The bill specifies that any interactive computer service that “knowingly … host[s] or store[s]” CSAM or makes CSAM “available to any person” and does not remove or attempt to remove the material in a timely way is liable to fines of up to $1 million, or up to $5 million if the platform’s actions involve “a conscious or reckless risk of personal injury.” It creates additional federal criminal liability when platforms “knowingly promote or facilitate” crimes concerning CSAM. The bill also expands potential civil remedies for victims of CSAM, in cases of “intentional, knowing, or reckless” action by an interactive computer service relating to the “promotion or facilitation” or “hosting or storing … or making available” of CSAM. As it relates to Section 230, the bill specifically states that no element of Section 230 “shall be construed to impair or limit any claim [for civil remedies].”

Bill name: Eliminating Abusive and Rampant Neglect of Interactive Technologies (EARN IT) Act of 2023 (Senate Version)
Sponsor: Sen. Lindsey Graham (R-South Carolina)
Co-sponsors: Sens. Richard Blumenthal (D-Connecticut), Dick Durbin (D-Illinois), Chuck Grassley (R-Iowa), Dianne Feinstein (D-California), John Cornyn (R-Texas), Sheldon Whitehouse (D-Rhode Island), Josh Hawley (R-Missouri), Mazie Hirono (D-Hawaii), John Kennedy (R-Louisiana), Bob Casey (D-Pennsylvania), Marsha Blackburn (R-Tennessee), Catherine Cortez Masto (D-Nevada), Susan Collins (R-Maine), Margaret Hassan (D-New Hampshire), Joni Ernst (R-Iowa), Mark Warner (D-Virginia), Cindy Hyde-Smith (R-Mississippi), Lisa Murkowski (R-Alaska), Rob Portman (R-Ohio)
Date introduced: April 19, 2023
Status: Referred to the Committee on the Judiciary
Category: Limiting the Scope
Summary: Reintroduced from the 116th and 117th Congresses, and almost identical to a House bill of the same name, the EARN IT Act of 2023 would amend Section 230 so that “interactive computer services”—such as platforms, web hosts, and other online providers that support content—cannot use Section 230 as a defense in state criminal cases and federal and state civil cases regarding the proliferation of child sexual abuse material. This is the fourth version of the EARN IT Act to be introduced in the Senate. It was first introduced in March 2020 and then significantly amended in July 2020.

This bill is virtually unchanged from the previous Senate version (EARN IT Act of 2022). It allows courts to consider platforms’ use of encryption as evidence of liability in online child sexual exploitation cases, but the platforms’ use of encryption or inability to decrypt communications does not serve as an independent basis for liability.

In a difference from the House version of this bill, the Senate version would also create a National Commission on Online Child Sexual Exploitation Prevention to develop best practices for platforms to respond to the online sexual exploitation of children. The commission would be composed of 19 members, including the attorney general, secretary of homeland security, and chair of the FTC, plus 16 other members appointed in equal parts by minority and majority leaders of the House and Senate. The commission would include representatives from law enforcement, civil society, and industry.

Bill name: Eliminating Abusive and Rampant Neglect of Interactive Technologies (EARN IT) Act of 2023 (House Version)
Sponsor: Rep. Sylvia Garcia (D-Texas)
Co-sponsors: Rep. Ann Wagner (R-Missouri)
Date introduced: April 19, 2023
Status: Referred to the Committee on Energy and Commerce and the Committee on the Judiciary to consider provisions that fall within the jurisdiction of each committee
Category: Limiting the Scope
Summary: Reintroduced from the 116th and 117th Congresses, and identical to a Senate bill of the same name, the EARN IT Act of 2023 would amend Section 230 so that “interactive computer services”—such as platforms, web hosts, and other online providers that support content—cannot use Section 230 as a defense in state criminal cases and federal and state civil cases regarding the proliferation of child sexual abuse material. This is the fourth version of the EARN IT Act to be introduced in the House. It was first introduced in March 2020 and then significantly amended in July 2020.

It allows courts to consider platforms’ use of encryption as evidence of liability in online child sexual exploitation cases, but the platforms’ use of encryption or inability to decrypt communications does not serve as an independent basis for liability. This bill is virtually unchanged from the previous House version (EARN IT Act of 2022), except that it removes the language calling for the creation of a 19-member National Commission on Online Child Sexual Exploitation Prevention.

Bill name: H.R.2635 - The Big-Tech Accountability Act of 2023
Sponsor: Rep. George Santos (R-New York)
Date introduced: April 17, 2023
Status: Referred to the Committee on Energy and Commerce
Category: Limiting the Scope
Summary: This bill would remove Section 230 protections for social media services altogether, and make it unlawful for social media services to perform certain types of content moderation. Under the bill, it would be unlawful for any social media service to suspend or otherwise “de-platform” a citizen of the United States on the basis of their “social, political, or religious status,” regardless of the user’s violations of a service’s policies on hate speech, sexual harassment, violent threats, and discrimination. Social media services in violation of the bill would be subject to a $5,000 penalty for each day that the U.S. citizen-user is de-platformed.

The bill defines a “social media service” as any interactive computer service which hosts content for the purpose of “facilitating public or widespread interaction” with such content, is a platform “meant for public discourse,” or which “otherwise hosts publicly accessible information or content, public or widespread interaction, and content distribution.” The bill specifically identifies Meta, Facebook, Instagram, Twitter, and TikTok as social media services.

Bill name: Disincentivizing Internet Service Censorship of Online Users and Restrictions on Speech and Expression (DISCOURSE) Act
Sponsor: Sen. Marco Rubio (R-Florida)
Co-Sponsors: Sen. Mike Braun (R-Indiana)
Date introduced: March 22, 2023
Status: Referred to the Committee on Commerce, Science, and Transportation
Category: Good Samaritan, Limiting the Scope
Summary: This bill is an updated version of the DISCOURSE Act, originally introduced in 2021 during the 117th Congress. This bill would amend the Good Samaritan provision so that platforms would receive liability protections only when they moderate content that is obscene, lewd, lascivious, filthy, excessively violent, harassing, or unlawful or when it promotes terrorism, violent extremism, or self-harm. The determination of whether content fits into those categories must be based on an “objectively reasonable belief.” The bill would also add a religious liberties clause that would make it more difficult for platforms to receive Section 230 protections when they moderate content in a way that “burdens” religious exercise.

In addition, the bill would classify platforms with a “dominant market share” as “information content providers” if they 1) use algorithms to target third-party content, 2) promote or suppress a “discernible viewpoint,” or 3) solicit, comment on, fund, contribute to, or modify information provided by another person. This change would eliminate Section 230 protections for dominant platforms in these circumstances. The act also prevents dominant platforms from using Section 230 as a defense if they fail to notify customers of available parental control protections, as currently required by Section 230.

Bill name: S.941 - Removing Section 230 Immunity for Official Accounts of Censoring Foreign Adversaries Act
Sponsor: Sen. Marco Rubio (R-Florida)
Co-sponsors: Sen. Mike Braun (R-Indiana)
Date introduced: March 22, 2023
Status: Referred to the Committee on Commerce, Science, and Transportation
Category: Limiting the Scope
Summary: This bill would remove Section 230 protections for covered social media platforms—U.S.-based social media platforms with 50 million or more monthly U.S. users—that “knowingly host, distribute, or actively display” content from accounts associated with foreign governments that restrict access to covered social media platforms (the bill classifies those governments as “censoring foreign adversaries”). Platforms are deemed to have knowledge of such material if they publicly identify the account in question as associated with the organization or government in question (e.g., with a verification badge). In defining “censoring foreign adversaries,” the bill includes China, Russia, North Korea, Iran, Cuba, Syria, and Venezuela, and tasks the Secretary of State with identifying potential additions to the list.

Bill name: Safeguarding Against Fraud, Exploitation, Threats, Extremism, and Consumer Harms (SAFE TECH) Act
Sponsor: Sen. Mark R. Warner (D-Virginia)
Date introduced: Feb. 28, 2023
Status: Referred to the Committee on Commerce, Science, and Transportation
Category: Limiting the Scope
Summary: This is a reintroduction of H.R. 3421, a bill initially introduced by Sens. Mark Warner (D-Virginia), Mazie Hirono (D-Hawaii), and Amy Klobuchar (D-Minnesota) in February 2022. Under the SAFE TECH Act, companies would not be able to use Section 230 as a defense in cases related to ads or other content they are paid to make available. The act would also prevent a company from using Section 230 to avoid injunctive relief (e.g., a court order compelling the company to take some action) if it fails to “remove, restrict access to or availability of, or prevent dissemination of material” that could cause “irreparable harm.” The act would prevent platforms from using Section 230 as a defense in cases brought under several different federal and state laws, including civil rights, antitrust, stalking and harassment, human rights, and wrongful death laws. The SAFE TECH Act also puts the burden on companies that wish to use Section 230 as a defense to prove that they are “a provider or user of an interactive computer service,” and that they are “being treated as the publisher or speaker of speech provided by another information content provider.”

Bill name: Internet Platform Accountability and Consumer Transparency (Internet PACT) Act
Sponsor: Sen. Brian Schatz (D-Hawaii)
Date introduced: Feb. 16, 2023 
Status: Referred to the Committee on Commerce, Science, and Transportation
Category: Imposing New Obligations; Limiting the Scope
Summary: This bill is a reintroduction of the PACT Act that was originally submitted in the 2019-2020 congressional session and then revised for the 2021-2022 session. Under the PACT Act, to receive Section 230 immunity, interactive computer service providers would be required to publish an acceptable use policy that would detail the types of content the provider allows, explain how the provider enforces its content policies, and describe how users can report policy-violating or illegal content.

The PACT Act would also require providers to establish call centers that are open eight hours per day, five days per week with a live representative to assist users with the process of filing good-faith complaints; to provide an email address for user complaints; and to create an easy-to-use complaint filing system that would allow users to file and track complaints and appeals. Because the act would require users to make complaints in good faith, providers would be permitted to filter complaints for spam, trolls, and other bad-faith attempts to abuse the system.

The PACT Act would also require that a provider review and remove illegal and/or policy-violating content in a timely manner to receive Section 230 protections. Providers would be required to remove illegal content (as determined by a court) within four days of being put on notice of the illegal content and to initiate removal action for content that violates the provider’s publicized acceptable use policy within 14 days of receiving the complaint. Platforms would then be required to notify users that they had removed the content, give an explanation, and allow the user the opportunity to appeal the decision.

Providers must also issue biannual transparency reports, which would include the number of content-related complaints filed by users, the number of times the provider acted upon those complaints and the method of enforcement, and the number of appeals filed by users.

Under the PACT Act, small businesses that receive fewer than 1 million unique monthly visitors and have an accrued revenue of $50 million or less are exempt from the live call center requirement and have softened time constraints related to processing complaints. Small businesses would be allowed to process complaints of illegal and/or policy-violating content within four and 21 days, respectively. Individual providers—like independent bloggers—who have fewer than 100,000 unique monthly visitors and an accrued revenue less than $1 million have minimal requirements under the act. They are only required to provide users with a contact system to alert the provider about content on their site, and they must remove illegal content within four days of notice. The PACT Act also exempts internet infrastructure companies from all provisions of the bill described above. The updated bill enumerates exceptions to the appeals process, including injunctions and other court orders.

Bill name: See Something, Say Something Online Act of 2023
Sponsor: Sen. Joe Manchin (D-West Virginia)
Co-sponsors: Sen. John Cornyn (R-Texas)
Date introduced: Jan. 30, 2023
Status: Referred to the Committee on Commerce, Science, and Transportation
Category: Imposing New Obligations
Summary: This bill is an updated version of the See Something, Say Something Online Acts of 2021 and 2020, originally introduced in the 117th and 116th Congress, respectively. The See Something, Say Something Online Act of 2023 would require interactive computer services to report to the Department of Justice “suspicious transmissions” that show individuals or groups planning, committing, promoting, and facilitating terrorism, serious drug offenses, and violent crimes. The bill only requires providers to report suspicious transmissions they detect; it would not require them to scan all content on their sites to identify these transmissions. Providers would have to take “reasonable steps” to prevent and address such suspicious transmissions. If a provider fails to report a suspicious transmission that it should have reasonably been aware of, it would not be able to use Section 230 as a defense and could be held liable as a publisher for the suspicious transmission.

Bill name: Curbing Abuse and Saving Expression In Technology (CASE-IT) Act
Sponsor: Rep. Gregory Steube (R-Florida)
Date introduced: Jan. 26, 2023 
Status: Referred to the House Committee on Energy and Commerce
Category: Limiting the Scope
Summary: This bill is an updated version of the CASE-IT Act introduced in the 2019-2020 congressional session and reintroduced in the 2021-2022 session. The CASE-IT Act would prevent a company from using Section 230 as a defense for a period of one year if that company “creates, develops, posts, materially contributes to, or induces another person to create, develop, post, or materially contribute to illegal online content.” The CASE-IT Act would also condition Section 230 immunity upon content moderation policies that are consistent with the First Amendment. The act also creates a private right of action that would allow users to bring a civil action against companies that fail to follow content moderation policies that are consistent with the First Amendment.

Bill name: Platform Accountability and Transparency Act
Sponsor: Sen. Christopher Coons (D-Delaware)
Co-sponsors: Sens. Amy Klobuchar (D-Minnesota), Bill Cassidy (R-Louisiana)
Date introduced: Dec. 27, 2022
Status: Referred to the Senate Committee on Health, Education, Labor, and Pensions
Category: Limiting the Scope, Imposing New Obligations
Summary: This bill seeks to encourage and broaden transparency across digital communication platforms by creating requirements for companies to share qualified data with qualified researchers. The bill defines a qualified researcher as one “affiliated with a [U.S.] university or a [U.S.] nonprofit organization.” The National Science Foundation, as part of reviewing a researcher’s application in conjunction with the Federal Trade Commission, would determine which data is qualified. If data is deemed qualified, then “a platform may not restrict or terminate a qualified researcher’s access” to it. Researchers affiliated with law enforcement or intelligence agencies may not participate in this process, and government entities are forbidden from “seek[ing] access” to data obtained by qualified researchers.

This bill amends Section 230 by removing an interactive computer service’s immunity from liability if it “failed to provide qualified data and information pursuant to a qualified research project.” Under these conditions, this bill would therefore obligate social media companies to provide the data requested by qualified researchers or risk losing their Section 230 protections. This exemption to Section 230’s liability shield, however, would apply only if the platform’s “failure to comply was a direct and substantial contributor” to the allegations of harm made by the plaintiff.

Bill name: Platform Integrity Act
Sponsor: Rep. David Cicilline (D-Rhode Island)
Co-sponsors: None
Date introduced: Dec. 27, 2022
Status: Referred to the House Committee on Energy and Commerce
Category: Limiting the Scope
Summary: The act would amend Section 230(c)(1), removing protections any time an interactive computer service has “promoted, suggested, amplified, or otherwise recommended” content.

Bill name: The 2023 National Defense Authorization Act, Title LIX, Subtitle D: Judicial Privacy and Security
Sponsor: Rep. Mikie Sherrill (D-New Jersey)
Co-sponsors: 123 members of the House in 2021 bill
Date introduced: Sept. 13, 2022
Status: Passed in House and Senate
Category: Imposing New Obligations
Summary: This proposal—included as an amendment to the 2023 NDAA—is based on a previous piece of legislation introduced in the House (H.R. 4436). This amendment would make it unlawful for any person, business, or association, once notified, to make public via the internet “covered” personal information of federal judges or their family members.

This requirement to remove covered information would also apply retroactively, requiring businesses to remove existing covered information from public visibility within 72 hours of a formal request. Tech companies would be required to “implement and maintain reasonable security procedures” to ensure their compliance with the aforementioned provisions.

If the covered information of judges and their families were revealed in violation of these requirements, companies could be held liable for “an amount equal to actual damages sustained” by judges and their families and for the plaintiff’s legal fees. This language creates new potential liabilities for platforms and tech companies that differ from current law.

An earlier version of the amendment stated that “Nothing in this subtitle shall be construed … to impose liability on an interactive computer service in a manner that is inconsistent with the provisions of section 230,” but only if the service complies with a request to remove content under the amendment. The final version does not explicitly mention Section 230.

Bill name: Preventing Rampant Online Technological Exploitation and Criminal Trafficking (PROTECT) Act of 2022
Sponsor: Sen. Mike Lee (R-Utah)
Co-sponsors: None
Date introduced: Sept. 28, 2022
Status: Referred to the Senate Committee on the Judiciary
Category: Imposing New Obligations
Summary: The act would require online platforms that make pornographic imagery available to the public (“covered platforms”) to verify the identity and age of users prior to allowing such content to be uploaded. In addition, the operator of a covered platform would be required to verify through use of a standardized government form that the individuals who appear in the pornographic content were above the age of 18 at the time of its creation, consented to the acts photographed or recorded, and consented to the distribution of the material.

The act explicitly states that it should not be construed to affect Section 230, but it does impose penalties that would contradict current Section 230 jurisprudence. Covered platforms that fail to verify the age and identity of uploading users, and/or the age and consent of those depicted, would be criminally and civilly liable under federal law. The act would also impose civil penalties on platforms that fail to establish and advertise a mechanism to flag child sexual abuse material and nonconsensual content. Finally, the act would impose civil penalties for failure to remove and block re-uploads of flagged content in a timely manner.

Bill name: H.R. 8612 - Stop the Censorship Act
Sponsor: Rep. Paul Gosar (R-Arizona)
Co-sponsors: Reps. Lauren Boebert (R-Colorado), Matt Gaetz (R-Florida), Bob Good (R-Virginia), Glenn Grothman (R-Wisconsin), Troy E. Nehls (R-Texas), Ralph Norman (R-South Carolina), Thomas P. Tiffany (R-Wisconsin), Mary E. Miller (R-Illinois), Andy Biggs (R-Arizona)
Date introduced: July 29, 2022
Status: Referred to the House Committee on Energy and Commerce
Category: Good Samaritan
Summary: The bill—a close parallel to one by the same name (H.R. 7808), introduced by Gosar in 2020—would amend Section 230 by changing its definition of protected Good Samaritan moderation practices in section (c)(2). Social media platforms would be able to continue to use Section 230(c)(2) as a defense only in cases in which they moderate unlawful material, a change from the more expansive “unlawful, or that promotes violence or terrorism” language in the 2020 bill. Platforms would lose this protection for moderating offensive material that is otherwise legal. The bill would add a provision to Section 230 explicitly preserving liability protection for services that enable users to choose whether to restrict material, including lawful content. Platforms would lose those protections, however, if they restricted lawful content for all users.

Bill name: 21st Century Free Speech Act
Sponsor: Rep. Marjorie Taylor Greene (R-Georgia)
Co-sponsors: None
Date introduced: April 28, 2022
Status: Referred to the House Committee on Energy and Commerce
Category: Repeal
Summary: The bill is a companion bill to S.1384, introduced by Sen. Bill Hagerty (R-Tennessee) in 2021. It would repeal Section 230 in its entirety and replace it with the proposed Section 232. Section 232 calls for “reasonable, non-discriminatory” access to online platforms and characterizes interactive computer services with more than 10 million worldwide monthly users as common carrier technology companies, akin to railroads and telephones. Interactive computer services falling under the bill’s definition of a common carrier would have to disclose their rules regarding content moderation and other administrative actions. The bill creates a private right of action for users so that they may bring civil action against an interactive computer service or user that violates this act, as well as a right of action for state attorneys general suing on behalf of their states’ residents. Additionally, no interactive computer services would be able to use Section 232 as a defense when they engage in certain moderation techniques, including downranking content, removing content, and IP-blocking content. The bill narrows the Good Samaritan provision by striking the term “otherwise objectionable,” but platforms would retain immunity for removing certain types of content defined specifically in the bill, such as harassment and material promoting terrorism.

Bill name: The Accountability for Online Firearms Marketplaces Act of 2021
Sponsor: Rep. Jason Crow (D-Colorado)
Co-sponsors: Reps. Hakeem Jeffries (D-New York), Katie Porter (D-California), Haley Stevens (D-Michigan)
Date introduced: May 18, 2022
Status: Referred to the House Committee on Energy and Commerce
Category: Limiting the Scope
Summary: This bill is the House companion to S. 2725, which was introduced in 2021 by Sens. Richard Blumenthal (D-Connecticut), Dianne Feinstein (D-California), and Sheldon Whitehouse (D-Rhode Island). It removes Section 230 protections for online firearms marketplaces. Online firearms marketplaces are defined as interactive computer services that facilitate transactions involving firearms and related equipment, advertise or make available any posting or listing proposing the transfer of firearms, or make available digital instructions that can automatically program a 3D printer to produce a firearm. Even if these activities are prohibited by a service’s terms of use, the service could still be considered an “online firearms marketplace.” In a prefatory statement, the legislation argues that courts’ interpretation of “Section 230 as providing sweeping immunity for a broad array of providers, including providers alleged to have facilitated violations of criminal laws online” is “overly broad and discourages the self-policing that Section 230 intended to incentivize.”

Bill name: Eliminating Abusive and Rampant Neglect of Interactive Technologies (EARN IT) Act of 2022 (House Version)
Sponsor: Rep. Sylvia Garcia (D-Texas)
Co-sponsors: Rep. Ann Wagner (R-Missouri)
Date introduced: Feb. 1, 2022
Status: Referred to the Committee on Energy and Commerce, Committee on the Judiciary, and Committee on Education and Labor for consideration of provisions that fall within the jurisdiction of each committee
Category: Limiting the Scope
Summary: Reintroduced from the 116th Congress and identical to a Senate bill of the same name, the EARN IT Act of 2022 would amend Section 230 so that “interactive computer services”—such as platforms, web hosts, and other online providers that support content—cannot use Section 230 as a defense in state criminal cases and federal and state civil cases regarding the proliferation of child sexual abuse material. This is the second version of the EARN IT Act to be introduced in the House. The first version was introduced in October 2020. Companion bills have also been introduced in the Senate, including the identical EARN IT Act of 2022, which was introduced by Sen. Graham.

Like the previous House version of the bill, this bill would allow courts to consider platforms’ use of encryption or inability to decrypt communications as evidence of liability in online child sexual exploitation cases, as long as the use of encryption or inability to decrypt communications does not serve as an independent basis for liability. The bill would also create a National Commission on Online Child Sexual Exploitation Prevention to develop best practices for platforms to respond to the online sexual exploitation of children. The commission would be composed of 19 members, with the attorney general, secretary of homeland security, and chair of the FTC as agency heads, plus 16 other members appointed in equal parts by the minority and majority leaders of the House and Senate. The commission would represent law enforcement, civil society, and industry by including: four members with experience investigating online child sexual exploitation crimes as law enforcement officers or prosecutors; four members who are survivors of online child sexual exploitation or have experience providing services for victims of online child sexual exploitation in a nongovernmental capacity; two members with experience in consumer protection or civil liberties; two members with experience in computer science or software engineering in a nongovernmental capacity; two members from interactive computer services with at least 30 million monthly users in the U.S.; and two members from interactive computer services with fewer than 30 million monthly users.

Bill name: Eliminating Abusive and Rampant Neglect of Interactive Technologies (EARN IT) Act of 2022 (Senate Version)
Sponsor: Sen. Lindsey Graham (R–South Carolina)
Co-sponsors: Sens. Richard Blumenthal (D-Connecticut), Dick Durbin (D-Illinois), Chuck Grassley (R-Iowa), Dianne Feinstein (D-California), John Cornyn (R-Texas), Sheldon Whitehouse (D–Rhode Island), Josh Hawley (R-Missouri), Mazie Hirono (D-Hawaii), John Kennedy (R-Louisiana), Bob Casey (D-Pennsylvania), Marsha Blackburn (R-Tennessee), Catherine Cortez Masto (D-Nevada), Susan Collins (R-Maine), Margaret Hassan (D-New Hampshire), Joni Ernst (R-Iowa), Mark Warner (D-Virginia), Cindy Hyde-Smith (R-Mississippi), Lisa Murkowski (R-Alaska), Rob Portman (R-Ohio)
Date introduced: Jan. 31, 2022
Status: Referred to the Committee on the Judiciary
Category: Limiting the Scope
Summary: Reintroduced from the 116th Congress and identical to a House bill of the same name, the EARN IT Act of 2022 would amend Section 230 so that “interactive computer services”—such as platforms, web hosts, and other online providers that support content—cannot use Section 230 as a defense in state criminal cases and federal and state civil cases regarding the proliferation of child sexual abuse material. This is the third version of the EARN IT Act to be introduced in the Senate. It was first introduced in March 2020 and then significantly amended in July 2020. Companion bills have also been introduced in the House, including the identical EARN IT Act of 2022, which was introduced by Rep. Sylvia Garcia.

This bill differs from the previous Senate version because it allows courts to consider platforms’ use of encryption as evidence of liability in online child sexual exploitation cases, though the use of encryption or inability to decrypt communications does not serve as an independent basis for liability. The previous Senate version of the bill, amended in July 2020, noted that providers that use end-to-end encryption or are unable to decrypt communications would not face liability purely “because” these cybersecurity protections are built into the platform. The 2020 House version of the bill included the same language as the current bill in both chambers of Congress.

The bill would also create a National Commission on Online Child Sexual Exploitation Prevention to develop best practices for platforms to respond to the online sexual exploitation of children. The commission would be composed of 19 members, with the attorney general, secretary of homeland security, and chair of the FTC as agency heads, plus 16 other members appointed in equal parts by the minority and majority leaders of the House and Senate. The commission would include representatives from law enforcement, civil society, and industry.

Bill name: Justice Against Malicious Algorithms Act of 2021
Sponsor: Rep. Frank Pallone Jr. (D-New Jersey)
Co-sponsors: Reps. Mike Doyle (D-Pennsylvania), Jan Schakowsky (D-Illinois), and Anna Eshoo (D-California)
Date introduced: Oct. 15, 2021
Status: Referred to the House Committee on Energy and Commerce
Category: Limiting the Scope
Summary: This bill lifts the Section 230 liability shield for internet platform companies that know or ought to have known that a personalized recommendation of third-party information made via algorithm or other technology “materially contributed to a physical or severe emotional injury to any person.” The proposed legislation homes in on personalized recommendations, which are defined as “material enhancement, using a personalized algorithm, of the prominence of such information with respect to other information.” It does not apply to recommendations made in response to a “user-specified” search, to platforms with fewer than 5 million monthly visitors, or to internet infrastructure services, such as web-hosting sites and data storage platforms.

Bill Name: A bill to repeal Section 230 of the Communications Act of 1934
Sponsor: Sen. Lindsey Graham (R–South Carolina)
Co-sponsors: Sens. Josh Hawley (R-Missouri), Marsha Blackburn (R-Tennessee)
Date introduced: Oct. 7, 2021
Status: Referred to the Senate Committee on Commerce, Science, and Transportation
Category: Repeal
Summary: The bill would repeal Section 230 of the Communications Act of 1934 in its entirety. Graham also introduced this bill in the 116th Congress as S.5020.

Bill name: Federal Big Tech Tort Act
Sponsor: Rep. Lance Gooden (R-Texas)
Date introduced: Sept. 30, 2021
Status: Referred to the House Committee on the Judiciary
Category: Limiting the Scope
Summary: Although the text of this bill does not explicitly claim to amend Section 230, it would limit the scope of Section 230 immunity for online platforms by establishing a federal tort against social media companies that cause “bodily injury to children or harm the mental health of children.” Social media companies would be liable to any individual under the age of 16 who suffers bodily harm or injury to mental health while using their platforms, whether that harm was due in whole or in part to use of the service. A social media company would be able to claim an affirmative defense by introducing evidence that it took reasonable steps to ascertain the age of each user on its service and that it did not know and had no reason to know that the user was under 16 years of age.

Bill name: The Accountability for Online Firearms Marketplaces Act of 2021
Sponsor: Sen. Richard Blumenthal (D-Connecticut)
Co-sponsors: Sens. Dianne Feinstein (D-California), Sheldon Whitehouse (D-Rhode Island)
Date introduced: Sept. 13, 2021
Status: Referred to the Committee on Commerce, Science, and Transportation
Category: Limiting the Scope
Summary: This bill seeks to remove Section 230 protections for online firearms marketplaces. An online firearms marketplace is defined as an “interactive computer service” that 1) facilitates transactions involving firearms and related equipment, 2) advertises or makes available any posting or listing proposing the transfer of firearms, or 3) makes available digital instructions that can automatically program a 3D printer to produce a firearm. This bill would apply even if the above activities were prohibited by the service’s terms of use.

Bill name: Health Misinformation Act of 2021
Sponsor: Sen. Amy Klobuchar (D-Minnesota)
Co-sponsor: Sen. Ben Luján (D-New Mexico)
Date introduced: July 22, 2021
Status: Referred to the Committee on Commerce, Science, and Transportation
Category: Limiting the Scope
Summary: This bill would create an exception to Section 230 liability protections for platforms that use algorithms to promote health misinformation. This exemption would not apply if platforms use algorithmic methods that are “neutral,” such as ordering content chronologically. The bill would take effect for the remainder of the public health emergency and would require the secretary of health and human services to issue guidance regarding the definition of health misinformation within 30 days after the enactment of the bill.

Bill name: Preserving Political Speech Online Act
Sponsor: Sen. Steve Daines (R-Montana)
Co-sponsors: None
Date introduced: July 14, 2021
Status: Referred to the Committee on Commerce, Science, and Transportation
Category: Good Samaritan
Summary: This act would amend Section 230 by limiting liability protections under the Good Samaritan provision. Currently, platforms receive Section 230 protections only when they remove content “in good faith” that they consider to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable. Under the Preserving Political Speech Online Act, acceptable reasons for “good faith” removal would be limited to content that is obscene, illegal, or excessively violent. This bill also introduces the idea of “bad faith” moderation, which it defines as blocking content on the grounds of race, religion, sex, national origin, or political affiliation or speech. Services are exempt from this “bad faith” clause if they are “dedicated to a specific set of issues, policies, beliefs, or viewpoints.”

Unrelated to Section 230, this act also includes several requirements for political advertising on platforms.

Bill name: The Disincentivizing Internet Service Censorship of Online Users and Restrictions on Speech and Expression Act (DISCOURSE)
Sponsor: Sen. Marco Rubio (R-Florida)
Co-sponsors: None
Date introduced: June 24, 2021
Status: Referred to the Committee on Commerce, Science, and Transportation
Category: Good Samaritan, Limiting the Scope
Summary: This bill would amend the Good Samaritan provision so that platforms would receive liability protections only when they moderate content that is obscene, lewd, lascivious, filthy, excessively violent, harassing, or unlawful, or when it promotes terrorism, violent extremism, or self-harm. The determination of whether content fits into those categories must be based on an “objectively reasonable belief.” The bill would also add a religious liberties clause that would make it more difficult for platforms to receive Section 230 protections when they moderate content in a way that “burdens” religious exercise.

In addition, the bill would classify platforms with a “dominant market share” as “information content providers” if they 1) use algorithms to target third-party content, 2) promote or suppress a “discernible viewpoint,” or 3) solicit, comment on, fund, contribute to, or modify information provided by another person. This change would eliminate Section 230 protections for dominant platforms in these circumstances. The act would also prevent dominant platforms from using Section 230 as a defense if they fail to notify customers of available parental control protections, as Section 230 currently requires.

Bill name: Protect Speech Act
Sponsor: Rep. Jim Jordan (R-Ohio)
Co-sponsors: Reps. Tom McClintock (R-California), Dan Bishop (R-North Carolina), Thomas Tiffany (R-Wisconsin), Victoria Spartz (R-Indiana), W. Gregory Steube (R-Florida), Mike Johnson (R-Louisiana), Scott Fitzgerald (R-Wisconsin), Darrell E. Issa (R-California), Michelle Fischbach (R-Minnesota), Burgess Owens (R-Utah), Vern Buchanan (R-Florida), Randy Weber, Sr. (R-Texas), Tom Rice (R-South Carolina)
Date introduced: June 11, 2021
Status: Referred to the House Committee on Energy and Commerce
Categories: Limiting the Scope, Good Samaritan
Summary: This bill would narrow a platform’s ability to use Section 230 as a defense for content removal. A platform could no longer use (c)(1) as a defense in cases where it removes content, and the bill would narrow the scope of (c)(2), the Good Samaritan provision. To use Section 230 as a defense, a platform would need to publicly state terms of service that detail the criteria used in content moderation decisions. Platforms would also need to comply with those stated terms of service and content moderation criteria, and would need to ensure that moderation decisions are not made on deceptive grounds. When content is restricted, platforms would need to provide a rationale and an opportunity for the user to respond, with certain exceptions for law enforcement and imminent threats to safety.

Bill name: Safeguarding Against Fraud, Exploitation, Threats, Extremism and Consumer Harms (SAFE TECH) Act
Sponsor: Rep. A. Donald McEachin (D-Virginia)
Co-sponsors: Reps. Kathy Castor (D-Florida) and Mike Levin (D-California)
Date introduced: May 20, 2021
Status: Referred to the House Committee on Energy and Commerce
Category: Limiting the Scope
Summary: This is a companion bill to S. 299, introduced by Sens. Mark Warner (D-Virginia), Mazie Hirono (D-Hawaii), and Amy Klobuchar (D-Minnesota) in February 2021. Under the SAFE TECH Act, companies would not be able to use Section 230 as a defense in cases related to ads or other content they are paid to make available. The act would also prevent a company from using Section 230 to prevent injunctive relief (e.g., a court order compelling the company to take some action) if it fails to “remove, restrict access to or availability of, or prevent dissemination of material” that could cause “irreparable harm.” The act would prevent platforms from using Section 230 as a defense in cases brought under several different federal and state laws, including civil rights, antitrust, stalking and harassment, human rights, and wrongful death laws. The SAFE TECH Act also puts the burden on companies that wish to use Section 230 as a defense to prove that they are “a provider or user of an interactive computer service” and that they are “being treated as the publisher or speaker of speech provided by another information content provider.”

Bill name: 21st Century Foundation for the Right to Express and Engage in Speech Act (21st Century FREE Speech Act)
Sponsor: Sen. Bill Hagerty (R-Tennessee)
Co-sponsors: None
Date introduced: April 27, 2021
Status: Referred to the Committee on Commerce, Science, and Transportation
Category: Repeal; Good Samaritan
Summary: The 21st Century FREE Speech Act would repeal Section 230 in its entirety. In its place, the bill would reclassify platforms covered under Section 230 as common carriers. Currently, platforms are allowed to determine which content and users can remain on the platform, and platforms such as Facebook and Twitter are under no legal obligation to provide their services to everyone. In contrast, a common carrier is required to provide its services to everyone, provided that basic criteria are met. For example, a telephone company can be considered a common carrier, as the company cannot arbitrarily prevent certain customers from placing calls or sending texts, provided that the customers have met the basic standard of paying for the plan. The reclassification of platforms as common carriers would impose additional responsibilities for platforms to provide their services to anyone.

This bill outlines obligations for “common carrier technology companies,” which are interactive computer services that offer services to the public and have at least 100 million global active monthly users. As common carrier technology companies, they would be required to provide their service to anyone without discriminating against individual users or classes of users and without discriminating based on political or religious affiliation or region. This bill would also require that platforms publicly disclose their policies regarding content moderation, promotion, curation, and account suspension.

Much like Section 230, this bill states that no provider or user of an interactive computer service will be treated as the publisher or speaker of content provided from another source. However, that treatment would not apply to any situation in which a platform takes action to change the visibility of content—through recommending content or restricting access, for example—thereby limiting the scope of the protections. The bill would also establish liability protections for the good-faith removal of content that is “obscene, lewd, lascivious, filthy, excessively violent, harassing, promoting self-harm, or unlawful, whether or not such material is constitutionally protected.” However, the liability protections would apply only to companies that remove content in accordance with their posted content moderation policies. The bill would establish a private right of action, and states could also bring investigations or lawsuits based on the provisions of the bill.

Bill name: Protecting Americans From Dangerous Algorithms Act
Sponsors: Reps. Tom Malinowski (D-New Jersey), Anna Eshoo (D-California)
Co-sponsors: Reps. Sean Casten (D-Illinois), Jason Crow (D-Colorado), Suzan DelBene (D-Washington), Mark DeSaulnier (D-California), Ted Deutch (D-Florida), Sara Jacobs (D-California), Barbara Lee (D-California), Joe Neguse (D-Colorado), Dean Phillips (D-Minnesota), Haley Stevens (D-Michigan), Debbie Wasserman Schultz (D-Florida), Peter Welch (D-Vermont)
Date introduced: March 23, 2021
Status: Referred to the House Committee on Energy and Commerce
Category: Limiting the Scope
Summary: The Protecting Americans From Dangerous Algorithms Act, reintroduced from the 116th Congress, would prevent platforms from using Section 230 as a defense in cases related to civil rights violations and terrorist acts if the companies use algorithms to disseminate and amplify the content at issue. Companies may still use Section 230 as a defense in these cases if they distribute content using methods that are “obvious, understandable, and transparent” to a reasonable user, including chronological order, alphabetical order, average user rating rankings, and sorting based on number of user reviews. The bill exempts cases where “a user specifically searches for” information and an algorithm or other computational process aids in returning search results. This bill differs from the 2020 version by exempting providers of internet infrastructure services (including web hosting, domain registration, content delivery networks, caching, data storage, and cybersecurity) from liability under this provision. The bill provides a small-business exemption for companies with fewer than 10 million users in at least three of the past 12 months. In the 2020 version of the bill, the small-business exemption applied to companies with fewer than 50 million unique monthly users or visitors for at least six of the preceding 12 months.

Bill Name: Stop Shielding Culpable Platforms Act
Sponsor: Rep. Jim Banks (R-Indiana)
Co-sponsors: Reps. Thomas Tiffany (R-Wisconsin), Guy Reschenthaler (R-Pennsylvania), Andy Barr (R-Kentucky), Ralph Norman (R-South Carolina), Randy K. Weber, Sr. (R-Texas), Dan Bishop (R-North Carolina), Brian Babin (R-Texas), Bob Gibbs (R-Ohio), Lisa C. McClain (R-Michigan)
Date introduced: March 18, 2021
Status: Referred to House Energy and Commerce Committee
Category: Imposing New Obligations
Summary: The act would amend Section 230 to clarify that it does not prevent a provider or a user of an interactive computer service from being treated as the distributor of information provided by another information content provider.

Bill name: Platform Accountability and Consumer Transparency (PACT) Act
Sponsors: Sens. Brian Schatz (D-Hawaii) and John Thune (R–South Dakota)
Date introduced: March 17, 2021
Status: Read twice and referred to the Senate Committee on Commerce, Science, and Transportation
Category: Imposing New Obligations
Summary: The PACT Act was originally introduced in the 2019-2020 congressional session. This is an updated version that was reintroduced with meaningful changes in the 2021-2022 congressional session. Most notably, the updated bill carves out exceptions for individual providers, outlines scalable requirements based on a platform’s revenue and size, and clarifies platform obligations regarding the complaint system, the phone line, and the transparency report.

Under the PACT Act, to receive Section 230 immunity, interactive computer service providers would be required to publish an acceptable use policy that would detail the types of content the provider allows, explain how the provider enforces its content policies, and describe how users can report policy-violating or illegal content. The PACT Act would also require providers to establish call centers with a live representative to assist users with the process of filing good-faith complaints eight hours per day, five days per week; to provide an email address to which users can submit complaints; and to create an easy-to-use complaint filing system that would allow users to file and track complaints and appeals. Because the act would require users to make complaints in good faith, providers would be permitted to filter complaints for spam, trolls, and abusive complaints.

The PACT Act would also require that a provider review and remove illegal and/or policy-violating content in a timely manner to receive Section 230 protections. Providers would be required to remove illegal content (as determined by a court) within four days of being put on notice of the illegal content and to initiate removal action for content that violates the provider’s publicized acceptable use policy within 14 days of receiving the complaint. Platforms would then be required to notify users that the provider had removed the content, give an explanation, and allow the user the opportunity to appeal the decision.

Providers must also issue biannual transparency reports, which would include the number of content-related complaints filed by users, the number of times the provider acted upon those complaints and the method of enforcement, and the number of appeals filed by users.

Under the PACT Act, small businesses that receive fewer than 1 million unique monthly visitors and have accrued revenue of $50 million or less would be exempt from the live call center requirement and would have softened time constraints for processing complaints. Small businesses would be allowed to process complaints of illegal and/or policy-violating content within four and 21 days, respectively. Individual providers—like independent bloggers—who have fewer than 100,000 unique monthly visitors and accrued revenue of less than $1 million would have minimal requirements under the act. They would only be required to provide users with a contact system to alert the provider about content on their sites, and they would have to remove illegal content within four days of notice. The PACT Act also exempts internet infrastructure companies from all provisions of the bill described above.

Bill name: Abandoning Online Censorship (AOC) Act
Sponsor: Rep. Louie Gohmert (R-Texas)
Co-sponsors: None
Date introduced: Feb. 5, 2021
Status: Referred to the House Committee on Energy and Commerce
Category: Repeal
Summary: This bill is an updated version of the AOC Act introduced in the 2019-2020 congressional session. The Abandoning Online Censorship (AOC) Act would repeal Section 230.

Bill Name: Safeguarding Against Fraud, Exploitation, Threats, Extremism, and Consumer Harms (SAFE TECH) Act 
Sponsor: Announced by Sens. Mark Warner (D-Virginia), Mazie Hirono (D-Hawaii), Amy Klobuchar (D-Minnesota)
Co-sponsor: None
Date introduced: Announced Feb. 5, 2021
Status: Announced but not yet introduced
Category: Limiting the Scope
Summary: Under the SAFE TECH Act, companies would not be able to use Section 230 as a defense for issues related to ads or other content they are paid to make available. The act would also prevent a company from using Section 230 to prevent injunctive relief (e.g., a court order compelling the company to take some action) arising from the company’s failure to “remove, restrict access to or availability of, or prevent dissemination of material” that could cause irreparable harm. The act would prevent platforms from using Section 230 as a defense in cases brought under several different legal bases, including civil rights, antitrust, stalking and harassment, human rights, and wrongful death. The SAFE TECH Act also requires companies that wish to use Section 230 as a defense to prove that they are “a provider or user of an interactive computer service,” and that they are “being treated as the publisher or speaker of speech provided by another information content provider.”

Bill Name: See Something, Say Something Online Act of 2021
Sponsor: Sen. Joe Manchin (D-West Virginia)
Co-sponsor: Sen. John Cornyn (R-Texas)
Date introduced: Jan. 22, 2021
Status: Read twice and referred to the Senate Committee on Commerce, Science, and Transportation
Category: Imposing New Obligations
Summary: This bill is an updated version of the See Something, Say Something Online Act of 2020 introduced in the 2019-2020 congressional session. The See Something, Say Something Online Act of 2021 would require interactive computer services to report to the Department of Justice suspicious transmissions that they detect showing individuals or groups planning, committing, promoting, or facilitating terrorism, serious drug offenses, or violent crimes. The bill only requires providers to report suspicious transmissions they detect; it would not require providers to scan all content on their sites to identify such transmissions. Providers would have to take “reasonable steps” to prevent and address suspicious transmissions. Any provider that fails to report a suspicious transmission of which it should have reasonably been aware would not be able to use Section 230 as a defense and could be held liable as a publisher for that transmission.

Bill name: Curbing Abuse and Saving Expression in Technology (CASE-IT) Act
Sponsor: Rep. W. Gregory Steube (R-Florida)
Co-sponsors: Reps. Madison Cawthorn (R–North Carolina), Kevin Hern (R-Oklahoma), Ashley Hinson (R-Iowa), Jefferson Van Drew (R–New Jersey)
Date introduced: Jan. 12, 2021
Status: Referred to the House Committee on Energy and Commerce
Category: Imposing New Obligations, Good Samaritan
Summary: This bill is an updated version of the CASE-IT Act introduced in the 2019-2020 congressional session. The CASE-IT Act would prevent a company from using Section 230 as a defense for a period of one year if that company “creates, develops, posts, materially contributes to, or induces another person to create, develop, post, or materially contribute to illegal online content.” The CASE-IT Act would also require companies with market dominance that want to use Section 230 as a defense to follow content moderation policies that are consistent with the First Amendment. Companies could be considered “dominant” regardless of whether they have actual monopoly power. The act also includes a private right of action provision that would allow users to bring a civil action against dominant companies that fail to follow content moderation policies consistent with the First Amendment.

Bill name: Protecting Constitutional Rights From Online Platform Censorship Act 
Sponsor: Rep. Scott DesJarlais (R-Tennessee)
Co-Sponsor: None
Date introduced: Jan. 4, 2021
Status: Referred to the House Committee on Energy and Commerce
Category: Good Samaritan
Summary: The Protecting Constitutional Rights From Online Platform Censorship Act would strike the Good Samaritan blocking provision of Section 230 and would make it unlawful for any internet platform to restrict access to or the availability of content. Under the act, a user could sue an internet platform that restricts access to content and recover monetary relief of no less than $10,000 but no more than $50,000 per action.

Bills from the 116th Congress

Bill name: A bill to amend the Internal Revenue Code of 1986 to increase the additional 2020 recovery rebates, to repeal Section 230 of the Communications Act of 1934, and for other purposes.
Sponsor: Sen. Mitch McConnell (R-Kentucky)
Co-sponsor: Sen. David Perdue (R-Georgia)
Date introduced: Dec. 29, 2020
Status: Read twice and placed on Senate Legislative Calendar under General Orders; no longer active
Category: Repeal
Summary: This proposal was part of the negotiation over COVID-19 relief in December 2020. It would increase the stimulus checks for COVID-19 from $600 to $2,000 in exchange for repealing Section 230 in its entirety, along with additional proposals related to election integrity.

Bill Name: A bill to repeal Section 230 of the Communications Act of 1934
Sponsor: Sen. Lindsey Graham (R–South Carolina)
Co-sponsors: None
Date introduced: Dec. 15, 2020
Status: Read twice and referred to the Senate Committee on Commerce, Science, and Transportation; no longer active
Category: Repeal
Summary: The bill would repeal Section 230 of the Communications Act of 1934.

Bill name: Holding Sexual Predators and Online Enablers Accountable Act of 2020
Sponsor: Sen. Kelly Loeffler (R-Georgia)
Co-sponsors: Sen. Tom Cotton (R-Arkansas)
Date introduced: Dec. 11, 2020
Status: Read twice and referred to the Senate Committee on the Judiciary; no longer active
Category: Limiting the Scope
Summary: The Holding Sexual Predators and Online Enablers Accountable Act of 2020 would strip Section 230 liability protections from any company that willfully or recklessly promotes or facilitates child exploitation. Anyone who owns, manages, or operates an online platform that violates this rule would face fines and imprisonment of up to 25 years.

Bill name: Abandoning Online Censorship (AOC) Act
Sponsor: Rep. Louie Gohmert (R-Texas)
Co-sponsors: Reps. Andy Biggs (R-Arizona), Tom McClintock (R-California), Doug Lamborn (R-Colorado), Lance Gooden (R-Texas), Steve King (R-Iowa), Trent Kelly (R-Mississippi), Bob Gibbs (R-Ohio)
Date introduced: Dec. 8, 2020
Status: Referred to the House Committee on Energy and Commerce; no longer active
Category: Repeal
Summary: The Abandoning Online Censorship (AOC) Act would repeal Section 230.

Bill name: Curbing Abuse and Saving Expression in Technology (CASE-IT) Act
Sponsor: Rep. W. Gregory Steube (R-Florida)
Co-sponsor: Rep. Kevin Hern (R-Oklahoma)
Date introduced: Oct. 30, 2020
Status: Referred to the House Committee on Energy and Commerce; no longer active
Category: Imposing New Obligations, Good Samaritan
Summary: The CASE-IT Act would prevent a company from using Section 230 as a defense for a period of one year if that company “creates, develops, posts, materially contributes to, or induces another person to create, develop, post, or materially contribute to illegal online content.” The CASE-IT Act would also require companies with market dominance that want to use Section 230 as a defense to follow content moderation policies consistent with the First Amendment; companies could be considered “dominant” regardless of whether they have actual monopoly power. The act also includes a private right of action that would allow users to bring civil suits against dominant companies that fail to follow content moderation policies consistent with the First Amendment.

Bill name: Stop Suppressing Speech Act of 2020
Sponsor: Sen. Kelly Loeffler (R-Georgia)
Co-sponsors: None
Date introduced: Oct. 21, 2020
Status: Read twice and referred to the Senate Committee on Commerce, Science, and Transportation; no longer active
Category: Limiting the Scope, Good Samaritan
Summary: The Stop Suppressing Speech Act of 2020 would amend subsection (c)(2) so that companies would receive liability protection for moderating content only if that content falls within one of three categories: harassment, illegal content, or violence and terrorism.

Bill name: Protecting Americans From Dangerous Algorithms Act 
Sponsor: Rep. Tom Malinowski (D–New Jersey)
Co-sponsor: Rep. Anna Eshoo (D-California)
Date introduced: Oct. 20, 2020
Status: Referred to the House Committee on Energy and Commerce; no longer active
Category: Limiting the Scope
Summary: Under the Protecting Americans From Dangerous Algorithms Act, a company could not use Section 230 as a defense in cases brought for civil rights violations or acts of international terrorism if the company used an algorithm to amplify or recommend content relating directly to the case. Companies could still use Section 230 as a defense in these cases if they sort information so it is delivered to the user in chronological or alphabetical order, or ranked by average user rating or number of user reviews. This exception to Section 230 protection does not apply to small businesses with 50 million or fewer unique monthly users over the previous 12 months.

Bill name: Protect Speech Act
Sponsor: Rep. Jim Jordan (R-Ohio)
Co-sponsors: Reps. James Sensenbrenner (R-Wisconsin), Louie Gohmert (R-Texas), Doug Collins (R-Georgia), Ken Buck (R-Colorado), Andy Biggs (R-Arizona), Tom McClintock (R-California), Debbie Lesko (R-Arizona), Guy Reschenthaler (R-Pennsylvania), Ben Cline (R-Virginia), W. Gregory Steube (R-Florida), Thomas Tiffany (R-Wisconsin), Kevin Hern (R-Oklahoma), Mike Johnson (R-Louisiana), Jim Hagedorn (R-Minnesota), Bill Posey (R-Florida), John Rutherford (R-Florida), Vern Buchanan (R-Florida), John Rose (R-Tennessee)
Date introduced: Oct. 2, 2020
Status: Referred to the House Committee on Energy and Commerce; no longer active
Category: Imposing New Obligations
Summary: This bill would remove the term “otherwise objectionable” from the list of reasons for which companies can remove or restrict access to content in good faith without losing the ability to use Section 230 as a defense, replacing it with more specific types of content that could be removed, including content that is illegal, promotes terrorism or violent extremism, or promotes self-harm. The bill would also require platforms that want to use Section 230 as a defense to publicize their terms of service and the criteria used in their content moderation practices, comply with those stated practices, not restrict content on deceptive or inconsistent grounds, and, except in certain cases related to public safety, provide notice to content providers when access to their content or its availability is restricted.

Bill name: Don’t Push My Buttons Act (House Version)
Sponsor: Rep. Paul Gosar (R-Arizona)
Co-sponsors: Reps. Tulsi Gabbard (D-Hawaii), Louie Gohmert (R-Texas), Eric Crawford (R-Arkansas)
Date introduced: Oct. 2, 2020
Status: Referred to the House Committee on Energy and Commerce; no longer active
Category: Limiting the Scope
Summary: Identical to Sen. John Kennedy’s bill of the same name, the Don’t Push My Buttons Act provides that companies cannot use Section 230 as a defense if they collect user data and then use the data in an algorithm that delivers content to the user, unless a user knowingly and intentionally elects to receive such tailored content.

Bill name: Don’t Push My Buttons Act (Senate Version) 
Sponsor: Sen. John Kennedy (R-Louisiana)
Co-sponsors: None
Date introduced: Sept. 29, 2020
Status: Read twice and referred to the Senate Committee on Commerce, Science, and Transportation; no longer active
Category: Limiting the Scope
Summary: Identical to Rep. Paul Gosar’s bill of the same name, the Don’t Push My Buttons Act provides that companies cannot use Section 230 as a defense if they collect user data and then use the data in an algorithm that delivers content to the user, unless a user knowingly and intentionally elects to receive such tailored content.

Bill name: See Something, Say Something Online Act of 2020
Sponsor: Sen. Joe Manchin (D–West Virginia)
Co-sponsor: Sen. John Cornyn (R-Texas)
Date introduced: Sept. 29, 2020
Status: Read twice and referred to the Senate Committee on Commerce, Science, and Transportation; no longer active
Category: Imposing New Obligations
Summary: The See Something, Say Something Online Act of 2020 would require interactive computer services to report suspicious transmissions that they detect and that show individuals or groups planning, committing, promoting, and facilitating terrorism; serious drug offenses; and violent crimes to the Department of Justice. The bill only requires providers to report suspicious transmissions they detect. It would not require providers to scan all content on their site to identify these transmissions. Providers would have to take “reasonable steps” to prevent and address such suspicious transmissions. Any provider that fails to report a suspicious transmission of which the provider should have reasonably been aware would not be able to use Section 230 as a defense and could be held liable as a publisher for the suspicious transmission.

Bill name: Online Freedom and Viewpoint Diversity Act
Sponsor: Sen. Roger Wicker (R-Mississippi)
Co-sponsors: Sens. Lindsey Graham (R–South Carolina), Marsha Blackburn (R-Tennessee)
Date introduced: Sept. 8, 2020
Status: Read twice and referred to the Senate Committee on Commerce, Science, and Transportation; no longer active
Category: Good Samaritan
Summary: The Online Freedom and Viewpoint Diversity Act would amend Section 230 so that companies would receive liability protection only if they meet an objective reasonableness standard when moderating content. This standard would replace the term “otherwise objectionable” in subsection (c)(2); instead, a company would receive Section 230 protection only if the content it moderates objectively falls within certain categories, such as content that is illegal or promotes terrorism or self-harm.

Bill name: Stop the Censorship Act of 2020
Sponsor: Rep. Paul Gosar (R-Arizona)
Co-sponsors: Reps. Doug Collins (R-Georgia), Ralph Norman (R–South Carolina), Lance Gooden (R-Texas), Steve King (R-Iowa), Jim Banks (R-Indiana), Matt Gaetz (R-Florida), Ted S. Yoho (R-Florida), Thomas P. Tiffany (R-Wisconsin), Ron Wright (R-Texas), Glenn Grothman (R-Wisconsin), Eric A. “Rick” Crawford (R-Arkansas), Brian Babin (R-Texas), Doug Lamborn (R-Colorado), Bob Gibbs (R-Ohio), Andy Biggs (R-Arizona), Ross Spano (R-Florida), Fred Keller (R-Pennsylvania), James R. Baird (R-Indiana), Doug LaMalfa (R-California), Jody B. Hice (R-Georgia), Tulsi Gabbard (D-Hawaii), Robert J. Wittman (R-Virginia), John W. Rose (R-Tennessee)
Date introduced: July 29, 2020
Status: Referred to the House Committee on Energy and Commerce; no longer active
Category: Limiting the Scope, Good Samaritan
Summary: The Stop the Censorship Act of 2020 would amend Section 230 to prevent companies from using Section 230 as a defense if they act to restrict access to or availability of material they consider to be “objectionable.” Instead, the bill would more narrowly allow companies to block content that is “unlawful, or that promotes violence or terrorism.” The act also includes protections for platforms that provide users with tools to restrict the type of material they see on the platform.

Bill name: Stopping Big Tech’s Censorship Act 
Sponsor: Sen. Kelly Loeffler (R-Georgia)
Co-sponsor: None
Date introduced: June 24, 2020
Status: Read twice and referred to the Senate Committee on Commerce, Science, and Transportation; no longer active
Category: Good Samaritan
Summary: The Stopping Big Tech’s Censorship Act would require companies to take reasonable steps to prevent and address any unlawful use of, or unlawful publication on, their platforms by their users. The act also states that a company that restricts access to or availability of constitutionally protected material cannot use Section 230 as a defense unless the action is taken in a viewpoint-neutral manner; the restriction limits only the time, place, or manner in which the material is available; and there is a compelling reason for restricting that access or availability. If a company places any restrictions on material, it must explain that decision.

Bill name: Platform Accountability and Consumer Transparency (PACT) Act
Sponsor: Sen. Brian Schatz (D-Hawaii)
Co-sponsor: Sen. John Thune (R–South Dakota)
Date introduced: June 24, 2020 (reintroduced March 2, 2021)
Status: Read twice and referred to the Senate Committee on Commerce, Science, and Transportation; no longer active
Category: Imposing New Obligations
Summary: Under the PACT Act, to receive Section 230 immunity, platforms would be required to publish an acceptable use policy that would detail the types of content the platform allows, explain how the platform enforces its content policies, and describe how users can report policy-violating or illegal content. The PACT Act would also require platforms to establish call centers with a live representative to take user complaints eight hours per day, five days per week; to provide an email address to which users can submit complaints; and to create an easy-to-use complaint filing system that would allow users to file and track complaints and appeals.

The PACT Act would also require that a platform review and remove illegal and/or policy-violating content in a timely manner to receive Section 230 protections. Platforms would be required to remove illegal content (as determined by a court) within 24 hours and to remove content that violates the platform’s publicized acceptable use policy within 14 days. Platforms would then be required to notify users that the platform had removed the content, provide an explanation, and give the user the opportunity to appeal the decision.

Platforms must also issue quarterly transparency reports, which would include the number of content-related complaints filed by users, the number of times the platform acted upon those complaints and the method of enforcement, and the number of appeals filed by users.

Under the PACT Act, small businesses that receive fewer than 1 million monthly visitors and have accrued revenue of $2 million or less are exempt from the live call center requirement and face softened time constraints for processing complaints. Small businesses would be allowed to process complaints of illegal and/or policy-violating content within a reasonable period of time. The PACT Act also exempts internet infrastructure companies from all of the provisions described above.

Bill name: Ending Support for Internet Censorship Act
Sponsor: Sen. Josh Hawley (R-Missouri)
Co-sponsors: None
Date introduced: June 19, 2020
Status: Read twice and referred to the Senate Committee on Commerce, Science, and Transportation; no longer active
Category: Imposing New Obligations, Good Samaritan
Summary: The act would withdraw Section 230 protections unless a company obtained an immunity certification from the Federal Trade Commission. A company would receive an immunity certification if it proved by clear and convincing evidence that it does not, and did not in the preceding two-year period, moderate user content in a politically biased manner. There are two exceptions to the immunity certification process. The first applies when there is a business necessity for restricting speech that would be protected under the First Amendment, no available alternatives would have a less disproportionate effect on that speech, and the provider did not act with the intent to discriminate based on political affiliation, political party, or political viewpoint. The second applies when an employee’s actions clearly show that the employee acted in a biased manner and the employer terminated or otherwise disciplined the employee.

Bill name: Limiting Section 230 Immunity to Good Samaritans Act
Sponsor: Sen. Josh Hawley (R-Missouri)
Co-sponsors: Sens. Marco Rubio (R-Florida), Tom Cotton (R-Arkansas), Kelly Loeffler (R-Georgia)
Date introduced: June 17, 2020
Status: Read twice and referred to the Senate Committee on Commerce, Science, and Transportation; no longer active
Category: Good Samaritan
Summary: The act would provide a more specific definition of “good faith” in subsection (c)(2). It would define “good faith” as when “the provider acts with an honest belief and purpose, observes fair dealing standards, and acts without fraudulent intent.”

Bill name: Eliminating Abusive and Rampant Neglect of Interactive Technologies (EARN IT) Act
Sponsor: Sen. Lindsey Graham (R–South Carolina)
Co-sponsors: Sens. Richard Blumenthal (D-Connecticut), Josh Hawley (R-Missouri), Dianne Feinstein (D-California), Kevin Cramer (R–North Dakota), Doug Jones (D-Alabama), Joni Ernst (R-Iowa), Bob Casey (D-Pennsylvania), Sheldon Whitehouse (D–Rhode Island), Dick Durbin (D-Illinois), John Kennedy (R-Louisiana), Ted Cruz (R-Texas), Chuck Grassley (R-Iowa), Rob Portman (R-Ohio), Lisa Murkowski (R-Alaska), John Cornyn (R-Texas), Kelly Loeffler (R-Georgia)
Date introduced: March 5, 2020 (significantly amended July 20, 2020)
Status: Placed on Senate Legislative Calendar under General Orders; no longer active
Category: Limiting the Scope
Summary (of the amended version): The EARN IT Act would amend Section 230 so that platforms cannot use Section 230 as a defense in state criminal cases, or in federal and state civil cases, regarding the proliferation of child sexual abuse material. The amended version adds a provision noting that providers that use end-to-end encryption or are unable to decrypt communications will not face liability purely “because” these cybersecurity protections are built into the platform; the bill as originally introduced raised concerns that platforms might be held liable for content they are unable to decrypt. The bill would also create a National Commission on Online Child Sexual Exploitation Prevention to develop best practices for platforms responding to the online sexual exploitation of children.

Any views expressed by the authors are their own and do not necessarily reflect the official policy or position of any public or private entity to which the authors may be affiliated.

This article has been updated to include the See Something, Say Something Online Act of 2021 and the Stop Shielding Culpable Platforms Act. It was also updated to note that all of the bills introduced in the 116th Congress are no longer pending.

Correction, March 23, 2021: This article originally misstated that news publishers can be liable when they serve as hosts for content. News publishers can use Section 230 as a defense against being liable when they serve as hosts for content.

Future Tense is a partnership of Slate, New America, and Arizona State University that examines emerging technologies, public policy, and society.