Internet Users at Risk from State-by-State Regulation of Internet Content Moderation


Amid an ongoing federal conversation about digital services’ content moderation, several state policymakers are pursuing their own content moderation legislation. Last week, for example, Florida Governor DeSantis held a press conference endorsing the state legislature’s proposed bills targeting Internet companies because of their content policies. Because content removal is one means by which digital services protect the trust and safety of users and the public at large, these proposals stand to harm users, even as sponsors profess to protect them. While scholars and courts have generally regarded such proposals as preempted by federal law, including Section 230 and the First Amendment, policymakers, users, and digital services alike should be concerned about their impact.

As a Kentucky news article recently noted, some of these state legislative efforts may have come in response to services enforcing acceptable use policies against accounts held by former President Trump following his incitement of an insurrection at the U.S. Capitol last month. Whatever the impetus for this flurry of legislative proposals, the impact would resonate far beyond the events of January 6.

Some of the legislative proposals, including bills in Kentucky, Utah, Oklahoma, Florida, and Arkansas, appear to be based on text that has circulated in state capitals for several years under the moniker of the “Stop Social Media Censorship Act.” As local reporters have investigated in Arkansas and Utah, these proposals appear connected to a controversial activist whose other claims to fame include attempting to marry a laptop, ostensibly in protest of marriage equality. These bills would give Internet users the right to sue a “social media website” that deleted or suppressed their political or religious speech (but, strangely, not if the site is affiliated with a political party or religion, a limitation that raises conspicuous First Amendment flags). The consequences of these bills would reach far beyond political or religious speech. For example, content framed as political or religious speech could encourage practices contrary to public health, endangering people amid a pandemic and placing digital services in the position of choosing between avoiding litigation and protecting users from potential harm.

Other recent bills take different approaches. North Dakota’s House Bill 1144, for example, repurposes language from the federal Good Samaritan provision, Section 230, and would permit civil actions against social media sites for restricting content, with civil damages and attorney’s fees available not only to the poster of the information in question, but also to any “person that otherwise would have received the writing, speech, or publication.” This type of legislation could force services to second-guess every removal decision and inhibit rapid responses to emerging threats.

Arizona’s House Bill 2180 goes in a different direction. Under that proposal, a person engaged in the business of allowing online users to upload publicly accessible content on the Internet who exercises a level of control over the uploaded content for “politically biased reasons” would be (1) “deemed to be a ‘publisher’,” (2) “deemed to not be a ‘platform’,” and (3) liable for damages suffered by an online user because of the person’s actions, in a claim that could be brought by the state attorney general or the user. Additionally, the “publisher” would have to pay the attorney general an annual fee for each user in the state authorized to upload publicly accessible content to its service.

This bill starts from the erroneous notion that online sites must be either “publishers” or “platforms,” a misconception that is so prevalent it has become a form of pseudolaw.  Many online sites, of course, are both: newspapers are undoubtedly publishers, but are “platforms” with respect to their comment sections. Many websites regarded as “platforms” also create their own content, independent of user-generated content.  Neither term appears in the relevant federal law, Section 230(f), which refers only to “interactive computer services” and “information content providers.”  By starting with the mistaken publisher/platform distinction, however, the Arizona bill would result in some bizarre outcomes, such as requiring news publishers to pay fees in Arizona for operating a moderated comment section on their articles.

These state efforts are not realistically administrable and are likely preempted by federal law. Their broader impact, should they be enacted, would be to discourage digital services’ efforts to safeguard the trust and safety of Internet users by creating a patchwork of state-by-state regulatory obligations for companies attempting to suppress clearly objectionable content. Such breadth and vagueness in a statute would also invite an avalanche of frivolous litigation, precisely what Section 230 was designed to reduce so that American digital innovation could flourish.

These misguided efforts may also attach varying levels of state-by-state legal risk to companies’ efforts to restrict content that is likely lawful but potentially harmful, such as content promoting self-harm, religious intolerance, or foreign-originating misinformation about vaccines or the ongoing pandemic. It is for these reasons that Internet content regulation has long occurred at the federal level.
