
House Antitrust Nondiscrimination Bill May Require Covered Platforms to Host Discrimination

In a previous post, DisCo discussed H.R. 3816, the “American Choice and Innovation Online Act,” the so-called “Nondiscrimination Bill,” which the House Judiciary Committee voted out of committee on June 23, 2021. During the markup, members on both sides of the aisle supported various of the six antitrust proposals under consideration, and some indicated that they voted for the bills in order to address the content moderation policies of a handful of digital services. While content moderation has no place in a competition policy debate, the question of whether competition policy can be used as a tool to address content moderation was never explicitly answered by the members of Congress at the markup. Yet when the panel voted to send this bill to the floor, policymakers made a drastic content moderation decision. H.R. 3816, which attempts to promote competition by regulating a handful of digital services, could have serious consequences for those services’ content moderation policies. The legislation may require them to host dangerous and reprehensible content, such as white supremacist or neo-Nazi websites and messaging, among a host of other malign content.

What is the bill supposed to do?

H.R. 3816 grants the Federal Trade Commission (FTC) the authority to designate a digital service as a “Covered Platform.” The bill treats the services offered by “Covered Platforms” as being of a distinct, critical nature, and it forces these companies to conduct business under an obligation of neutrality toward other competitors and service providers that might wish to operate on the “Covered Platforms.” To ensure this artificially created neutrality, the bill imposes limitations and obligations on the normal conduct of business of “Covered Platforms.” Among other things, it prohibits them from engaging in self-preferencing, discrimination, and exclusionary arrangements. The bill also provides for private rights of action for violations. It is this forced obligation to treat the content of other competitors and service providers neutrally that would require digital services to host all kinds of content, even harmful content.

What’s the problem with the bill and how does it affect content moderation?

While DisCo has discussed the problems with this bill in previous posts, this post raises another one: content moderation. The bill has major implications for the content moderation decisions of “Covered Platforms”; its language would allow offensive content to proliferate and remain hosted on their digital services.

Specifically, Section 2 of the bill provides:

It shall be unlawful for a person operating a covered platform, in or affecting commerce, to engage in any conduct in connection with the operation of the covered platform that—

(1) advantages the covered platform operator’s own products, services, or lines of business over those of another business user;

(2) excludes or disadvantages the products, services, or lines of business of another business user relative to the covered platform operator’s own products, services, or lines of business; or

(3) discriminates among similarly situated business users.

Furthermore, Section 3 states that it is unlawful to:

(9) restrict or impede a business user, or a business user’s customers or users, from interoperating or connecting to any product or service; and

(10) retaliate against any business user or covered platform user that raises concerns with any law enforcement authority about actual or potential violations of State or Federal law.

While it may not be Congress’s intention, this language would have unintended consequences for content moderation, because the bill makes it a violation to “discriminate[] among similarly situated business users.” By implication, a “Covered Platform” must now carry websites that do not comply with its terms of service.

This bill would therefore require “Covered Platforms” to carry deplorable content, such as white supremacist propaganda or neo-Nazi websites, that they can presently remove when it violates their policies. Under H.R. 3816, “Covered Platforms” could not discriminate against certain content, i.e., remove websites, because doing so would treat them differently than “similarly situated business users.” Additionally, enforcement of H.R. 3816 falls to government entities such as “any Attorney General of a state,” which could open the door to lawsuits claiming that a digital service’s content moderation violated the law, and, as mentioned, private civil actions can also be brought.

Policymakers need to know that their vote on this bill has major consequences for content moderation, and that they have now opened the floodgates to reprehensible content that digital services must host. This so-called “Nondiscrimination” bill contradicts itself by requiring a handful of digital services to host discriminatory, reprehensible, and malign content.
