Charade by Commission: New Bill Threatens to Undermine Online Safety
Rising awareness of the dissemination of harmful online content has triggered both real and spurious concerns about technology companies' efforts to police and remove objectionable and illegal material from the Internet. Much of this ire has been directed at Section 230 of the Communications Decency Act, a commonly misunderstood law that provides the legal basis and certainty for Internet companies to remove a variety of illegal and otherwise harmful content without being held liable for third-party use of their services. Unfortunately, a bill introduced today, titled the “Eliminating Abusive and Rampant Neglect of Interactive Technologies” (EARN IT) Act, seeks to capitalize on concerns and misinformation about content moderation undertaken pursuant to Section 230 by tying these important intermediary liability protections to foundational design choices. If enacted, the bill could undermine the ability of businesses to provide safe and secure online services.
The Bill Undermines Vital Intermediary Liability Protections
Sometimes called the Internet’s “most important law,” Section 230 makes possible the diverse services available on the open Internet as we know it today. As Senator Wyden, one of Section 230’s original authors, has noted, the law provides not just a “shield” but also a “sword” for companies to remove objectionable online content without being dragged into court. This feature has been especially important for investment in and growth of America’s online startups, which could otherwise be crippled by frivolous lawsuits absent protection from liability for the actions of third parties. Pursuant to the certainty that Section 230 provides, the technology industry is taking increasingly proactive steps to find, remove, and report illegal and harmful content, and it routinely engages with law enforcement to assist with and respond to valid requests for information pertaining to criminal activity online.
Introduced by Senate Judiciary Chairman Graham along with several cosponsors, the bill would fundamentally reshape U.S. intermediary protections in the following ways:
- First, the bill would create a government-funded Commission under the auspices of the Attorney General with a mandate to develop “best practices” for preventing child sexual abuse material (CSAM) online. Section 230 liability protections for CSAM-related claims would be conditioned on a business certifying that they meet these “best practices.”
- Second, the bill would amend federal criminal law to permit lawsuits against companies that “recklessly” fail to implement the Commission’s “best practices.”
Protecting children from harm and abuse is among society’s most important moral imperatives. Unfortunately, this bill contains major oversights and shortcomings that would fail to advance that mission. First, contrary to sponsors’ claims, Section 230 does not impede federal prosecution of distributors of CSAM. Second, the bill threatens to undermine existing efforts to remove harmful content from online services by converting providers’ moderation decisions into state action. Finally, the bill proposes to delegate far-reaching authority to the Attorney General that appears designed to weaken the security and trustworthiness of online services.
The Bill Misses the Mark
In an effort to curb CSAM online, the bill sets out to weaken intermediary liability protections under Section 230. However, as currently constructed, Section 230 does not appear to frustrate efforts to protect children online. Online service providers are already required by federal law, 18 U.S.C. § 2258A, to report the discovery of CSAM to the National Center for Missing and Exploited Children (NCMEC) and to preserve certain information to enable subsequent prosecution. Furthermore, Section 230 explicitly provides no immunity against the enforcement of federal criminal law, including prosecutions relating to the sexual exploitation of children. Simply put, if a company is illegally facilitating or failing to meet its legal obligations to combat CSAM, federal prosecutors can already hold it accountable.
The Bill Is Primed to Backfire
In addition to providing negligible value to law enforcement, the bill could undermine existing efforts to police and prosecute harmful online content. As described, the bill would create a Commission under the auspices of the Attorney General to develop “best practices” that businesses must follow to continue to receive vital intermediary liability protections. This Commission would be a government-appointed and -convened entity, funded by the government, with ultimate authority to approve “best practices” resting with the government. Therefore, any moderation taken by companies pursuant to the Commission’s “best practices” could be viewed as state action, opening the door to Constitutional claims and defenses by the authors and distributors of illegal online content.
Under current law, the private sector has the flexibility and legal certainty to curtail bad actors on its own online services, but this bill threatens to limit that capacity. In the 2016 Tenth Circuit case U.S. v. Ackerman, then-Judge (now Supreme Court Justice) Gorsuch ruled that NCMEC was functionally a state actor and had committed a warrantless Fourth Amendment “search” by examining CSAM files reported by an online service provider. By the same logic, should this legislation be adopted, moderation decisions taken by businesses pursuant to the Commission’s “best practices” could be deemed state action to which Constitutional protections attach. In that scenario, content takedowns made to maintain conformity with the Commission’s “best practices” could be challenged on the basis of the First Amendment’s protections for free speech. Furthermore, due process protections and warrant requirements for searching communications under the Fourth Amendment could be implicated by businesses’ efforts to police and report illegal content, disrupting the prosecution of CSAM traffickers.
A Dangerous Blank Check
The functionally mandatory “best practices” that the bill directs the Attorney General-helmed Commission to develop are far-reaching and could be used to compel businesses to redesign critical components of their services. For example, the Commission could mandate the imposition of invasive age verification requirements that have already been found contrary to the First Amendment when sought directly through legislation. The Commission could also restrict the ability of providers to offer communication services protected by encryption, which is already an explicit goal of the current Attorney General. As DisCo has previously addressed, strong encryption is increasingly vital for national security, U.S. economic competitiveness, and individuals’ physical and online safety. In sum, the Commission would have the leeway to constrain the features of online services in ways that are harmful to consumers and contrary to established law under the legal fiction that such “best practices” would be voluntary.
This bill rightly identifies some of the most pressing issues at the intersection of law enforcement and modern technology. However, the bill’s answer is to abandon Congress’ policymaking responsibility to an unaccountable, heavily skewed Commission that could pursue an end run around both laws on the books and well-established precedent, to the detriment of both individuals and American industry. The values at stake in keeping individuals safe online are too important to address by administrative chicanery; they deserve full consideration and engagement by impacted stakeholders through the Congressional process.