New-Look ‘EARN IT’ Act Poses Same Threats to Online Safety
Back in March, DisCo discussed a well-intentioned but flawed bill titled the “Eliminating Abusive and Rampant Neglect of Interactive Technologies Act” (or ‘EARN IT Act’). In short, in seeking to press companies to do more to combat child sexual abuse material (CSAM) online, the legislation threatened to undermine cybersecurity protections that keep us all safe and to make prosecuting the distributors of CSAM more difficult. Responding to widespread, bipartisan concern about these serious problems, Senators rushed through significant amendments to the bill during a July markup. Unfortunately, despite changing the bill’s approach to addressing CSAM online, these amendments did little to mitigate the underlying threats that the EARN IT Act poses to online security, free speech, and the prosecution of bad actors. Now, with legislators in both chambers of Congress poised to consider the reformulated EARN IT Act, stakeholders should be cognizant of how the bill continues to threaten online safety.
What’s Changed in the New ‘EARN IT’ Act?
For starters, the ‘earn it’ backronym is now a misnomer. The original bill sought to empower a U.S. Attorney General-led Commission to promulgate so-called ‘best practices’ that organizations would be required to follow in order to preserve (or ‘earn’) important intermediary liability protections under Section 230 of the Communications Decency Act. As many noted at the time, the power and composition of the Commission made it a virtual certainty that these ‘best practices’ would functionally compel services to adopt design choices that would weaken the security of their products. The modified EARN IT Act retains this Commission, but cuts out its ‘man-in-the-middle’ function by directly carving out from intermediary liability protections a wide range of state law claims related to CSAM.
There is broad consensus that policymakers, industry, and law enforcement should work more closely together to combat the scourge of online child exploitation. Unfortunately, the current legislative focus on weakening Section 230 is not an effective approach for accomplishing this shared aim. Federal law, 18 U.S.C. § 2258A, already requires service providers to report the discovery of CSAM on their services and to take steps to assist in the prosecution of criminals who distribute and receive this material. Furthermore, Section 230 provides no impediment to federal prosecution of companies that illegally facilitate CSAM or fail to meet their legal obligations to combat it. In fact, while services are reporting millions of pieces of illegal content per year, subsequent prosecution of the perpetrators appears to be dwindling.¹
The application of consistent and predictable national standards of liability for the content generated and shared by third parties has powered the U.S. digital economy and given companies the legal certainty necessary to find and combat harmful online material. That is why Section 230 is sometimes called the Internet’s “most important law.” The modified EARN IT Act seeks to open companies to a range of criminal and civil litigation under a patchwork of state laws with reduced and untested standards of liability for their product design and content moderation choices. As a result, companies would lose predictability in their efforts to combat objectionable third-party content and protect their users, opening a Pandora’s box of potential unintended consequences.
1. Threats to Online Security
As DisCo readers know, strong encryption is critical for national security, a vibrant and competitive digital economy, and the online and physical safety of individuals, including children. The original EARN IT Act was called a “backdoor to a backdoor,” describing the drafters’ express intent to weaken encryption while offloading the unpopular work of actually passing legislation to make digital products and services less secure to a facially neutral Commission that in practice would be anything but. An amendment offered by Senator Leahy seeks to assuage these concerns by providing that liability shall not arise because a company utilizes encryption services. While a welcome step forward, the amendment is unlikely to effectively safeguard cybersecurity protections: it is underinclusive, overly vague, and has also been narrowed in the House companion bill.
First, the EARN IT Act would still create legal uncertainty and invite protracted litigation over whether a state liability claim arises “because” of the use of encryption (rather than with encryption as merely a contributing cause), seriously disincentivizing the development and use of these crucial protections, especially by small and medium-sized enterprises. Second, the bill would still enable state legislatures to enact new laws restricting the ability of services to design features that protect the privacy and security of users while arguably tiptoeing around the specific use of encryption protocols, such as requiring “client-side scanning” or the addition of “ghost users.” Finally, while the Commission’s ‘best practices’ no longer carry de facto regulatory authority, they remain a threat to online safety because they would likely be considered by courts in evaluating whether a company has behaved “negligently” or “recklessly” under state law.
2. Threats to Online Speech
The legal uncertainty the EARN IT Act would unleash for offering a service that can host or transmit user-generated content would in all likelihood cause some providers to decide not to introduce new features, or to shut down entirely, due to dramatically expanded liability risks. Providers that continue to offer these services will be incentivized to over-moderate and filter lawful content, shrinking the scope and diversity of Constitutionally protected free speech online. What’s more, experts anticipate that the suppressed expression resulting from the bill would disproportionately impact LGBTQ communities and children. Recent history provides support for these concerns. The SESTA/FOSTA legislation passed in 2018, which similarly carved user content facilitating sex trafficking out of Section 230 immunity, has already negatively impacted online speech and marginalized communities. The EARN IT Act threatens to cause even more harm by permitting broader state-level claims that are unmoored from any consistent federal standard of liability.
3. Threats to the Prosecution of Bad Actors
Under existing law, providers have the flexibility and legal certainty to pursue and report the hosting and transmission of illegal content on their services, which they do millions of times a year. Fourth Amendment jurisprudence necessitates that the government be very careful when legislating in this space. Any law with the effect of compelling providers to scan their services for CSAM and report their users to the federal government could be construed as transforming a service provider into a “state actor” for the purposes of that search, triggering Constitutional protections and warrant requirements. As a result, evidence of criminal activity that is currently detected voluntarily by providers could be thrown out of court. State laws made relevant under the EARN IT Act that create liability for “negligent” or “reckless” conduct may have the effect of compelling providers to search for CSAM on their services, raising “state actor” problems that would make prosecuting criminal activity more difficult.
The EARN IT Act targets a real and pressing problem facing law enforcement in the modern digital environment. Unfortunately, by remaining focused solely on Section 230, the bill continues to miss the mark when it comes to combating the exploitation of children and threatens serious unintended consequences for online safety. Keeping individuals, particularly children, safe online is a critical priority that will require full consideration and engagement by impacted stakeholders to find effective solutions.