Yesterday brought another Congressional hearing on Section 230 of the Communications Decency Act. Following the Senate Commerce hearing several weeks ago on SESTA, yesterday’s hearing in the House Judiciary Subcommittee on Crime, Terrorism, Homeland Security, and Investigations focused on online sex trafficking more generally, rather than any specific legislation. Several witnesses, including former Rep. Chris Cox, a co-author of Section 230, provided expert views on the issue.
While the hearing was broader in scope, amendments to Section 230 were discussed, including the addition of a knowledge standard: why shouldn't online platforms incur sex-trafficking liability as soon as they have knowledge that their services are being misused?
This is a logical question: the much-reviled Backpage.com allegedly knew that criminal activity occurred on its site, yet removed objectionable words from ads in a way that obscured their illegality. Imposing liability upon knowledge would thus appear to respond directly to that problem.
However, this overlooks the fact that online platforms already monitor and moderate content for a host of problems extending far beyond sex trafficking, whether to prevent hate speech, child endangerment, or illegal transactions (such as narcotics or firearms sales), among many other subjects. In policing for abuse, reviewers constantly make judgment calls about short, ambiguous pieces of content that may be divorced from any context.
Attaching liability whenever an intermediary has knowledge of content would discourage services from policing for bad actors of all kinds. Missed calls happen, and if a missed call on a single posting could supply the requisite knowledge for a trafficking charge, many intermediaries might be advised by their lawyers to stop policing altogether. In that case, everyone loses.
Indeed, Prof. Eric Goldman discussed this issue at the Sept. 19 Senate hearing on SESTA. This problem is, after all, why Congress enacted Section 230 of the Communications Decency Act in the first place. Congress aimed to overturn the 1995 Stratton Oakmont v. Prodigy decision, in which a corrupt financial-services firm successfully sued the online service Prodigy over defamatory posts by a user, arguing that Prodigy's policing of objectionable content made it a publisher responsible for all content on its service.
If knowledge deprived intermediaries of legal protection, many would likely abandon voluntary moderation entirely for fear of criminal prosecution. Whatever disagreements exist about the right path forward, not returning to the Stratton Oakmont era is in everyone's interest.