Last month, we expressed concerns about how the STOP CSAM Act threatens encrypted communications and free speech online. New amendments to the bill make some improvements, but our concerns remain.

The STOP CSAM Act Should Not Use the EARN IT Act as a Template for How to Protect Encryption 

The amendments to the STOP CSAM Act make the bill similar to the EARN IT Act, which is to say, still highly dangerous for encryption.

TAKE ACTION

TELL CONGRESS NOT TO OUTLAW ENCRYPTED APPS

In their current versions, both the STOP CSAM and EARN IT bills remove Section 230 immunity for civil claims against internet intermediaries for injuries involving CSAM, although at the moment EARN IT creates the possibility of a broader range of state and federal claims against platforms (as well as broader criminal liability). 

Two main amendments limit the scope of the civil claim in the STOP CSAM Act. 

First, the latest amendments introduce the same misleading “encryption exception” found in the EARN IT Act. We’ve written at length about why the encryption exception is insufficient. Although the exception purports to protect online platforms from liability for offering encrypted services, it specifically allows the use of encryption to be introduced as evidence of the facilitation of illegal material. 

Second, the civil claim previously could be premised on “negligent” behavior by the platform. The amendments now require that the claim be premised on at least “reckless” behavior by the platform (or on intentional or knowing behavior). The removal of the low negligence standard is an improvement. But combined with the weak protection for encryption, the heightened standard still leaves it likely that plaintiffs will argue that companies that merely offer end-to-end encryption are “recklessly” enabling the sharing of illegal content on their platforms by failing to scan for and remove that content. That’s simply not how true end-to-end encryption works.
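To make the technical point concrete, here is a minimal sketch of an end-to-end encrypted exchange using the PyNaCl library. This is our own illustration, not anything drawn from the bill or any particular product, and the `relay_server` function is a hypothetical stand-in for a provider’s infrastructure. The point it shows: the provider relays only ciphertext and never holds a decryption key, so there is nothing intelligible for it to scan.

```python
# Minimal sketch of why an end-to-end encrypted provider cannot scan content.
# Uses PyNaCl (pip install pynacl). Names like `relay_server` are illustrative.
from nacl.public import PrivateKey, SealedBox

# The key pair is generated on the recipient's device; the provider
# never sees the private key.
recipient_key = PrivateKey.generate()

# The sender encrypts directly to the recipient's public key.
message = b"hello -- only the recipient can read this"
ciphertext = SealedBox(recipient_key.public_key).encrypt(message)

def relay_server(blob: bytes) -> bytes:
    # The provider's server relays the ciphertext. Holding no private key,
    # any hypothetical scanning step here would see only opaque bytes.
    print("server sees:", blob.hex()[:32], "...")
    return blob

delivered = relay_server(ciphertext)

# Only the recipient's device, which holds the private key, can decrypt.
plaintext = SealedBox(recipient_key).decrypt(delivered)
assert plaintext == message
```

The only ways to give the provider something scannable are to weaken the encryption itself or to move scanning onto users’ devices, either of which defeats the security guarantee that end-to-end encryption exists to provide.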

The amendments also limit the bill’s most dangerous criminal provision, which allows platforms to be convicted of a crime of “hosting or storing” or “promoting or facilitating” CSAM. We previously wrote that this new crime might be interpreted as reaching merely passive behavior, again imperiling encrypted services. The bill now specifies that a platform must have “knowledge” of the illegal content in order to be criminally liable, and that it is a defense that the company cannot remove it (such as when it is encrypted content uploaded by a user without the provider’s knowledge). These are improvements, but the question remains why this new crime is needed when it is already a federal crime for anyone to promote or distribute CSAM.

The STOP CSAM Act Still Incentivizes Takedowns of Potentially Lawful Content

The Section 230 Carveout Is Still a Problem

The civil claim against internet intermediaries still includes an exception to Section 230 immunity. As we explained last month, creating a new exception to Section 230 that allows companies to be sued for “facilitating” child sexual exploitation merely based on the provision of a platform that hosts third-party content will harm free speech online. Online services will censor user content and accounts to mitigate the companies’ legal exposure. This will harm all internet users who rely on intermediaries to connect with their communities and the world at large. 

(One drafting point: the bill doesn’t actually amend Section 230 directly. Instead, the bill says, “Nothing in Section 230 shall be construed to impair or limit any claim brought under this section for conduct relating to child exploitation.” Responsible drafting would create an exception for the STOP CSAM civil claim within Section 230 itself. It’s bad public policy to place statutory exceptions in other statutes, making them harder to find and inventory.)

The New Affirmative Defense Will Still Incentivize Overbroad Content Takedowns

In lieu of Section 230 immunity, the civil claim now includes a new defense: an internet intermediary is not liable if the content in question was removed within 72 hours of “obtaining knowledge” that the content was being hosted, stored, or made available on the platform, or if the company provides evidence that it couldn’t remove the content.  

This defense is insufficient to minimize the harm the bill will cause to online speech. Content subject to this notice-and-takedown regime need not be determined by a court to be, in fact, illegal CSAM. And a company could invoke the defense only after a plaintiff has initiated a costly lawsuit. Online services will therefore be incentivized to remove content to avoid being sued (though nothing prevents them from being sued even after they remove the content). Such takedowns will surely occur without regard to whether the content might actually be lawful speech, creating an effective mechanism for internet trolls to censor legitimate content: a classic heckler’s veto.

The bill does include some improvements on the speech front. We previously expressed concern that the STOP CSAM Act required, in two contexts, that online services remove content before it had been adjudicated to actually be illegal CSAM: in conjunction with a report to NCMEC of “apparent” CSAM that the company has actual knowledge of, and pursuant to a complaint submitted to the new Child Online Protection Board. This would have had significant implications for legitimate speech. The Board process, in particular, would have been ripe for abuse by bad actors (similar to the civil claim discussed above), leaving lawful user content exposed to bogus takedown requests. We are pleased that neither pre-adjudication takedown provision appears in the latest version.

TAKE ACTION

TELL CONGRESS NOT TO OUTLAW ENCRYPTED APPS

The Child Online Protection Board Process Is Confusing

Amendments to the Board process have added confusion about whether content may be taken down before Board review. The informal negotiation process remains in the bill, whereby the company may inform the complainant that it does not believe the content is illegal CSAM. If the complainant doesn’t respond within 14 days, the company is off the hook. If the complainant does respond, the Board process moves forward, which can include pre-adjudication removal of content. But the amendments now state that an online service may also respond to a complainant that the company has “determined that [the] visual depiction referenced in the notification does not constitute a proscribed visual depiction relating to a child.” This language appears to have been added to allow the platform to avoid removing the content prior to Board review. But its interaction with the informal negotiation process is confusing. Congress should not leave it to the courts to interpret an unclear interaction between two statutory provisions.

We previously also raised concerns about the gag rule related to Board review, whereby the creator of content that is the subject of a (potentially frivolous) takedown request will not be put on notice of the process until a final determination. The amendments change the non-disclosure period from 180 to 120 days, a slight improvement. The new bill continues to include provisions that allow an online service to challenge a gag order: “The provider may submit an objection to the Board that nondisclosure is contrary to the interests of justice.”   

Finally, we previously flagged that online services that participate in the Board process give up the right to have a takedown request reviewed by a federal court, which implicates due process rights. This provision remains in the latest version.

Congress should not pass this law, which will undermine security and free speech online. Existing law already requires online service providers who have actual knowledge of “apparent” CSAM on their platforms to report that content to the National Center for Missing and Exploited Children (NCMEC), which is essentially a government entity. NCMEC then forwards actionable reports to law enforcement agencies for investigation. And as we’ve said before, Congress and the FTC already have many tools at their disposal to tackle CSAM, some of which go unused.

Please tell Congress not to move forward with this dangerous law unless it undergoes substantial changes. 

TAKE ACTION

TELL CONGRESS NOT TO OUTLAW ENCRYPTED APPS
