A new U.S. Senate bill would require private messaging services, social media companies, and even cloud providers to report their users to the Drug Enforcement Administration (DEA) when they learn of certain illegal drug sales. The result would be a flood of inaccurate reports and messaging services conscripted as government informants.

The bill, named the Cooper Davis Act, is likely to produce a host of inaccurate reports and to sweep up innocent conversations, including discussions about past drug use or treatment. Although the bill explicitly does not require it, it may also give internet companies an incentive to conduct dragnet searches of private messages to find protected speech that is merely indicative of illegal behavior.

Most troubling, this bill is a template for legislators seeking to force internet companies to report their users to law enforcement for other disfavored conduct or speech. This bill aims to cut down on illegal sales of fentanyl, methamphetamine, and counterfeit narcotics. But what would stop the next bill from targeting marijuana, or the sale or purchase of abortion pills, if a new administration deemed those drugs unsafe or illegal for purely political reasons? As we’ve argued many times before, once the framework exists, it can easily be expanded.

The Bill Requires Reporting to the DEA

The bill targets the “unlawful sale or distribution of fentanyl, methamphetamine” and “the unlawful sale, distribution or manufacture of a counterfeit controlled substance.”

Under the bill, providers are required to report to the DEA when they gain actual knowledge of facts about those drug sales, or when a user submits a reasonably believable report about them. Providers are also permitted to report when they have a reasonable belief about such facts, or actual knowledge that a sale is planned or imminent. Importantly, providers can be fined hundreds of thousands of dollars for failing to report.

Providers have discretion over what to include in a report, but they are encouraged to turn over personal information about the users involved, location information, and complete communications. The DEA can then share the reports with other law enforcement agencies.

The bill also makes a “request” that providers preserve the report and other relevant information (so law enforcement can potentially obtain it later). And it prevents providers from telling their users about the preservation unless they first notify the DEA.

We Have Seen This Reporting Scheme Before

The bill is modeled on an existing law that requires similar reporting of child sexual abuse material (CSAM). Lawmakers previously tried, and failed, to use the same reporting scheme to target vaguely defined terror content. This bill would port over some of the same flaws.

Under existing law, providers are required to report actual knowledge of CSAM to the National Center for Missing and Exploited Children, a quasi-governmental entity that forwards some reports on to law enforcement. Companies base some of their reporting on matches found by comparing digital signatures (hashes) of images against a database of previously identified CSAM. Notably, this new bill requires reporting directly to the DEA, and the content at issue (drug sales) is markedly harder and more subjective to identify. While actual CSAM is unprotected by the First Amendment, mere discussion of drug use is protected speech. Given the liability they would face for failing to report, some companies may overreport using content-scanning tools that are known to have high error rates in other contexts.
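
To make concrete why hash-matching is relatively precise while drug-sale detection is not, here is a minimal sketch in Python. The `KNOWN_BAD_DIGESTS` set and the `should_report` function are hypothetical illustrations, and the sketch assumes exact SHA-256 matching; real systems such as Microsoft’s PhotoDNA use perceptual hashes that tolerate re-encoding, but the match-against-a-known-set logic is the same in spirit.

```python
import hashlib

# Hypothetical database of digests of previously identified material.
# Real deployments use curated perceptual hashes (e.g., PhotoDNA),
# not raw SHA-256 values; this placeholder digest matches nothing real.
KNOWN_BAD_DIGESTS = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def should_report(upload: bytes) -> bool:
    """Flag an upload only on an exact match against the known set.

    A match means the file is (effectively) identical to previously
    identified material, so false positives are rare. No comparable
    canonical artifact exists for a free-form conversation about a
    drug sale, which is why detection there falls back to classifiers.
    """
    return hashlib.sha256(upload).hexdigest() in KNOWN_BAD_DIGESTS

if __name__ == "__main__":
    print(should_report(b"an innocuous photo"))  # False: no match
```

The asymmetry is the point: a hash match identifies a specific known file, while deciding whether a private message “is about” a fentanyl sale is a probabilistic judgment, and at the scale of billions of messages even a small error rate yields an enormous number of false reports.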

Despite strong challenges, the existing CSAM reporting law has so far survived Fourth Amendment scrutiny because the government does not explicitly compel providers to search their users’ communications; it requires reporting only if providers choose to search on their own. However, some applications of the existing law have violated the Constitution, specifically when providers file a report without fully examining the material they are reporting. In those cases, courts have deemed law enforcement to have exceeded the scope of the provider’s private search, and that broader government search requires a warrant.

As with this bill, a separate provision of the existing CSAM law requires providers to preserve user content after making a report. But there is growing recognition that this compelled preservation constitutes a Fourth Amendment seizure, one that strips users of the right to delete their own content.

We Should Strengthen the Privacy of User Communications, Not Weaken It

After years of attempts to weaken privacy, lawmakers should focus on strengthening protections for user content. Under the 1986 Electronic Communications Privacy Act (ECPA), providers are generally barred from handing user information over to law enforcement without some form of legal process, whether a warrant, court order, or subpoena. This bill would create yet another carveout.

Rather than carving up ECPA, we need to update and strengthen its decades-old protections. EFF has been making this argument for more than a decade. And states like California have charted a path forward, which we hope they will continue.

More immediately, if lawmakers will not abandon the Cooper Davis Act, they must at least blunt its worst aspects. When considering amendments, lawmakers should:

  • Make the reporting scheme entirely voluntary
  • Require the DEA to delete reports that contain innocent content, and prevent the DEA from targeting individual purchasers based on a report
  • Commission a study and create a sunset date to see if this reporting scheme even serves its stated purpose
  • At minimum, require the government to get a warrant for the lengthy preservation of content associated with a report
  • Make it easier for companies to notify their users about preservation requests, similar to the NDO Fairness Act
