Magistrate’s Ruling Eases Burden for Officials Seeking Forced Disclosure of Encryption Passwords
By Thomas O'Toole, Staff Contributor
One of the more interesting legal questions raised by digital technologies is this: Can the government compel a person to divulge the password to an encrypted computer without violating the Constitution’s protection against compelled self-incrimination?
The question, which rarely arose a decade ago, is appearing with increasing frequency as full disk encryption becomes the default setting on many digital devices.
Law enforcement officials, who often have good reason to believe that evidence of crime is contained on a phone or laptop, frequently find that evidence is tantalizingly just out of reach — clearly apparent in file directories but inscrutably scrambled behind a password known only to the device owner.
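That barrier can be illustrated with a minimal sketch. The code below derives an encryption key from a password using PBKDF2 (a real key-derivation function in Python's standard library) and stands in a toy XOR cipher for real disk encryption; the cipher, salt, and strings are illustrative assumptions only, not how any actual product works, and the XOR scheme is deliberately insecure:

```python
import hashlib

def derive_key(password: str, salt: bytes, length: int = 32) -> bytes:
    # PBKDF2 deliberately makes each password guess expensive to compute.
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000,
                               dklen=length)

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # Toy stand-in for a real cipher: XOR against a repeating key.
    # Applying it twice with the same key recovers the original bytes.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

salt = b"fixed-salt-for-demo"  # hypothetical salt for this sketch
ciphertext = xor_cipher(b"contents of the drive",
                        derive_key("owner-password", salt))

# Without the owner's password, the derived key is wrong and the
# plaintext does not come back.
wrong = xor_cipher(ciphertext, derive_key("police-guess", salt))
right = xor_cipher(ciphertext, derive_key("owner-password", salt))
print(right)  # b'contents of the drive'
```

The point the sketch makes is the one in the prosecutions above: the ciphertext itself may be sitting in plain view, but without the password the derived key, and hence the plaintext, is out of reach.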
Fears of Justice Denied
Encryption technology has been on law enforcement’s mind for some time now. Back in 2014, former FBI Director James B. Comey warned that, if encryption becomes the norm on digital devices, it will seriously hamper the government’s ability to enforce the law. “Justice may be denied, because of a locked phone, or an encrypted hard drive,” he said.
The prospect of “justice denied” was presented in a recent child pornography prosecution in San Francisco. There the police had gathered enough evidence to obtain a warrant to search Ryan Spencer’s home for evidence of child pornography. While executing the warrant, they found a dozen digital devices believed to contain child pornography. Three of them — a mobile phone, a laptop computer and an external storage device — were partially encrypted, thus preventing the police from reviewing their contents.
After attempting unsuccessfully to decrypt the three devices using technological means, the government sought an order from the trial court directing Spencer to divulge the passwords protecting the devices.
Spencer objected, arguing that forcing him to divulge his passwords would violate the Fifth Amendment’s privilege against compelled self-incrimination. According to Spencer, telling the court the passwords would incriminate him because it would implicitly admit that he owned devices containing child pornography.
U.S. Magistrate Judge Jacqueline Scott Corley of the Northern District of California rejected Spencer’s argument on March 20, finding that the police already had sufficient evidence linking him to the devices. The fact that Spencer’s utterance of the passwords indicated a connection to the devices was not constitutionally significant in this case, Judge Corley held, because the police already possessed that information. Spencer’s connection to the devices found in his home, she said, was a “foregone conclusion.”
Judge Corley’s formulation of the “foregone conclusion” doctrine provides law enforcement officials with a feasible path to obtaining orders to compel criminal defendants to divulge encryption passwords. And it is the latest ruling in what appears to be a trend rejecting broad Fifth Amendment protections for encryption passwords.
To the extent that other courts follow her reasoning, Judge Corley’s ruling could also defuse calls from the law enforcement community for Congress to pass laws requiring backdoors and other technological means facilitating government access to electronic devices.
Foregone Conclusion Doctrine in the Digital Age
The foregone conclusion doctrine was developed in the context of government subpoenas for paper documents. In Fisher v. United States, 425 U.S. 391 (1976), the U.S. Supreme Court recognized that the Fifth Amendment’s protection against compelled self-incrimination could be violated if the very act of production “implicitly communicates statements of fact” that the government needs to build its case against a criminal suspect.
Nevertheless, the Fisher court said, the Fifth Amendment was not violated by the compelled production of potentially incriminating tax documents because the government already knew the existence of the documents and the fact that the defendant possessed them. Thus, the act of production did not give the government any evidence it did not already possess.
The “foregone conclusion” doctrine was recently applied to the digital realm in United States v. Apple MacPro Computer, 851 F.3d 238 (3d Cir. 2017), a case involving a Fifth Amendment challenge to a government demand for the password protecting a laptop computer. Because the government already knew that the defendant owned the laptop, an order directing him to divulge the password did not reveal anything incriminating that the government did not already know, the court said.
The court remarked in a footnote that “the foregone conclusion doctrine properly focuses on whether the government already knows the testimony . . . implicit in the act of production.”
Judge Corley took a similar approach. The government’s evidence contained admissions from Spencer that he owned both the iPhone and the laptop found in his residence. In fact, Spencer had unlocked the home screen of the phone before handing it over to the police during the raid. As for the external hard drive, the police offered testimony from a witness who had observed Spencer transferring files from the external hard drive to the laptop.
Judge Corley ruled that, against this factual backdrop, Spencer’s act of divulging his passwords would not give the government any incriminating evidence it did not already have. Spencer’s disclosure of the passwords would not, she said, contravene the Fifth Amendment’s protections against compelled self-incrimination.
Spencer has until April 20 to challenge Judge Corley’s order. The case raises an issue of first impression in the Ninth Circuit.
Ruling May Inform Wider Encryption Policy
Comey’s 2014 fears that digital encryption would hamper law enforcement were borne out in the aftermath of the December 2, 2015, mass shooting in San Bernardino, California. The FBI, unable to crack the password protecting the shooter’s iPhone, obtained a court order directing Apple to develop a software program to unlock the device. Apple resisted, making the argument that government-mandated backdoors to secured devices will make all individuals and institutions that depend on encryption less secure, and would also have the effect of encouraging repressive governments outside the United States to crack down on privacy-enhancing technologies.
The incident prompted the FBI and others to repeat calls for federal legislation mandating that device makers provide technological means for lawful access to digital devices. Even Donald Trump, then a presidential candidate and now president, weighed in on the issue, siding with the law enforcement community.
The standoff ended when the government announced that it had found an alternative means to gain access to the contents of the device.
The issue of encryption’s impact on lawful access lives on, however.
In their article, “Encryption Workarounds,” University of Southern California Gould School of Law Prof. Orin S. Kerr and digital security expert Bruce Schneier point out that law enforcement officials have several ways to gain access to an encrypted digital device — none of which involve compelling a criminal suspect to divulge the password. For example, the police can find or guess the key, or they can exploit a flaw in the encryption software.
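The “guess the key” workaround Kerr and Schneier describe amounts to trying candidate passwords until one verifies. A minimal sketch of a dictionary attack, assuming a hypothetical scheme that stores a salted PBKDF2 hash of the password as its verifier (the wordlist, salt, and stored value are all invented for illustration):

```python
import hashlib

def password_hash(password: str, salt: bytes) -> bytes:
    # Salted PBKDF2 verifier; the iteration count is what makes
    # large-scale guessing slow.
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)

salt = b"demo-salt"
stored = password_hash("letmein", salt)  # hypothetical stored verifier

def dictionary_attack(stored, salt, wordlist):
    # Try each candidate password; return the first one that verifies,
    # or None if the wordlist is exhausted.
    for candidate in wordlist:
        if password_hash(candidate, salt) == stored:
            return candidate
    return None

print(dictionary_attack(stored, salt, ["123456", "password", "letmein"]))
```

Whether such a workaround succeeds in practice depends on the strength of the password and the cost the key-derivation function imposes on each guess, which is precisely why investigators sometimes turn to compulsion orders instead.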
Kerr and Schneier argue that legislation mandating backdoors to encrypted devices should wait until judicial rulings — such as Judge Corley’s — reveal whether encryption is truly harming law enforcement’s ability to do its job. “Until the law of encryption workarounds becomes clear, it is difficult to assess how much encryption will prove a practical barrier to investigations and in what kinds of cases the barriers will be greater or lesser,” they wrote.