In this post we’re going to take a look at the recent Supreme Court ruling on 4th Amendment rights and smartphones, and how this ruling could impact the ongoing debate over NSA spying. We’re also going to look at the other side of the coin: the 5th Amendment right against self-incrimination at a time when encryption tools strong enough to thwart law enforcement and the NSA are becoming increasingly mainstream. Is encryption like a strongbox or a wall safe? You might be surprised by just how important that question has become.
————-
The Supreme Court made an important, and unanimous, ruling recently regarding the legality of law enforcement officers searching someone’s smartphone during an arrest. The ruling: warrants are required. The reasoning: smartphones contain so much information about people’s lives that you can potentially learn more about an individual by searching their smartphone than you would by searching their house:
Los Angeles Times
Supreme Court ruling affirms the astonishing power of smartphones
Robin Abcarian
June 25, 2014, 2:34 PM
Wednesday’s unanimous Supreme Court ruling – that officers must obtain warrants in order to search cellphones obtained during the course of arrests – shows the justices’ profound understanding of the way these ubiquitous little devices have practically become appendages of the human body.
Chief Justice John G. Roberts even got a little carried away with that metaphor when he wrote in his entertaining opinion that modern cellphones “are now such a pervasive and insistent part of daily life that the proverbial visitor from Mars might conclude they were an important feature of human anatomy.”
Giving police the ability to search a cellphone without a warrant, the court said, is as offensive as the intrusions that led to the birth of this country and the creation of its Constitution.
The 4th Amendment, with its protection against unreasonable searches, Roberts said, “was the founding generation’s response to the reviled ‘general warrants’ and ‘writs of assistance’ of the colonial era, which allowed British officers to rummage through homes in an unrestrained search for evidence of criminal activity. Opposition to such searches was in fact one of the driving forces behind the Revolution itself.”
As the chief justice noted, today’s smartphones are not “just another technological convenience.” They are indispensable repositories for exceedingly private details about an individual’s life.
(How indispensable? He cited one poll in which 3/4 of phone owners said they were never more than five feet away from their devices, while 12% admitted bringing their phones into the shower with them. That is an image I could have done without.)
You can actually learn more about a person by examining their phone, Roberts said, than you can in “the most exhaustive search” of a house.
“A phone not only contains in digital form many sensitive records previously found in the home; it also contains a broad array of private information never found in a home in any form,” he wrote — unless a smartphone is also found in the home.
Giving police officers access to a person’s apps — Roberts said the average user has 33 — gives them the ability to create “a revealing montage” of a subject’s life.
...
The court recognized that its ruling may impose a burden on law enforcement officers at the time of an arrest. But, as Roberts pointed out, technological advances cut both ways.
In some jurisdictions, he said, police officers can email warrant requests to judges’ iPads, and judges, for their part, have been known to sign warrants and email them back to officers in less than 15 minutes.
Not surprisingly, the ruling has prompted a great deal of speculation over what it could mean for pending lawsuits against the NSA. But if you were expecting that this ruling suggests the Supreme Court is poised to rule against, say, the NSA’s collection of metadata, you might be disappointed:
Politico
SCOTUS cellphone ruling resonates in NSA fight
By JOSH GERSTEIN | 6/25/14 8:15 PM EDT
The Supreme Court’s blunt and unequivocal decision Wednesday giving Americans strong protection against arrest-related searches of their cell phones could also give a boost to lawsuits challenging the National Security Agency’s vast collection of phone call data.
Chief Justice John Roberts’s 28-page paean to digital privacy was like music to the ears of critics of the NSA’s metadata program, which sweeps up details on billions of calls and searches them for possible links to terrorist plots.
“This is a remarkably strong affirmation of privacy rights in a digital age,” said Marc Rotenberg of the Electronic Privacy Information Center. “The court found that digital data is different and that has constitutional significance, particularly in the realm of [the] Fourth Amendment…I think it also signals the end of the NSA program.”
...
For the NSA debate, the most significant idea in the court’s Wednesday opinion may be the notion that scale matters. Roberts and his colleagues soundly rejected arguments from the Obama administration that because police can search a few printed photographs found in someone’s wallet, officers were free to search thousands of images and the troves of other personal data contained on a typical smartphone.
...
“It’s very important that the court is recognizing that quantity matters,” said Georgia Tech professor Peter Swire, a privacy expert and member of a panel President Barack Obama set up to review the NSA’s call metadata program. “The court has said that quantity matters when it comes to the content of cell phones. And I believe the court will feel the same way when it comes to massive databases of telephone calls or computer communications.”
A former cybercrime prosecutor said the justices also seemed to recognize that scale of the collection not only gives the government more data, but also the ability to be much more intrusive than in earlier eras.
“The distinction here is more than just the capacity of the device to hold pictures,” said Alex Southwell, now with law firm Gibson, Dunn & Crutcher. “A cell phone is orders of magnitude different, not just in terms of numbers of items held but also in terms of the intrusiveness if searched. The mosaic of information available from seeing the whole of the data is transformative, just like the call records at issue in the NSA program.”
The Supreme Court’s ruling Wednesday in Riley v. California doesn’t say anything explicitly about the NSA’s metadata, nor did the justices mention national security concerns or intelligence gathering.
However, in one somewhat opaque footnote to Roberts’s majority opinion, the justices seem to be saying they are leaving the issue of bulk collection of data for another day. “These cases do not implicate the question whether [sic] the collection or inspection of aggregated digital information amounts to a search under other circumstances,” Roberts wrote.
Even if the justices were to deem the NSA program a warrantless search that goes well beyond tracing calls made on a specific phone line, that wouldn’t mean the terrorism-focused effort is unconstitutional. Instead, the court would have to consider whether the search is reasonable in light of the national security and public safety concerns involved — and justices are often extraordinarily deferential to such arguments.
...
Analysts on both sides said the cell phone ruling is not a one-off, but seems to be part of a pattern of the court’s efforts to square privacy rights with the new challenges posed by emerging technology. Two years ago, in U.S. v. Jones, the justices rejected arguments that GPS tracking should not require a warrant because police have always been free to follow suspects around without getting one.
“What’s significant…is the justices, like the rest of us, are fully alive to the fact that technology is generating large quantities of data about us and putting it in places where it didn’t used to be,” Baker said.
President Barack Obama initially dismissed the privacy impact of the metadata program as “modest,” but in recent months he has acknowledged that it is troubling to many Americans. Earlier this year, he proposed shutting down the NSA program and replacing it with one in which telephone companies store the call information and make it readily available for the government to search. The president also implemented a procedure in which a judge approves most queries in advance, but the standard is lower than that for a search warrant.
The Obama administration has made much of safeguards it has imposed on the NSA program. However, the court’s cell phone search opinion suggests the justices might not find such self-regulation sufficient to address privacy concerns.
“The Government proposes that law enforcement agencies ‘develop protocols to address’ concerns raised by cloud computing,” the chief justice wrote. “Probably a good idea, but the Founders did not fight a revolution to gain the right to government agency protocols.”
...
As the article indicates, while it’s unclear how directly this Supreme Court ruling could impact rulings on bulk metadata collection, observers on all sides agree that the cell phone ruling “is not a one-off, but seems to be part of a pattern of the court’s efforts to square privacy rights with the new challenges posed by emerging technology”. And that’s good news because, at the end of the day, the only real solution to these increasingly difficult issues of balancing privacy and security in an ever-changing technological landscape is a never-ending cycle of court cases, legislation, and lots and lots of people taking the time to really think through the implications of how we progress through the Information Age.
But, as the article also highlights, it’s unclear from this ruling which way the court is leaning on the issue of bulk metadata collection. As Chief Justice Roberts put it, “these cases do not implicate the question whether [sic] the collection or inspection of aggregated digital information amounts to a search under other circumstances,” while also asserting that “the Government proposes that law enforcement agencies ‘develop protocols to address’ concerns raised by cloud computing...Probably a good idea, but the Founders did not fight a revolution to gain the right to government agency protocols.” What Roberts appears to be alluding to is that issues like this can’t be handled by self-regulation and protocols alone, which suggests he is of the opinion that balancing privacy and security (in an age when cell phones might hold more personal information about you than the contents of your home) is probably going to require both policy solutions and technological solutions. And he’s quite right. When technology creates new legal conundrums, a look at changing the technology, or changing how it’s used, is clearly part of the solution.
What Would Snowden and the Cypherpunks Say?
But, of course, it’s also worth pointing out that simply saying “we need policy solutions and technology solutions” is much easier said than done. For instance, take the “policy + technology” solutions that Edward Snowden has consistently recommended to global audiences. As Snowden puts it, we need policy solutions, but we also need technology solutions like unbreakable end-to-end encryption and the use of systems like TOR to ensure that bulk data collection becomes impossible:
The Inquirer
Edward Snowden wants easy to use encryption everywhere
Community must do more
By Dave Neal
Mon Mar 10 2014, 18:0
SURVEILLANCE WHISTLEBLOWER Edward Snowden has taken part in a video conversation at the South By Southwest (SXSW) conference and called for more accessible encryption tools.
The subject of the conversation, which was hosted by the American Civil Liberties Union, was whether communications are secure and if they can be trusted. They can, said Snowden, but only with some third party help and the use of end to end, machine to machine encryption.
The use of strong encryption is key and the panel agreed that Snowden’s revelations have improved the security landscape. The whistleblower said that technology companies need to help make encryption more accessible and less complex. “Encryption does work,” he said, calling it “the defence against the dark arts for the digital realm.”
Snowden said that the US National Security Agency (NSA) has created an “adversarial internet”. He added that while policy changes are needed, technological changes will be the most effective.
“[We must] craft solutions that are safe”, he said. “End to end encryption makes bulk surveillance impossible. There is more oversight, and they won’t be able to pitch exploits at every computer in the world without getting caught.”
...
As Snowden said, “End to end encryption makes bulk surveillance impossible. There is more oversight, and they won’t be able to pitch exploits at every computer in the world without getting caught.” So, if Snowden is correct, we can simply develop easy-to-use unbreakable encryption technology and bulk surveillance will be made impossible and therefore all surveillance will be forced to shift towards targeted surveillance where “there is more oversight”. No more bulk surveillance but still room for targeted surveillance. Problem solved, right?
Well, if the elimination of bulk data collection is something that society wants to prioritize then, yes, strong end-to-end encryption and the use of tools like TOR (because strong encryption still won’t actually hide all the metadata, you’d need something like TOR) would indeed force surveillance to become much more targeted. Assuming a spywarepocalypse doesn’t take place.
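That metadata caveat is worth making concrete. Here’s a toy Python sketch (a throwaway XOR pad standing in for a real cipher, with invented addresses, purely for illustration): even when the message body is encrypted end to end, the routing envelope, meaning who talked to whom and how much, stays readable to anyone watching the wire:

```python
import secrets

def xor_encrypt(key: bytes, plaintext: bytes) -> bytes:
    # Toy one-time-pad XOR; real systems use vetted ciphers (e.g. AES-GCM).
    return bytes(k ^ p for k, p in zip(key, plaintext))

key = secrets.token_bytes(32)   # shared only by sender and recipient
msg = b"meet at noon"

packet = {
    "from": "alice@example.com",            # metadata: visible to the network
    "to": "bob@example.com",                # metadata: visible to the network
    "length": len(msg),                     # metadata: visible to the network
    "body": xor_encrypt(key[:len(msg)], msg),  # content: unreadable without key
}

# Only the holder of `key` can recover the body; XOR is its own inverse.
assert xor_encrypt(key[:len(msg)], packet["body"]) == msg
```

The encryption protects the body field, but the from/to/length fields are exactly the kind of metadata that bulk collection targets, which is why Snowden pairs strong encryption with traffic-hiding tools like TOR.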
But what about the targeted surveillance that Snowden claims to support? Will that still be possible once strong end-to-end encryption tools are made widely available? Well, here’s where it gets messy in ways that Snowden and the Cypherpunks don’t like to talk about, and in ways that relate to the Supreme Court’s recent cellphone ruling: once you have easy-to-use strong encryption tools that make communications unbreakable, it probably won’t take long before similar tools (or the very same tools) are also used to strongly encrypt the local files on your computer. That means that when there’s a legitimate law enforcement or national security need to view the contents of someone’s computer or smartphone, a warrant won’t be enough. The person under investigation is simply going to have to decrypt the files or hand over a password under threat of contempt of court. And when law enforcement has to rely on the person being investigated to provide access to incriminating evidence, we might be seeing a lot more 5th Amendment stories like this:
ExtremeTech
US Appeals court upholds Fifth Amendment right to not decrypt hard drives
By Joel Hruska on February 24, 2012 at 1:31 pm
The 11th Circuit Appeals Court has issued an important ruling on the question of whether or not a defendant can be forced to decrypt a hard drive when its contents could provide additional incriminating evidence. The case in question refers to the actions of a John Doe who was compelled to testify before a grand jury in exchange for immunity from prosecution. Doe was ordered to decrypt the contents of his laptop as part of that testimony, but was told that his immunity would not extend to the derivative use of such material as evidence against him. Doe refused to decrypt the TrueCrypt-locked drives, claiming that to do so would violate his Fifth Amendment right against self-incrimination.
...
Note that this case involves the use of TrueCrypt, one of the tools Snowden used to encrypt his NSA documents and has strongly advocated (before the project mysteriously shut down about a week before the Heartbleed revelations). Not only can TrueCrypt encrypt data in ways that the NSA can’t break, but it also allows you to create hidden volumes within your encrypted volumes, so if you are asked to hand over the password you can simply give the “fake” top-layer password that only decrypts the non-hidden folders.
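For a rough sense of how a hidden volume achieves that deniability, here’s a minimal Python sketch (a toy hash-based stream cipher, not TrueCrypt’s actual AES/XTS design; passwords and contents are invented): one container, two passwords, and without the hidden password the second region looks like ordinary free space:

```python
import hashlib

def keystream(password: str, n: int) -> bytes:
    # Toy key derivation: hash the password with a counter until we have n bytes.
    # Real designs use a slow KDF (PBKDF2/scrypt) and a proper block cipher mode.
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(password.encode() + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:n]

def xor(data: bytes, ks: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(data, ks))

decoy  = b"holiday photos".ljust(32)   # what the "fake" password reveals
secret = b"real documents".ljust(32)   # what the hidden password reveals

# One opaque container: outer region + hidden region, each under its own key.
container = (xor(decoy, keystream("outer-pass", 32))
             + xor(secret, keystream("hidden-pass", 32)))

# Handing over "outer-pass" decrypts only the decoy; the rest stays noise.
assert xor(container[:32], keystream("outer-pass", 32)) == decoy
assert xor(container[32:], keystream("hidden-pass", 32)) == secret
```

The legal punchline is that nothing about the container proves a second region even exists, which is exactly what makes the “foregone conclusion” analysis discussed below so difficult.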
Continuing...
...
The 11th Circuit’s ruling reverses the lower court’s decision to hold Doe in contempt and affirms that forcing him to decrypt the drives would be unlawful. It also states that the district court erred in limiting the immunity it granted Doe to only apply to grand jury testimony and not the derivative use of the evidence in question. The ruling on misapplied immunity means that the 11th Circuit could’ve punted on the Fifth Amendment issue, but the court opted not to do so.
The applicability of the Fifth Amendment rests on the question of what the government knew and how it knew it. Federal prosecutors admitted at trial that while the amount of storage encrypted exceeded 5TB, there was no way to determine what data was on the hard drive — indeed, if there was any data whatsoever. Plaintiffs were reduced to holding up numerical printouts of encryption code that they said “represented” the data they wanted, but were forced to admit that there was no way to differentiate what might be illegal material vs. legal.
The question at hand is whether or not decrypting the contents of a laptop drive is testimony or simply the transfer of existent information. The court acknowledges that the drive’s files are not testimony of themselves, but writes “What is at issue is whether the act of production may have some testimonial quality sufficient to trigger Fifth Amendment protection when the production explicitly or implicitly conveys some statement of fact.” (emphasis original)
Previous court cases have established that merely compelling a physical act, such as requiring a defendant to provide the key to a safe, is not testimonial. Actions are also non-testimonial if the government can invoke the “foregone conclusion” doctrine by showing with “reasonable particularity” that it already knew that certain materials or content existed.
By decrypting the drives, Doe is admitting “his knowledge of the existence and location of potentially incriminating files; of his possession, control, and access to the encrypted portions of the drives; and of his capability to decrypt the files.” The court dismisses the argument that the contents of Doe’s hard drives are a foregone conclusion, noting that “Nothing… reveals that the Government knew whether any files exist or the location of those files on the hard drives; what’s more, nothing in the record illustrates that the Government knew with reasonable particularity that Doe was even capable of accessing the encrypted portions of the drives.”
“The Government has not shown, however, that the drives actually contain any files, nor has it shown which of the estimated twenty million files the drives are capable of holding may prove useful… we are not persuaded by the suggestion that simply because the devices were encrypted necessarily means that Doe was trying to hide something. Just as a vault is capable of storing mountains of incriminating documents, that alone does not mean that it contains incriminating documents, or anything at all.”
Not exactly carte blanche
The strength of this decision is the balance it strikes between the rights of the government and the individual. Rather than focusing on the nature of the pass phrase defendants are ordered to provide, it emphasizes the issue of what the prosecution knows and how it learned it. If the prosecutors had had sufficient data to indicate that illegal materials were stored on Doe’s hard drives, forcing him to testify would’ve been valid under the foregone conclusion principle.
...
This decision doesn’t make it impossible for the government to use the contents of an encrypted drive, but it requires that the prosecution demonstrate a knowledge of the contents and data contained therein before being allowed to issue a blanket demand. It’s a fair call, and given the increasing number of similar cases, an important one.
There’s a lot to digest there. OK, so it appears that “John Doe” was staying in a hotel room with an internet IP address that was caught accessing child porn over YouTube. But it wasn’t the only hotel room with that IP address, so the activity couldn’t be specifically tied to his computer. The prosecutors offered him immunity for his testimony if he decrypted the TrueCrypt-encrypted files on his computer, but they didn’t offer him immunity for the “derivative use of such material as evidence against him”. So Doe refused to decrypt the drive, citing the 5th Amendment right against self-incrimination. And the 11th Circuit Appeals Court argued that:
...
By decrypting the drives, Doe is admitting “his knowledge of the existence and location of potentially incriminating files; of his possession, control, and access to the encrypted portions of the drives; and of his capability to decrypt the files.” The court dismisses the argument that the contents of Doe’s hard drives are a foregone conclusion, noting that “Nothing… reveals that the Government knew whether any files exist or the location of those files on the hard drives; what’s more, nothing in the record illustrates that the Government knew with reasonable particularity that Doe was even capable of accessing the encrypted portions of the drives.”
...
In other words, the 11th Circuit Appeals Court ruled that providing the decryption key is basically testimony that says “yes, I have access to those files,” and thus constitutes self-incriminating testimony when the government couldn’t actually show it knew any incriminating evidence was on the drive (since multiple hotel rooms shared the same IP address). If this seems like a stretch, keep in mind that it’s entirely possible for someone to possess a computer or smartphone that contains encrypted files that someone else put there and controls.
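Part of what makes the government’s position so hard is that well-encrypted data is statistically indistinguishable from random noise, so investigators often can’t even tell whether a drive holds files at all. A quick Python sketch (using a toy hash-based keystream as a stand-in for a real cipher; the “key” and plaintext are invented) illustrates the point by measuring byte entropy:

```python
import math
import hashlib
import secrets
from collections import Counter

def entropy_bits_per_byte(data: bytes) -> float:
    # Shannon entropy of the byte distribution; uniform noise approaches 8.0.
    counts = Counter(data)
    n = len(data)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

# 4 KB of genuinely random bytes (like unused space on an encrypted volume).
random_blob = secrets.token_bytes(4096)

# 4 KB of highly structured plaintext, then "encrypted" with a toy keystream.
plaintext = (b"quarterly report " * 256)[:4096]
ks = b"".join(hashlib.sha256(b"key" + i.to_bytes(4, "big")).digest()
              for i in range(128))  # 128 * 32 = 4096 bytes of keystream
ciphertext = bytes(a ^ b for a, b in zip(plaintext, ks))

# The plaintext has low entropy, but the ciphertext and the random blob
# both sit near the 8 bits-per-byte maximum and look alike to an examiner.
low  = entropy_bits_per_byte(plaintext)
high = (entropy_bits_per_byte(ciphertext), entropy_bits_per_byte(random_blob))
```

This is the technical backdrop to the court’s observation that “empty space and recorded data appear generally the same before decryption”: from the outside, a drive full of evidence and a drive full of nothing are the same wall of noise.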
Is Encryption Like a Strongbox or a Wall Safe? Who Cares? The Courts
Also keep in mind that the Supreme Court has yet to rule on this case or similar cases, so a very big Supreme Court ruling on forced decryption is just a matter of time:
DuqCrim.com
Criminal Justice Program of Duquesne University School of Law
The catch 22 of forced decryption
Posted by Frank Spinelli on May 7, 2014 at 7:14 AM
Should forced decryption of a hard drive be prohibited under the Fifth Amendment?
Some background: In cryptography, encryption is the process of encoding messages or information in such a way that only authorized parties can read it. Encryption has been around for a very long time, and has historically been used frequently during wartime.
...
Meanwhile, the Fifth Amendment states that no person “shall be compelled in any criminal case to be a witness against himself.” The Fifth Amendment is designed to prevent the accused from being forced to divulge incriminating evidence from within his or her own mind, to be used against himself or herself. A person may invoke the Fifth Amendment once three factors have been established: compulsion, a testimonial communication or act, and incrimination. The law also requires that the information sought still retain testimonial value, and consequently be worth being constitutionally protected. The information sought cannot already be a foregone conclusion, which the Government already concretely knows, or has proven exists by independent means.
Compulsion and incrimination are relatively straightforward where an accused is asked by a court to decrypt a hard drive.
The court is compelling the accused to divulge the encrypted contents in one of two ways: first, by providing the password required to decrypt the information, enabling authorities to do the same; or second, by producing the information sought in a decrypted and intelligible form.
Incrimination merely refers to the fact that the information sought to be gained, and compelled to be revealed by the accused, is in fact incriminating.
The issue that is currently undecided is whether or not the act of production, or enabling the decryption, is testimonial, and whether or not the testimonial status extends beyond the act of decrypting to the actual contents revealed, or decrypted.
The Supreme Court has yet to rule on this issue. The highest court to rule on it has provided some interesting insight. The Eleventh Circuit has held that an accused may not be forced to decrypt the files on an encrypted hard drive, due to the nature of encryption.
The court explained that whether an act is testimonial, and is covered by the protections of the Fifth Amendment, or is merely a compelled physical act, which remains unprotected by the Fifth Amendment, can best be analogized to the difference between a strongbox and a wall safe. The court relied on previous Supreme Court decisions concerning the Fifth Amendment, pointing out that the forced production of a physical key to a strongbox would not generally be considered a testimonial act. The forced production of the combination to a wall safe, on the other hand, would be considered a protected testimonial communication or act, as it requires an accused to reveal a truth from within his or her mind, the revelation of which would lead to the production of incriminating evidence from within the wall safe, or at least support a link in the chain of evidence strengthening the case against the accused. That is something the Fifth Amendment was specifically added to the Bill of Rights to protect against.
For example, with regard to the previously mentioned historical events, an accused person would hypothetically be unable to invoke the Fifth Amendment in a case where a court issued a subpoena forcing the production of an Enigma machine to decrypt a file. This would be analogous to the physical key in the strongbox analogy, because producing the Enigma machine requires only a physical act. However, if a court issued a subpoena forcing an accused person, fluent in Navajo and English, to reveal the contents of a file written in Navajo, it would likely be considered a testimonial act, protected under the invocation of the Fifth Amendment. The second subpoena requires the accused to reveal encoded information by utilizing a mental skill, and essentially compels the production of incriminating evidence from within his or her mind.
Furthermore, because of the nature of encryption, the “foregone conclusion” doctrine is generally inapplicable to the information sought, unless it is corroborated by other evidence or by non-encrypted data on the drive. This is simply because, as the court pointed out, until a hard drive is decrypted it is usually extremely difficult to tell what type of file, or files, if any, are stored on it. Consequently, the contents are generally not a “foregone conclusion,” since it is difficult to tell whether an encrypted hard drive contains zero data or is filled completely with encrypted data, as empty space and recorded data appear generally the same before decryption. The court therefore reasoned that the decrypted information should also be protected: not just the act of producing the password, but the decrypted data as well.
Consequently, a broader grant of immunity would have to be granted, one which extended to the data eventually decrypted, not just the act of production, before a court may compel an accused to decrypt data.
The issue remains unclear for now in the other circuits, and most states, until the Supreme Court hears a case concerning this issue, and rules decisively on it.
...
“The issue remains unclear for now in the other circuits, and most states, until the Supreme Court hears a case concerning this issue, and rules decisively on it.” Yep, the issue does remain unclear. But if the Supreme Court is poised to issue a series of rulings on privacy-related issues, it seems pretty likely that we’re going to see a ruling on forced decryption pretty soon, because the growth in both the number and popularity of encryption tools means 5th Amendment fights over forced decryption are only going to become more frequent. And that means the “strongbox vs. wall safe” debate is going to become quite a hot topic because, as groups like the Cypherpunk-leaning Electronic Frontier Foundation (EFF) and the ACLU argued last October, if you’re ever forced to decrypt your data it is clearly a “wall safe” and not a “strongbox” scenario, and therefore you should get blanket immunity for anything found:
Threatpost
EFF Makes Case That Fifth Amendment Protects Against Compelled Decryption
by Michael Mimoso
October 31, 2013, 2:08 pm
With new leaks about the extent of U.S. government surveillance coming almost daily, one constant remains among all the deterrents to the NSA’s prying eyes: encryption technology works. As far as we know, the math behind encryption is solid, despite the specter of some unnamed breakthrough made by the spy agency some years ago.
...
Tangentially, the government continues to try to make a case for the ability to force someone alleged to have committed a crime to decrypt their hard drives and turn over evidence. On a number of previous occasions, the courts have upheld Fifth Amendment protections against self-incrimination in such cases.
A case starting on Monday in Massachusetts Supreme Judicial Court is an appeal of a previous decision involving Leon Gelfgatt, 49, of Marblehead, Mass., an attorney who was indicted in a mortgage fraud scam in which he is alleged to have stolen more than $1.3 million. The government, in trying to make its case against Gelfgatt, tried to compel him to decrypt his hard drive. The judge in the case, however, denied the request, saying that such an action would violate the Fifth Amendment.
Digital advocacy group the Electronic Frontier Foundation, along with the American Civil Liberties Union, filed an amicus brief yesterday explaining the Fifth Amendment privilege against self-incrimination prohibits compelled decryption. Hanni Fakhoury, staff attorney with the EFF, wrote in a blogpost that the Fifth Amendment protects an individual from unveiling the “contents of his mind” and that the government through this action would be learning new facts in the case beyond the encryption key.
“By forcing Gelfgatt to translate the encrypted data it cannot read into a readable format, it would be learning what the unencrypted data was (and whether any data existed),” Fakhoury wrote. “Plus, the government would learn perhaps the most crucial of facts: that Gelfgatt had access to and dominion and control of files on the devices.”
The government’s argument is that the decryption is akin to providing the combination to unlock a safe, rather than compelling the production of decrypted files.
“That assertion is incorrect,” the brief says. “Just as encrypting a drive encrypts each and every one of its files, decrypting the drive makes available copies of all of its files.” The contention is that because the data is transformed and scrambled, decryption is more than a key, safe combination or password, the brief said.
...
“In the surveillance environment, the need for encryption is especially strong because it often seems that strong technology is our last refuge from the government’s prying eyes,” Fakhoury said. “We’ve seen in all the leaks the government’s effort to undermine web encryption and so we must make sure they can’t undermine the physical device encryption here.”
So in this case involving $1.3 million stolen through mortgage fraud, the government tried to compel the defendant to decrypt his data by arguing that decryption is analogous to handing over the key to a strongbox. But the EFF and ACLU assert the opposite: that decryption is an act of revealing a piece of your inner mind and therefore protected by the 5th Amendment. So when the Supreme Court eventually rules on this topic, THAT’s one of the key legal distinctions it’s going to have to resolve: is encryption like a strongbox or a wall safe? Welcome to the fun world of unbreakable encryption and legal rights.
The Massachusetts Supreme Judicial Court ruled on that $1.3 million mortgage fraud case just days ago. In that instance, the court found, the government could compel decryption. Why? Well, basically because the person under investigation told the police that he could indeed decrypt the data, but wouldn’t. So, in this case, court-ordered forced decryption was deemed constitutional. But that’s just for Massachusetts. Until the US Supreme Court rules on this topic, the constitutionality of forced decryption will depend not only on your legal circumstances, but also on your locale:
Ars Technica
Massachusetts high court orders suspect to decrypt his computers
Suspect told cops: “Everything is encrypted and no one is going to get to it.”
by Cyrus Farivar — June 25 2014, 7:00pm CST
Massachusetts’ top court ruled, in a 5–2 decision on Wednesday, that a criminal suspect can be ordered to decrypt his seized computer.
The Massachusetts Supreme Judicial Court (MSJC) ruling only applies to the state. Various other courts at the state and federal level have disagreed as to whether being forced to type in a decryption password is a violation of the Fifth Amendment right against self-incrimination and its state equivalents (such as Article Twelve of the Massachusetts Declaration of Rights). For example, more than two years ago, the 11th Circuit Court of Appeals ruled that a defendant was not obliged to decrypt his hard drive, as doing so would violate his Fifth Amendment rights. However, that ruling only took effect in the 11th Circuit, which covers parts of the southeastern United States. Just last year, a federal judge refused to force a Wisconsin child pornography suspect to decrypt his laptop. Overall, cases involving decryption are still relatively new and rare. The first known one only dates back to 2007.
Privacy advocates lamented the MSJC’s new ruling, disagreeing with the court’s judgment that an exception to the Fifth Amendment rule, such as a “foregone conclusion,” applies here.
“The defendant is only telling the government what it already knows”
In this case, the defendant, Leon Gelfgatt, had told authorities that he was able to decrypt his computers but would not do so.
As the MSJC ruled:
During his postarrest interview with State police Trooper Patrick M. Johnson, the defendant stated that he had performed real estate work for Baylor Holdings, which he understood to be a financial services company. He explained that his communications with this company, which purportedly was owned by Russian individuals, were highly encrypted because, according to the defendant, “[that] is how Russians do business.” The defendant informed Trooper Johnson that he had more than one computer at his home, that the program for communicating with Baylor Holdings was installed on a laptop, and that “[e]verything is encrypted and no one is going to get to it.” The defendant acknowledged that he was able to perform decryption. Further, and most significantly, the defendant said that because of encryption, the police were “not going to get to any of [his] computers,” thereby implying that all of them were encrypted.
When considering the entirety of the defendant’s interview with Trooper Johnson, it is apparent that the defendant was engaged in real estate transactions involving Baylor Holdings, that he used his computers to allegedly communicate with its purported owners, that the information on all of his computers pertaining to these transactions was encrypted, and that he had the ability to decrypt the files and documents. The facts that would be conveyed by the defendant through his act of decryption—his ownership and control of the computers and their contents, knowledge of the fact of encryption, and knowledge of the encryption key—already are known to the government and, thus, are a “foregone conclusion.” The Commonwealth’s motion to compel decryption does not violate the defendant’s rights under the Fifth Amendment because the defendant is only telling the government what it already knows.
A step back for privacy
Because Gelfgatt already admitted to police that he owned and controlled the seized computers and had the ability to decrypt them, the court found that the act of decryption would not reveal anything new to the police. Therefore, the act of compelled decryption was not “testimonial.” Normally, the Fifth Amendment privilege prevents the government from forcing a witness to disclose incriminating information in his mind (like a password not written down anywhere else)—but only if that is information the police do not already know.
Jessie Rossman, an attorney with the American Civil Liberties Union of Massachusetts, told Ars that her organization is “disappointed in the decision.”
“For example, an individual can be forced to hand over a key to a locked safe if the government already knows that’s your safe—the documents in there have already been created,” she said.
“Your opening that safe, the documents are already there. That’s not new testimonial. But encrypted data needs to be transformed into something new when decrypted. A number of encrypted technology works such that when you look at [a hard drive] you can’t even tell what is empty space or what is not empty space. When you decrypt that computer it’s creating something new and if you didn’t have any knowledge, the act of decrypting tells you something you didn’t know beforehand. We believe that the Fifth Amendment and Article 12 needs to protect not only the act of entering a code but the act of producing decrypted files to the government.”
...
Fred Cate, a law professor at Indiana University, told Ars that this ruling could come with an unfortunate consequence. If someone admits to owning a computer and asserts that they possess the password, “its only likely effect is to encourage future defendants to be less forthcoming with police.”
“This seems to be an issue likely to head to the Supreme Court where, despite today’s sweeping 9–0 victory for privacy involving searches of cellphones, the outcome is not at all certain,” he added. “Historically, the high court has taken a dim view of efforts to expand the Fifth Amendment privilege against self-incrimination or to apply it in novel ways. In the meantime, we should expect to see both federal and state courts continuing to reach divergent results when faced with this important question.”
As suggested at the end, “this seems to be an issue likely to head to the Supreme Court where, despite today’s sweeping 9–0 victory for privacy involving searches of cellphones, the outcome is not at all certain.” Should that uncertainty be surprising? Well, we aren’t just looking at the emergence of a new technological phenomenon (pocket-sized computers) requiring a review of 4th Amendment rights. We’re really looking at the intersection of two intertwined technologies. Until the last decade or so, you didn’t have people carrying around a home’s worth of personally revealing (and potentially incriminating) information in their pockets. And yet, as the article points out, pre-2007 we didn’t really see cases involving court-forced decryption, which is to be expected since strong encryption is notoriously non-user-friendly. And the Supreme Court’s recent ruling on the 4th Amendment didn’t really address the issue of forced decryption at all, so yes, quite a bit of uncertainty should probably be expected in this area.
At the same time, notice the overwhelmingly negative responses to this Massachusetts Supreme Judicial Court ruling by groups like the ACLU and EFF, even when the defendant basically tells the police that, yes, the encrypted drives are his and, yes, he can decrypt them. So one thing we can probably be pretty sure of is that this issue is going to be contentious for a long, long time, and the debate over forced decryption is only going to grow. In situations like this, where there isn’t a clear ‘right’ and ‘wrong’ but instead a difficult balancing of priorities, a drawn-out fight is pretty much guaranteed.
So get ready for more Supreme Court rulings on these topics. But also get ready for more confusing debates over “what did the government know and when did they know it” and a far more detailed examination of the distinctions between strongboxes and wall safes than you ever expected to endure. Is decryption “an act of production” warranting 5th Amendment protections or just “a physical act”? We’ll find out!
But the fact that these strangely nuanced legal distinctions have to be made in the first place is actually a great example of the system working. Life is complex and the law should reflect that complexity. And as technology progresses those complexities are only going to grow, so this is the kind of legal morass that we should be somewhat pleased to see emerging. That legal morass is a reflection of a reality morass and it has to be tackled. Tackled over and over as technology changes. But that legal morass is also a strong reminder that the interplay of privacy, security, and ever-changing technology is far more complex than the version of reality presented by Edward Snowden and his allies like the EFF.
Many of the accolades given to the Supreme Court’s recent ruling are about how it formalized a recognition that the scale of technology can qualitatively change its nature and necessitate a legal rebalancing of privacy and security. The simple cellphones of yesteryear are quite different from the smartphones of today. As the Supreme Court put it, searching someone’s cellphone might be more informative than searching their home. That’s an important recognition, because if technology suddenly allows us all to walk around with a home’s worth of personal information in our pockets, we probably don’t want to allow full access to it whenever someone is simply under arrest. But as we saw with tools like TrueCrypt, if our smartphones are homes, they’re increasingly homes that cannot be entered at all by law enforcement without the permission of the homeowner, regardless of circumstance, because entry will be mathematically impossible (and maybe physically impossible someday).
If a court issues a warrant to allow a search of your home, someone is going to search your home whether you want to let them in or not. Physically impenetrable homes aren’t possible. But impenetrable smartphones, via encryption, are now being aggressively developed and promoted (by Germany) in the post-Snowden era for use by the masses (although they’ll still presumably be hackable by the BND or whichever government sponsors them).
Sure, you can still be sent to jail for contempt of court if you refuse to comply with a valid court order to decrypt, but that just means jail time for contempt of court could suddenly become a much more appealing legal option in a growing number of cases for people facing far more serious charges. And don’t forget that people can be assigned the role of data mule or data ‘fall guy’ in a larger criminal organization. That might be a lot easier to do going forward. We should still prioritize protecting our 4th Amendment rights, but we should also recognize the real new costs that arise in protecting them as we’re forced to adapt those legal protections to a changing technological landscape. Strong encryption is an incredibly useful tool, for good or ill. And that means strong encryption is going to create new costs in protecting those rights at the same time that it’s being used in helpful ways. It is what it is.
Beware of Libertarians Bearing Non-Solutions
So let’s be relieved that the Supreme Court is intent on tackling the increasingly complex issues surrounding privacy, security, and technology, because the legal ambiguity in these areas is only going to grow. Ubiquitous unbreakable encryption is just a matter of time because the technology already exists. Edward Snowden may have dramatically accelerated strong encryption’s adoption, but it was only a matter of time before some encryption “killer app” brought strong encryption for both data transmissions and local data storage to the masses. These super-encryption tools were already growing in popularity long before Snowden came along and turned the global focus onto them. Some sort of legal clarity was going to be necessary sooner or later.
And let’s also be relieved that the recent 4th Amendment ruling signifies that the Supreme Court justices are keenly aware that changes in the scope and capacity of technology can necessitate significant rethinking of how society establishes the rules and safeguards for both the technology itself and how that ever-changing technology interfaces with our never-changing human situation of all having to live together under a uniform set of laws. It was a great, overdue ruling on the 4th Amendment.
But with tools like TrueCrypt and Tor becoming increasingly popular, let’s not be relieved about the fact that folks like Edward Snowden, Julian Assange, Jacob Appelbaum, and the rest of the Cypherpunk/Cyberlibertarian movement have largely seized control of the international debates over these issues. Balancing privacy, security, and technology is tough enough as it is, and it’s only going to get more and more complicated. That’s why you don’t want extremist ideologies dominating the debate. The Cypherpunks make many valid points when highlighting the dangers of a creeping technology-enabled surveillance state (it’s not hard). But Snowden and the Cypherpunks also casually dismiss or ignore the darker implications of the solutions they suggest.
If society wants to go down the path of adopting ubiquitous unbreakable encryption and tools that allow for layers and layers of “hidden volumes”, along with generous 5th Amendment interpretations that grant blanket immunity from forced decryption, well, ok, society should have the right to go down that path. And it might even be the best path overall. We’ll find out, because it’s kind of inevitable that super encryption goes mainstream. But we should at least be trying to predict the negative implications that come with going down that path, and you don’t see any real attempts to do that by the movements currently dominating the global debate. That’s precarious.
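For a sense of why those “hidden volumes” complicate forced decryption so badly, here’s a deliberately simplified Python sketch (stdlib only, nothing like TrueCrypt’s actual on-disk format; the sizes, names, and passwords are invented for illustration). The point is that a container padded with random bytes looks identical whether or not a second, password-keyed payload is buried inside it:

```python
# Toy sketch of plausible deniability -- NOT TrueCrypt's real design.
# A container of random bytes is indistinguishable from one that also
# carries a payload masked with a password-derived pad.
import hashlib, os

SIZE = 4096  # illustrative container size

def _pad(password: bytes, length: int) -> bytes:
    """Stretch a password into a masking pad of the requested length."""
    out, i = b"", 0
    while len(out) < length:
        out += hashlib.sha256(password + i.to_bytes(4, "big")).digest()
        i += 1
    return out[:length]

def _offset(password: bytes, payload_len: int) -> int:
    """Derive the hiding spot inside the container from the password."""
    h = hashlib.sha256(b"offset" + password).digest()
    return int.from_bytes(h[:4], "big") % (SIZE - payload_len)

def make_container(payload: bytes = b"", password: bytes = b"") -> bytes:
    """Random-filled container, optionally with a masked payload inside."""
    container = bytearray(os.urandom(SIZE))
    if payload:
        off = _offset(password, len(payload))
        masked = bytes(a ^ b for a, b in zip(payload, _pad(password, len(payload))))
        container[off:off + len(payload)] = masked
    return bytes(container)

def open_hidden(container: bytes, password: bytes, payload_len: int) -> bytes:
    """With the password, unmask the payload; without it, nothing stands out."""
    off = _offset(password, payload_len)
    chunk = container[off:off + payload_len]
    return bytes(a ^ b for a, b in zip(chunk, _pad(password, payload_len)))

secret = b"ledger of transactions"                 # hypothetical hidden data
with_hidden = make_container(secret, b"correct horse")
without     = make_container()                     # genuinely empty container

assert open_hidden(with_hidden, b"correct horse", len(secret)) == secret
assert len(with_hidden) == len(without) == SIZE    # same size, same "randomness"
```

Since both containers are just bytes with no detectable structure, a court can’t even establish that there is anything further to compel, which is what makes the “foregone conclusion” doctrine so hard to apply here.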
It’s true that Edward Snowden and the Cypherpunks say things like “not all spying is bad” and “we need both policy solutions and technical solutions”, but that’s about it. The rest of what Snowden has been advocating is largely a Cypherpunk agenda that makes policy solutions moot. Let’s take another quick look at Snowden’s suggestions at the SXSW festival:
Wired
Edward Snowden Urges SXSW Crowd to Thwart NSA With Technology
By Kim Zetter
03.10.14 | 3:48 pm
With lawmakers slow to pass legislation curbing NSA surveillance, it’s up to the technology community to step in and devise solutions that will better protect online communications from snoops, said Edward Snowden, speaking today from Moscow at the South by Southwest conference in Austin.
“[T]he people who are in the room at Austin right now, they’re the folks who can really fix things, who can enforce our rights for technical standards even when Congress hasn’t yet gotten to the point of creating legislation that protect our rights in the same manner…,” he said. “There’s a policy response that needs to occur, but there’s also a technical response that needs to occur. And it’s the makers, the thinkers, the developing community that can really craft those solutions to make sure we’re safe.”
The massive surveillance being done by the NSA and other governments has created “an adversarial internet,” he said, “a sort of a global free-fire zone for governments, that’s nothing that we ever asked [for]; it’s not what we wanted. It’s something we need to protect against….
“[T]hey’re setting fire to the future of the internet. And the people who are in this room now, you guys are all the firefighters. And we need you to help us fix this.”
One solution he highlighted, that would make it more difficult for the U.S. and other governments to conduct passive surveillance, is the implementation of end-to-end encryption that would protect communications from user to user, rather than as it’s currently done by Google and other services, which only encrypt the communication from user to service, leaving it vulnerable to collection from the service provider.
“End-to-end encryption … makes mass surveillance impossible at the network level,” he says, and provides a more constitutionally protected model of surveillance, because it forces the government to target the endpoints — the individual users — through hacking, rather than conduct mass collection.
...
“End-to-end encryption … makes mass surveillance impossible at the network level,” he says, and provides a more constitutionally protected model of surveillance, because it forces the government to target the endpoints — the individual users — through hacking, rather than conduct mass collection.”
That’s the claim made over and over by Snowden: if we all just implement end-to-end strong encryption, then the government will just target individual users “through hacking”. So it will be harder for the government to spy on individuals, but not impossible. But as we’ve seen, there’s really no way to “hack” strongly-encrypted locally stored data. Especially if it’s in a hidden volume that can’t be detected. And then there’s the fact that much of what Snowden’s leaks have revealed has been targeted surveillance methods.
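The distinction Snowden keeps drawing between service-side and end-to-end encryption can be sketched in a few lines (toy XOR cipher and illustrative key names only; real systems would use something like TLS for the link and a protocol like Signal’s for end-to-end):

```python
# Toy contrast between the two encryption models Snowden describes.
# Service-side: the provider decrypts the link and holds plaintext.
# End-to-end: the provider only ever relays ciphertext it cannot read.
import hashlib

def toy_encrypt(key: bytes, msg: bytes) -> bytes:
    """XOR the message against a SHA-256-derived pad (toy cipher only)."""
    pad = (hashlib.sha256(key).digest() * (len(msg) // 32 + 1))[:len(msg)]
    return bytes(a ^ b for a, b in zip(msg, pad))

toy_decrypt = toy_encrypt  # XOR is its own inverse

msg = b"meet at noon"
link_key = b"user-to-service-session"       # key shared with the provider
e2e_key  = b"shared-only-by-the-endpoints"  # key the provider never sees

# Service-side model (Google-style, per the article): the provider
# decrypts the link, so a demand served on the provider yields plaintext.
at_provider = toy_decrypt(link_key, toy_encrypt(link_key, msg))
assert at_provider == msg

# End-to-end model: the provider relays bytes it cannot decrypt, so
# the government must target an endpoint device instead of the middle.
relayed = toy_encrypt(e2e_key, msg)
assert relayed != msg
assert toy_decrypt(e2e_key, relayed) == msg
```

That’s the whole architectural difference behind “makes mass surveillance impossible at the network level”: the collection point in the middle simply has nothing legible to collect.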
Snowden’s words have enormous influence on these topics and, unfortunately, that means the global policy debate that needs to emerge in response to ubiquitous super encryption technology is starting off in a warped manner. We get endless debates over whether or not metadata collection helps stop ‘terror’ and yet, as we also saw above, it wasn’t terrorism that people were using strong encryption to carry out. It was everyday crimes. This isn’t just about terrorism and the abuse of government power.
So we really have to keep asking ourselves if the anti-NSA backlash is going to be used by folks with a libertarian agenda to weaken the government in ways that go far beyond ending bulk surveillance. If we accept the libertarian assumption that government simply can’t work, the kind of balance eventually struck on issues like the 4th and 5th Amendments may result in the kind of society where things like legitimate law enforcement increasingly can’t work either. Is that part of the agenda? It sure would fit the current anti-government fever afflicting an increasingly far-right GOP. Just imagine the kinds of corporate abuses that could be enabled by end-to-end encryption, “hidden volumes”, and the kind of 5th Amendment interpretation that views basically any forced decryption as a violation of the 5th Amendment.
These lurking dangers are one of the reasons why the Supreme Court’s 4th Amendment ruling was great but also only part of the overall solution to balancing privacy and security in the current technological environment. Now that strong encryption for the masses is becoming a reality, a 5th Amendment ruling on forced decryption is going to be needed too before we can really assess the new legal landscape. And as we saw above, that’s not an easy or obvious ruling...not nearly as easy as this 4th Amendment case. In fact, it looks pretty difficult. Is encryption like a strongbox or a wall safe? What a strange concept to have legal immunity hinge upon.
But another reason we need to be on guard against an anti-NSA backlash morphing into an attack on the legitimacy of government is that the ‘Little Brother’ surveillance state that everyone wants to live in — and it’s not just libertarians who desire that — might require a ‘Big Helpful Brother’ government for fixing the kinds of big problems that don’t get fixed on their own or by “the market” or charity. And that means *gasp* building a government you can trust and that’s empowered to get things done! Not the libertarian vision of a government that you can trust because it’s been systematically disempowered, but a real democratically elected government that doesn’t accept poverty or oppression in any form and doesn’t simply wait for the private sector to fix those problems.
We can’t rely on technology as a shield against bad policy or bad governments. If we’re going to get serious about addressing the weird and ever more exotic threats facing society, one of the most powerful tools for protecting our privacy is, quite simply, a highly competent society. Competent in the sense that it’s actively engaged in learning about the threats around it, emerging and existing, while also being sane enough to deal with those threats in a manner that doesn’t lead to some sort of nightmare situation. That’s how we protect our privacy most effectively: by identifying and solving the kinds of openly visible problems, like poverty and oppression, that encourage individuals to secretly engage in terrorism or harmful crimes. There’s simply going to be less danger to look out for the more we make a better world.
But we’re not going to be able to build that competent society capable of helping if the only governments we can trust are those without the power to harm. Government, it turns out, is a lot like technology: governments with the power to help can also hurt. Powerful governments aren’t inherently “good” or “bad”, as the libertarians assert; it depends on how they’re used. If you have a weak government, it may not directly harm you, but it’s not going to help either. Just like technology. This is why ensuring that we don’t protect our rights at the expense of a competent, helpful government is going to be increasingly important and challenging going forward. The simple fact that few entities are more empowered by technology than a government creates an impulse to disempower government as a form of civic self-defense. And that impulse is only going to grow with each technological advance that enhances that power. How we strike the balance between privacy and security without turning governments into either a beast or a worthless joke isn’t obvious. Maybe empowering criminals with super encryption tools and 5th Amendment rights is a reasonable price to pay to avoid the costs associated with government abuse? Or maybe it’ll foster a crime explosion? Maybe both. No matter which path is chosen, we’ll see the consequences. Eventually. But we’re not going to see all of the other possible paths forward if the Cypherpunk/Libertarian perspective continues to be the dominant perspective on these kinds of issues.
Enough With the Insane Insanity. Sane Insanity is Required
To some extent, if we really want to get serious about grappling with these mutually contradictory issues we, by definition, need to go somewhat insane in terms of our worldview. Insane in the sense that we really do need to hold multiple, mutually contradictory ideas in our minds simultaneously in order to grapple with them individually. Sane insanity. In other words, you can’t simply be a “privacy advocate” without being a “security advocate”. Privacy and security are intertwined because our lives are intertwined. I have to care about your security too if I really want to protect my privacy, and vice versa.
But you also can’t achieve that intertwined state by simply defining “privacy=security”, as we often hear from folks like Snowden or Assange. That just doesn’t make sense when “privacy” includes super encryption and “hidden volumes” and legal regimes that can potentially provide an incredible shield against legitimate law enforcement or national security tasks. At the same time, because reality is somewhat insane we can’t kid ourselves about the incredible dangers that could potentially arise from technologically enabled mass surveillance, especially crypto-mass surveillance (the Panopticon). Sane insanity is needed on a variety of topics and that need is only going to grow.
Terrified of a government with the power to track us all? Great. It’s a healthy sense of terror. Governments can become criminal. But also be terrified of a government that can’t really track or prosecute criminals, even when it’s important. So embrace the cognitive dissonance that comes with these issues. Embracing the technology-enhanced cognitive dissonance and lack of easy and obvious answers is the answer. That’s how the kinds of long-term solutions we need are going to be found and it’s a lot better than the alternative.
This story is extremely complex. If I understand it correctly it boils down to: not every decision that sounds good on its face is good. Also the powers that be are SOBs and the general public is screwed. I hope I am understanding correctly. Thank you.
@GK: Hehe, yeah that’s the gist of it, but with the added caveats that 1. the powers that be are both public and private, and 2. should we find that the powers that be are indeed SOBs that pose an unreasonable threat to privacy rights, the best defense the public has against those SOBs is replacing the public SOBs with non-SOBs that can keep the public and private SOBs from trampling everyone’s rights. But yeah, since some degree of human judgement by people in positions of power is required for a modern society to function, judicious use of our main TPTB SOB management tool (democracy) is required.
@GK: Here’s another story that’s a great example of an idea that might sound good on its face but may not be so great in practice: Russia is following the recommendations of Germany’s and Brazil’s governments that countries require companies like Google and Facebook to store their data locally. When it was Germany and Brazil making this pitch, it was typically characterized as a way for those governments to protect their citizens from the prying eyes of the NSA. But when Russia actually passes such a law, we get reminded that data localization laws also make the data much more likely to get spied on by the local government, and that might be a much bigger threat to your privacy than the NSA. As usual, the complexities of issues like this can remain obscured for a while, but not necessarily forever:
This is interesting: Edward Snowden just endorsed a web service provider specifically because it doesn’t retain the capacity to decrypt your data even when provided with a valid warrant:
Following the attacks on Charlie Hebdo’s Paris office, UK Prime Minister David Cameron drew a number of responses with his call for legislation to force UK internet service providers to make their encrypted customer data available to UK law enforcement. They tended to be rather negative responses, which is understandable given the controversial nature of the request. But they apparently weren’t all negative:
Wow, it sounds like the EU’s counter-terrorism chief is calling for pretty exactly what David Cameron wants, plus he’d like to see the EU remake the very same data-retention laws that the EU parliament scrapped last year. And a new EU internet monitoring unit.
So it’ll be interesting to see how that proposal goes over with the public. But it will also be interesting to see what Edward Snowden has to say about these proposals. He’s obviously not going to be in favor of the EU counter-terrorism chief’s recommendations, but he’s also on record saying things like “not all spying is bad”, so how harshly will he respond to the EU’s new plans? Might he call for the abolishment of intelligence agencies? It’s possible:
Well, that’s one way to prevent government spying: eliminate intelligence agencies altogether, because spying is something only developing countries engage in, and instead conduct any warrant-approved wiretapping through law enforcement agencies. The police can tap Putin’s phone after they get a warrant.
This proposal raises a number of fascinating questions, including about Snowden’s views on the militarization of law enforcement. But it’s also worth noting that Snowden’s views on this topic sound somewhat similar to the proposals of prominent security expert Bruce Schneier, though not exactly the same, so it would be very interesting to see where they diverge. Transferring all domestic intelligence gathering to the FBI was something Schneier recommended last year. At the same time, Schneier was also advocating that all foreign cyberattacks and targeted hacking be conducted by the military and that foreign spying be officially considered an offensive military act:
Keep in mind that when Bruce Schneier describes the targeting of individuals by the “Tailored Access Operations” (TAO) group as “the best of the NSA and is exactly what we want it to do”, that’s the opposite of what Wikileaks-hacker Jacob Appelbaum was suggesting during his keynote address to the 2013 Chaos Communication Congress. Appelbaum, who has a large cache of Snowden documents himself and has written extensively about them in Der Spiegel, spent the entire talk showing one example after another of the TAO’s tools and discussing how horrible it was that the NSA had these tools at its disposal, because their existence means anyone could potentially have them used against them.
This reflects a largely unspoken divide in the security community: Schneier seems to be acknowledging that targeted surveillance is fine, just not mass surveillance. Appelbaum, on the other hand, appears to view targeted surveillance as effectively just as bad because, hey, they could target everyone, including via the NSA’s capacity to arrange for computer manufacturers to target specific people’s computers with built-in hardware or software changes that make only those computers vulnerable. Appelbaum is essentially a “no spying at all by governments or anyone” advocate.
Then there’s the intertwined nature of targeted surveillance techniques and mass-surveillance capabilities in a world where everyone is using the same technology platforms from different locations on different hardware. For instance, if you’re targeting someone or some group, you might need the capacity to intercept and analyze, at least at a metadata level, a flood of data in order to find your targets’ communications. Appelbaum would clearly prefer that no communications get targeted, ever. It’s unclear how Schneier would reconcile this inherent conflict: could mass-surveillance capabilities be used for the purpose of targeting people with a warrant if all non-targeted data were filtered out?
Continuing...
Ok, so according to one of the world’s most prominent security experts we should:
1. Break up the NSA, end bulk-surveillance methods, and shift targeted surveillance missions and cyberwarfare capabilities to US Cyber Command so that any Stuxnet-like actions or things like hacking a Belgian telephone company are seen as offensive military actions.
2. Shift all domestic surveillance to the FBI, presumably a reference to the warrantless wiretapping program started by George W. Bush in the wake of 9/11. This reform has already kind of, but not really, happened. In principle, it’s certainly a worthy goal to try to redraw the line between domestic and foreign surveillance, although given that much of the uproar over NSA spying has to do with the fact that you can’t really disentangle foreign and domestic communications given how the internet is structured, it’s unclear how successful this would be.
3. Reprioritize the NSA so that, when the inevitable conflicts emerge in its mutually exclusive missions (securing networks while simultaneously trying to break them) the “securing the networks” priority wins. That’s, well, it’s ambitious. As the saying goes, “the best defense is a great offense”, but could the best defense actually be a great defense in the realm of national security when you have to not only protect your digital infrastructure but also spy on adversaries to learn about other stuff going on? That seems to be what Schneier is arguing.
So suggestion 2, getting the NSA out of domestic surveillance, seems pretty reasonable, albeit technically challenging. But what about suggestions 1 and 3? Should government hacking of other nations’ telecom firms, which is ubiquitous these days, be considered an act of war? Is that going to lead to a safer world? It will place a different context on spying that gets publicly outed, but is it a better context? And what happens if other nations don’t all agree to this new approach? Should their hacks of US firms now be considered military actions too? Schneier isn’t clear on that, although he has called for some sort of globally run anti-surveillance enforcement agency:
Note that Schneier wrote that “it is unclear how effective targeted surveillance against ‘enemy’ countries really is. Even when we learn actual secrets, as we did regarding Syria’s use of chemical weapons earlier this year, we often can’t do anything with the information.” So it would appear, based on that statement, that Schneier is open to policies that effectively eliminate targeted surveillance in addition to bulk data collection.
Skipping down...
There’s a lot to digest in that piece but note this part at the end:
Yes, in some senses surveillance abuses do share some of the challenges with nuclear, chemical, and biological weapon non-proliferation, small arms trafficking, human trafficking, etc. But isn’t surveillance, at least targeted surveillance, also part of the solution to nuclear, chemical, and biological weapon non-proliferation, small arms trafficking, human trafficking, etc?
So do we break up the NSA and place all spying under the auspices of the FBI, even spying on foreign leaders? Should we instead transfer all domestic spying to the FBI and then declare all foreign surveillance a military act, regulated by “a coalition of free-world nations dedicated to a secure global Internet”? Or will we end up attempting to legislate backdoors and legal access like the EU’s counter-terrorism chief is calling for?
These are just some of the issues swirling around the question of how to handle the rollout of ubiquitous, end-to-end strong encryption. If you aren’t familiar with this emerging public debate yet, you will be eventually, because these issues aren’t going away any time soon. Untangling a global Mexican standoff that’s been going on since the dawn of civilization isn’t as easy as you might expect. It’s going to take a while.
Mark Ames has a recent piece that points us towards a rule in the 1986 Electronic Communications Privacy Act that is both surprising and not surprising: It’s surprising because, wow, it’s kind of amazing that the US government has the right to read your emails over 180 days old without a warrant and yet this fun fact really hasn’t made it into the national discourse over the nearly two years since the Snowden Affair started.
At the same time, it shouldn’t really be surprising at all since it’s been the law since 1986:
Well that was some fun history. And keep in mind that Patrick Leahy really is probably one of the most reliable defenders of civil liberties in the Senate, so the 1986 Electronic Communications Privacy Act and the 1994 Digital Telephony Bill would probably both have been a lot worse had someone else crafted the legislation. To put it another way, the outcome of these bills has a lot more to do with the larger national security state and the immense influence it wields than with any given Senator. It’s a harsh reality highlighted by the role played by both the ACLU and the EFF in crafting and endorsing both bills. When two of the most prominent organizations associated with defending digital civil liberties turn out to have major corporate backers and helped write the laws that established warrantless access to old emails in 1986 and “let’s-just-wiretap-everyone” laws in 1994, it’s pretty clear that “digital due process” hasn’t been very likely to get the consideration it’s due for a long, long time.
Also keep in mind that, although accessing emails older than 180 days may not require a warrant, the emails need to at least be associated with some sort of investigation. But the bar is still a lot lower than a warrant:
Note that the coalition of technology companies mentioned in the article that are lobbying to change the law to require a search warrant is the same “Digital Due Process” organization Mark Ames discussed above, which was started in 2010 by Jerry Berman (who helped write both the 1986 and 1994 laws). Also note that the “News” page for “Digital Due Process” hasn’t been updated since April 2013, a month before the start of the Snowden Affair, which is kind of curious all things considered.
So it’s pretty clear that, while warrants aren’t currently necessary for rummaging through your old emails and phone records, that might change if the new “Email Privacy Act” becomes law. And yet, as Mark Ames pointed out above, while the new “Email Privacy Act” bill being debated in Congress would indeed require a warrant for accessing old emails, it still leaves metadata open to warrantless access.
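To make the 180-day line being debated here concrete, here’s a toy sketch of the arithmetic. Nothing below is real legal procedure (the function name and dates are invented for illustration); it just shows how stark the cutoff is: the same email crosses from “warrant required” to “subpoena suffices” simply by sitting on a server for half a year.

```python
from datetime import date, timedelta

# The 1986 ECPA threshold discussed above (illustrative constant name).
ECPA_WINDOW = timedelta(days=180)

def warrant_required(email_date: date, today: date) -> bool:
    """Under the pre-reform ECPA reading: mail 180 days old or newer
    needs a probable-cause warrant; anything older can be obtained
    with lesser legal process."""
    return (today - email_date) <= ECPA_WINDOW

today = date(2015, 6, 1)
assert warrant_required(date(2015, 3, 1), today)      # ~3 months old: warrant needed
assert not warrant_required(date(2014, 1, 1), today)  # well past 180 days: no warrant
```

Note that, as the article points out, even the “no warrant” path still requires the emails to be tied to some sort of investigation; the sketch only captures the age cutoff itself.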
Now, putting aside the question of whether or not some government agencies should have warrantless access to metadata, it’s really quite remarkable that another industry-funded group led by the same man, Jerry Berman, is once again helping craft the laws that define our digital privacy rights. But it’s even more remarkable that this new “Email Privacy Act” is going to allow warrantless metadata access and there’s so little attention being paid to it, especially since the bill is endorsed by an array of industry-backed groups like the ACLU, EFF, and Digital Due Process. That seems like a big story!
This is part of the reason it’s unfortunate that the post-Snowden debates are taking place in a larger political environment where the most interested parties aren’t really interested in a debate. For the most part, we hear from:
1. National security hawks that view enhanced digital due process as an unaffordable luxury in an age of terrorist networks, rogue states and super weapons, and see no reason for the public to be concerned about the potential abuses of vast and growing surveillance capabilities.
or
2. Libertarians and Cypherpunks that have already written off the possibility of effective internal safeguards in government bureaucracies and see strongly encrypting everything as the only feasible solution to government surveillance abuses.
The much less tantalizing discussions about how to create rules that both security hawks and privacy activists can live with, in the event that Congress doesn’t live up to either side’s dreams and writes compromise legislation (like the Email Privacy Act), haven’t really taken place in the national dialogue. For issues like metadata collection, the privacy activist community has focused on developing technological platforms like Tor that make metadata collection impossible and on improving security protocols to prevent government hacking, whereas the security hawks see the whole issue as trivial.
This dynamic of either dismissing privacy concerns outright or focusing on technology that blocks all surveillance completely leaves little discussion of which parts of our digital selves should fall into the gray area: data that should be available, but only with a warrant or some other safeguard (and how to implement those safeguards effectively and transparently). It’s all part of the larger trend of the mainstreaming of Libertarian/pseudo-anarchist/anarcho-capitalist thinking that revolves around abandoning the idea that we can create a government by, of, and for the people that doesn’t proceed to abuse the people.
It’s an unfortunate situation because we now appear to be facing a rewrite of the digital privacy laws that doesn’t extend 4th amendment rights to metadata, and the robust debate over how to handle exactly this kind of situation hasn’t really happened. Instead, we get something like “If the government will not be stewards of our rights, we can encode our rights into our system.”:
That’s right, Students for Liberty’s backers aren’t limited to the Koch brothers, Google, and other big corporate sponsors. The Students for Liberty’s board of advisors includes Prince Michael of Liechtenstein. But note that he’s #38 in the line of succession, so he’s an everyman prince, not some highfalutin prince.
Continuing...
“When McCobin’s group gave their award to Peter Thiel, their “west coast director” described Thiel on stage as a “personal role model of mine.””
Well, doesn’t “Students for Liberty” sound nice! Yes indeed:
Yikes. Well, in defense of Students for Liberty, they could debatably be worse!
Still, it’s pretty clear that when you ask the question “who does Students for Liberty fight for?” the answer is “the Koch brothers and other Libertarian oligarchs”. And that’s one of the nice things about a group like Students for Liberty: they’re pretty transparent. The corporate connections to organizations like the EFF and Digital Due Process, which are also heavily backed by Silicon Valley, aren’t nearly as obvious. And now we find ourselves in a situation where these industry-backed groups are pushing a major overhaul of digital privacy rules that it seems like they should be opposing based on their stated principles and goals.
It’s all a reminder that, even to this day in the US, the biggest organizations fighting to protect your digital data from Big Government were, themselves, organized and financed by Big Tech...Big Tech that is increasingly interwoven into the military industrial complex. Might there be a conflict of interest here? It seems possible.
Here’s another example of why future Supreme Court rulings on strong encryption and the 5th Amendment are going to be very closely watched cases:
More trouble for Uber around the corner in Canada? We’ll see...
Oh look, a consortium of 14 mega-banks has privately developed a special super-secure inter-bank messaging system that uses end-to-end strong encryption and permanently deletes data. It’s so super-secure that financial regulators are wondering if they’ll actually have access to the data:
Yes, the usual suspects for financial high crimes have a brand new messaging system with a fun “permanent deletion” feature and end-to-end encryption that presumably no one can break. What could possibly go wrong? Well, according to Symphony’s backers, nothing could go wrong, because all the information that banks are required to retain for regulatory purposes is indeed retained in the system. Whether or not regulators can actually access that retained data, however, appears to be more of an open question:
“So, the questions remain, will Symphony be building in some sort of back-door access for regulators? Or will it just be storing that information required of regulators, but for its clients’ use?”
As we can see, many regulatory questions remain. So let’s hope that includes questions like, “If the banks have an unbreakable inter-bank messaging system that regulators can’t access, aren’t they going to be able to do exactly what they did with the massive ‘LIBOR’-rigging conspiracy, but with no electronic paper trail?” It’s an important question:
“Entry into the chat room was coveted by nonmembers interviewed by Bloomberg News, who said they saw it as a golden ticket because of the influence it exerted.”
So it sounds like a big question going forward is whether or not Symphony is going to double as a super-secure ‘golden ticket’ trading platform too. Hmmm...how’s that going to work out...
*gasp* You don’t say...:
Note that when you read:
that, yes, it’s certainly possible that reducing the costs of Bloomberg’s messaging system could be a factor in Wall Street’s decision to develop its own end-to-end encrypted messaging system that can delete data before the government can see it, in addition to a desire to retain maximum control over its data. But also keep in mind the obvious: the desire to maintain control over data that regulators might be interested in reviewing is also all about the money:
As we can see, the new messaging system, built by and for the industry with tens of billions of dollars in fines over the past five years and a proven track record of living by the “If you ain’t cheating, you ain’t trying” philosophy, will offer fun features like “real-time monitoring” of chat rooms — so it can presumably work with Wall Street’s new prole-precog systems that monitor the activities of employees and use artificial intelligence to sniff out wrongdoing from emails and chats (so regulators don’t have to be burdened with the task of regulating *wink*). Presumably we’re to assume that the banks’ compliance chiefs will actually end the illegal activities, and not simply tell the offending employee to stop using language that sets off the AI and then permanently delete those messages. Isn’t that helpful.
And then there’s this helpful control-oriented feature:
*******
‑Knock, Knock
-Who’s there?
-A regulatory agency that would like to see your traders’ messaging activity, but which is totally not interested in investigating wrongdoing at your financial institution, so don’t, like, delete anything or something like that.
-Oh, ok, let us get those messages for you. We have nothing to hide.
-Thanks. Hey, why are so many of these messages deleted? Oh well, it looks like there definitely won’t be an investigation now.
-Oh dear, we’re really sorry to hear that. LOL!
***
Worst. Joke. Ever.
Ok, that’s not true. Jokes can get far worse.
Mark Ames has a new piece on the federal bribery investigation involving Ron Paul’s 2012 campaign that’s threatening to implode Rand Paul’s flailing 2016 presidential ambitions: Two of Rand Paul’s top aides, Jesse Benton and John Tate, recently pleaded not guilty to bribing the influential former Iowa state senator Kent Sorenson. So we’ll see how that investigation goes, but interestingly, the Paul team’s defense is getting some help from a rather unexpected source: Google. When federal investigators issued a warrant for Jesse Benton’s gmail account last year, Google notified Benton of the warrant, Benton’s lawyers appealed it, and Google has refused to turn the emails over until a court resolves the issue. This is Google’s standard practice, so that, in and of itself, is not exactly suspicious. But as Ames points out, we are getting into rather interesting territory here since Google has been a major donor to both Ron and Rand Paul:
“The next day, December 29, Benton & team had Sen. Sorenson issue a defiant statement that basically said, “You think I get paid for my principles? Wait till you see the FEC filings, then you’ll see that Bachmann is a liar and no one’s paying me anything, by gum!” And then Sorenson and the Paul capos proceeded to forge their FEC filings to funnel their payments to Sorenson through a pair of dummy front companies. Not exactly the sharpest conmen, but brains aren’t much of a requirement for success as a con artist. An empty conscience, some cunning, and the stupid sense that you and your testicles are smarter than everyone else—those are much more important qualities.”
Well that explains a lot. And it also raises the question of just what the Paul team’s testicles are recommending at this point. Hmmm....how about trying to turn this investigation into a rallying cry against government overreach:
With a court recently ruling that the FBI could indeed search Benton’s gmail account, and Google continuing to refuse access as a showcase of its dedication to its users’ privacy, it’s sure looking like we could see Google and Benton join hands in trying to spin this into a case of the government “trampling” Benton’s rights. And as Hanni Fakhoury with the Electronic Frontier Foundation points out, courts have not yet settled the question of how specific or broad email search warrants should be, and this case is one of the most prominent illustrations of how users can fight back:
Half a million emails is quite a large number, but then again, this is the email account of one of the key staffers for a presidential campaign. Plus, the emails are from 2011–2014, which makes sense for an investigation into a bribery scandal involving a 2012 presidential campaign, although it’s not unimaginable that a case could be made for narrowing that time-frame.
It all raises a gut-wrenching possibility: On the one hand, if the FBI’s warrant really was overly broad and subsequent rulings agree that it was, Jesse Benton and the Pauls sort of get to claim victory and possibly kill the investigation, although it’s possible that the FBI could still narrow the search warrant and get the evidence it needs. On the other hand, if it turns out the FBI’s warrant really was overly broad but subsequent court rulings find otherwise, the case could set a precedent that basically gives the government access to your entire email history in all sorts of other criminal cases that don’t involve sleazy pols bribing each other, even when that full email history isn’t remotely needed or relevant. So we really have to hope that the FBI was being aggressive (because these were sleazy pols bribing each other, which is disgusting) but not too aggressive, because otherwise the Paul clan’s bribery scandal ends up becoming a case of ‘the little guy vs the Big Bad Government’ regardless of the outcome.
It’s also a reminder that the array of new questions related to the 4th and 5th amendments and privacy rights aren’t limited to topics like whether or not Apple or Google can make smartphones with unbreakable encryption and under what conditions someone should be forced to hand over their password. In this case, the power to turn over that information isn’t in the hands of Benton or some unbreakable-encryption smartphone user who’s the only person with the password. It’s in the hands of Google, and now Google appears to be willing to defy court orders and risk contempt of court charges, possibly as some sort of corporate branding scheme. And until we get some sort of resolution on the case, it leaves open a number of legal questions over what happens if incriminating evidence is co-mingled with massive amounts of personal data that almost assuredly has nothing to do with any investigation.
So if you’re planning on bribing some politicians, keep in mind that it’s not yet clear whether you should be using an unbreakable super-encryption phone, where even Google or Apple can’t access the content and you might be able to plead the 5th Amendment, or just stick with Google services like gmail where, in the event of a warrant, Google’s corporate legal team suddenly becomes your legal team. Choosing the right smartphone package for your campaign’s political operatives just got a lot more complicated.
It looks like Rand Paul’s campaign aide Jesse Benton, who was under FBI investigation for bribing Iowa state senator Kent Sorenson in 2012 as part of Ron Paul’s presidential campaign, is out of options. A US District Judge just ruled that Google must turn over the emails, and that if Benton wants to appeal he can do it later. And Google is agreeing to comply with the ruling and turn over the emails:
Well, it’ll be interesting to see what they find.
It’ll also be interesting to see if we don’t start seeing some of the contents of those emails showing up in the media as a result of leaks by Benton himself while this entire investigation is ongoing. Why? Because that’s sort of what they were planning on doing to the bribed Iowa state senator, Kent Sorenson, if he didn’t agree to accept the bribe. And it’s something government prosecutors are specifically worried about in their current investigation:
Yes, Benton was threatening to blackmail the guy he was bribing, but according to his lawyer it was just because Benton was concerned that Sorenson was trying to blackmail Ron Paul’s campaign:
They probably weren’t going to leak that email.
It looks like Symphony, the new strongly-encrypted messaging system made by and for Wall Street’s ‘usual suspects’, has a strategy for assuaging its critics: turn this into a fight about Big Government and the invasion of privacy while touting how it will keep all those text messages nice and safe from hackers so regulators can access them when they request them:
Yes, the CEO of Symphony really made this argument:
So people are concerned about the banks they perceive as ‘bad actors’ using a system that requires trust in those bad actors, because it’s the bad actors, and not Symphony, that control access to the encrypted messages. And Symphony’s argument is not to worry, because the bad actors, who happen to be investors in Symphony, don’t actually have any regulatory oversight over the company. Well that sure makes all those concerns just melt away!
And, of course, as an anonymous banker involved in the project points out, “It’s not Symphony’s responsibility to make the data available — it’s the bank’s responsibility”:
And that’s all part of why it’s going to be very interesting to see how much Wall Street attempts to turn this into a ‘private sectors vs Big Brother’ policy debate. Because if Wall Street’s ‘bad actors’ are the keepers of the keys, the best way to generate public support for that system is to make the entities that might want those keys (the government and hackers) seem even worse:
“If federal regulations were to require Symphony to change its encryption policy and store keys so that it could provide government agencies with access to messages, those rules would logically also apply to other messaging applications that use encryption, including WhatsApp, Facebook and Apple iMessage, Gurle said.”
Wall Street’s love-hate relationship with the Cypherpunk revolution is about to get a lot more loving.
Symphony, Wall Street’s fancy new bank-to-bank messaging system that sports super-encryption even the government can’t break, just went live. And it did so only after coming to an agreement with the New York Department of Financial Services over concerns that Symphony’s clients were going to be hiding incriminating evidence from regulators: Symphony agrees to keep copies of client messages for seven years. Additionally, four banks — Goldman Sachs, Deutsche Bank, Credit Suisse and Bank of New York Mellon — that are both investors in Symphony and, in most cases, perpetrators of the giant Libor-rigging cartel arranged via a chat system, also have to give copies of their encryption keys to an independent custodian.
So with those safeguards in place, Wall Street’s controlled information black hole is now a reality:
Note that, while this is certainly a major victory for Symphony, it’s also just the New York state financial regulator that gave Symphony the green light. Federal and international regulators have yet to weigh in:
Also note that, while four of Symphony’s investors have agreed to comply with additional rules mandating that copies of their encryption keys be kept with an independent entity, that still leaves ten other large financial entities (Bank of America, BlackRock, Citadel, Citigroup, Jefferies Group LLC, JPMorgan Chase & Co, Maverick Capital Ltd, Morgan Stanley, Nomura Holdings Inc and Wells Fargo & Co), many with highly questionable regulatory track records, that are presumably going to be using Symphony too, just not under the regulatory authority of the DFS.
Will those other large firms also agree to additional scrutiny and oversight as Symphony’s debut gets underway? We’ll have to wait and see, but it’s worth noting that DFS “believes that the requirements included in today’s agreements should apply to all regulated financial institutions using Symphony in the future”:
So it’s unclear whether or not those 10 other Symphony investors will comply with the additional encryption-key rules, but DFS certainly thinks they should, and that the rules should also apply to all financial institutions using Symphony in the future. As Symphony ushers in the era of black hole digital record keeping for Wall Street this week, that gap between what regulators see as necessary and what Wall Street is actually doing is something worth keeping in mind.
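The escrow arrangement DFS negotiated can be sketched abstractly. The following is a toy model, not Symphony’s actual design: the XOR “cipher” is a stand-in for real cryptography, and names like `Custodian` are invented for illustration. The point is the control flow the agreement describes: the bank encrypts, a copy of the key sits with an independent custodian, and a regulator gets plaintext only by presenting legal process to the custodian rather than trusting the bank.

```python
import secrets

def xor(data: bytes, key: bytes) -> bytes:
    # Toy stand-in for real encryption; never use this for anything real.
    return bytes(b ^ k for b, k in zip(data, key))

class Custodian:
    """Hypothetical independent key-escrow agent."""
    def __init__(self):
        self._keys = {}

    def deposit(self, msg_id: str, key: bytes):
        self._keys[msg_id] = key

    def release_to_regulator(self, msg_id: str, has_legal_authority: bool) -> bytes:
        if not has_legal_authority:
            raise PermissionError("no valid legal process presented")
        return self._keys[msg_id]

custodian = Custodian()

# Bank side: encrypt a message and escrow a copy of the key.
plaintext = b"fix the benchmark at 11am"
key = secrets.token_bytes(len(plaintext))
ciphertext = xor(plaintext, key)
custodian.deposit("msg-001", key)

# Regulator side: recover plaintext via the custodian, not the bank.
released = custodian.release_to_regulator("msg-001", has_legal_authority=True)
assert xor(ciphertext, released) == plaintext
```

The design choice worth noticing: with the custodian in the loop, the bank can no longer quietly “lose” the key, which is exactly the failure mode regulators were worried about with the other ten investors.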
If you’re very patient and a fan of Tor but find it lacking in cryptographically-provable anonymity, it’s good you’re patient, because a much slower, but more secure, version of Tor is coming at some point:
And in case you were curious if Dissent, like Tor, is funded by DARPA....and...*drum roll*...let’s take a look at the Dissent project acknowledgements:
Just FYI, if you can say you’ve never been a victim of some sort of ‘ransomware’ attack, that’s great, but it’s going to be a lot harder to say that in the future once the ransomware bonanza begins, now that it’s known that a single entity may be behind the massive $325 million CryptoWall 3.0 ransomware racket:
“This week the FBI shocked no one in the security industry by recommending businesses just pay the criminals.”
That’s the bad ransomware news this week. But there has been some recent good news. If you’re a victim of the ‘CoinVault’ ransomware scam and you haven’t yet paid the ransom but would still like to decrypt your files, there’s a new free tool you should learn about:
“The announcement that all CoinVault encryption keys have been obtained comes after last month Dutch police reported arresting two individuals suspected of using this piece of ransomware to infect computers around the world.”
Yep, it appears that it was just two Dutch hackers behind the global CoinVault scam. Oops. But if they hadn’t been caught, they would presumably still be out there ransoming people’s data.
It’s all a reminder that the global nature of the internet, while awesome in many ways, also creates a rather tempting target for digital criminals, since their potential list of crime victims now includes everyone on the planet with an internet connection.
It’s also a reminder not to get too ‘click-happy’ with your email attachments.
Digital communications technology, like all technology, is a double-edged sword. But here’s a reminder that digital communication technologies in particular are, somewhat ironically, a double-edged sword that society has a hard time actually talking about:
“I don’t want a back door. ... I would like people to comply with court orders, and that is the conversation I am trying to have.”
Keep in mind that when FBI director Comey says the above, it’s basically a nonsense statement, since the ability of companies to comply with those court orders would require the companies themselves to have a ‘back door’, which clearly isn’t the case when dealing with the types of phones used by the San Bernardino terrorists. But also keep in mind that Comey is using the same argument frequently used by privacy activists in defense of strong encryption without ‘back doors’, who proclaim that authorities should just get a warrant. It’s a reminder that there’s a LONG way to go before society arrives at some sort of consensus, even a temporary consensus, regarding the proper balance to strike with these types of technologies, since the debate being fed to the public at large on all sides is still largely incoherent.
With that in mind, here’s an article about a recent Harvard study that found that law enforcement’s claims that strong encryption is hindering their investigations are wildly overblown. While that might sound like the kind of finding that would please privacy activists, they probably won’t be super enthusiastic about the rest of the study’s findings:
“The report conceded that the increased availability of encryption products impedes government surveillance under certain circumstances. But it also concluded that the burgeoning market for Internet-connected devices will “likely fill some of these gaps and...ensure that the government will gain new opportunities to gather critical information from surveillance.””
That’s right, according to the study, the Internet of Things is going to fill our world with so many networked devices stuffed with all sorts of sensors that we basically don’t need to worry about strong encryption blocking investigations: there are going to be plenty of surveillance alternatives beyond our personal computers and smartphones. Law enforcement is simply going to start turning the Internet of Things into a new spy network:
Now, it’s unclear if being able to turn internet-connected refrigerators into wifi spying devices would be of any help at all in the San Bernardino investigation. But it does highlight one of the disturbing aspects of the digital future: concerns over your personal digital privacy are going to be a lot harder to adequately address when we’re all immersed in a sea of potential spy devices. And if law enforcement can use this growing infrastructure, you can bet the personal data collection industries are going to be using those capabilities too.
So the overall message of the study appears to be “strongly encrypt without back doors all you want, you’re still going to be highly surveillable, because there’s no possible way all these new devices are also going to be strongly encrypted and unhackable too.” And that’s probably going to be the case for a large number of internet-connected devices, unless consumers are willing to pay extra for the hack-proof, strongly encrypted internet-connected toaster.
Still, that’s just the status of things today. As homes become “smart homes” and all our devices get hooked up together in one big network, your hackable toaster could end up being a gateway into the rest of your devices and an avenue for serious damage (imagine devices that could be hacked to overheat and start a fire). So who knows: following a few Internet of Things mega-hacks that cause widespread physical damage, turning internet-connected devices into untamperable and unhackable black boxes might become standard operating procedure. And given the way technology seems to develop, super-damaging Internet of Things mega-hacks seem like one of those events that’s basically an inevitability and probably a future New Normal.
So enjoy the temporary quasi-mootness of the debates over strong encryption and ‘back doors’ while we enjoy the explosion of the Internet of Things. Let’s just hope that process involves mostly just spying and doesn’t include too many actual explosions.
Reminiscent of Uber’s previous attempt to remotely encrypt corporate data before Canadian investigators could seize it, Uber just gave us another example of how encryption is going to help the companies that behave like Uber unfortunately continue behaving like Uber.
How so? Well, in this case, Uber hired a private intelligence firm, Ergo, to investigate Andrew Schmidt, a labor lawyer suing it for anti-trust violations. And, as Uber is prone to do, the investigation was so over-the-top, bordering on fraud, that a judge ruled Uber has to turn over its communications with Ergo to the plaintiffs.
But here’s the catch: those communications almost all took place via Wickr, an encrypted chat app that auto-deletes messages after a set period. So now there’s no way to establish whether or not Uber approved of Ergo’s illegal tactics. Oh well:
“The implications go far beyond a single case. Uber is currently litigating 70 different federal lawsuits, which range from accusations of wage theft to fundamental questions of worker classification. Any one of those cases could be a tempting target for third-party research firms like Ergo. According to a sworn deposition from an Ergo employee, this was the fourth time Uber hired the company for research, although it’s unclear whether the other cases involved an active trial. Given the volume of cases against Uber and the routine way in which the investigation was assigned, it’s plausible the company was contracting with other research firms.”
Yeah, we probably shouldn’t be super shocked if it turns out that Uber’s been hiring Ergo to fraudulently dig up dirt on Uber’s many plaintiffs. Of course, given the apparent secrecy involved in Uber’s communications with Ergo, we also shouldn’t be super shocked if we never find out who the targets of those other investigations were, or at least what was investigated. Thanks to fun encrypted chat apps like Wickr that allowed Uber and Ergo to “avoid potential discovery issues”:
“Presented with a court-mandated discovery order, Uber provided decrypted versions of the PGP emails, but the Wickr conversations have proven to be more of a challenge. Although email records show Henley exchanging Wickr screen names with Ergo executives, Henley denied directly communicating over the service in a sworn deposition. Given Wickr’s automatic deletion system, that claim is impossible to disprove.”
Self-deleting encrypted messaging services for corporate communications. That sure is convenient! And almost certainly the future. At least the future of corporate communications involving content that could raise “potential discovery issues”.
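To see why those Wickr messages are simply gone rather than merely hard to produce, here’s a minimal illustrative sketch of the time-to-live (TTL) deletion model that ephemeral messaging apps use. This is not Wickr’s actual implementation (which is proprietary); the `EphemeralStore` class and its 60-second TTL are hypothetical, chosen just to show the mechanic: once a message’s expiry passes, the record no longer exists to be subpoenaed.

```python
import time

# Hypothetical sketch of TTL-based ephemeral messaging, NOT Wickr's
# real (proprietary) implementation. Messages carry an expiry time;
# expired messages are purged, so nothing remains for discovery.
class EphemeralStore:
    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self.messages = []  # list of (expiry_timestamp, text)

    def send(self, text, now=None):
        now = time.time() if now is None else now
        self.messages.append((now + self.ttl, text))

    def read(self, now=None):
        """Purge anything past its expiry, then return what's left."""
        now = time.time() if now is None else now
        self.messages = [(exp, t) for exp, t in self.messages if exp > now]
        return [t for _, t in self.messages]

store = EphemeralStore(ttl_seconds=60)
store.send("meet at noon", now=0)
print(store.read(now=30))   # ['meet at noon']  (still within the TTL)
print(store.read(now=120))  # []  (expired and purged; nothing to hand over)
```

The key point for the discovery fight: with a scheme like this, “turn over your Wickr logs” is an order the recipient may be genuinely unable to comply with, whether or not they acted in bad faith.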
We’ll see if Uber can use encryption to dodge another legal bullet. Either way, it seems like a given that investigations into corporate wrongdoing are going to become less and less feasible as corporations learn about all the great legal features that come with systems like Wickr and make a habit of using them.
But at least now we know that if you file a lawsuit against Uber, you probably want to have a chat with your friends and colleagues about what they should say when they suddenly get random inquiries about you ‘for a project profiling up-and-coming lawyers in the US’. Having your friends and colleagues inform the mystery caller about the judge’s findings in this current case against Uber and the legal implications of fraudulently investigating a plaintiff is one possible approach. There are others...
Here’s a new technological twist to the 5th Amendment conundrums raised by ubiquitous unbreakable encryption on ubiquitous personal information gathering devices (smartphones): a federal judge reportedly issued a secret order for a defendant, accused of prostituting underage girls against their will, to unlock his iPhone using his fingerprint. While the Supreme Court has yet to clarify the 5th Amendment issues associated with ordering defendants to unlock devices using some sort of biometric method, that’s still assumed to be more likely to be constitutional than ordering someone to give up their passcode (the “strongbox with a key” vs. “wall safe with a combination lock” legal scenarios). So this order may or may not be constitutional. A Supreme Court ruling is going to be required to settle the issue.
But in this case, ordering the defendant to use his fingerprint to unlock the phone didn’t end up unlocking the phone. Why? Because iPhones set up to use fingerprint scans instead of a password automatically require a password if the phone hasn’t been unlocked for at least 48 hours. At that point, the phone effectively has both a “strongbox” and a much more constitutionally protected “wall safe” protecting its contents.
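The policy described above can be sketched as a simple credential check. This is a hypothetical illustration, not Apple’s closed-source implementation; the 48-hour figure comes from the reporting quoted below, which states that after that window both the password and the fingerprint are required.

```python
# Hypothetical sketch of the unlock-policy logic described in the
# reporting, NOT Apple's actual (closed-source) implementation.
# The 48-hour cutoff is the figure cited in the quoted article.
PASSCODE_TIMEOUT_SECONDS = 48 * 60 * 60

def required_credentials(seconds_since_last_unlock):
    """Return which credentials the device demands to unlock."""
    if seconds_since_last_unlock >= PASSCODE_TIMEOUT_SECONDS:
        # The "wall safe": a compelled fingerprint alone no longer works.
        return ("fingerprint", "passcode")
    # The "strongbox": a compelled fingerprint still opens it.
    return ("fingerprint",)

print(required_credentials(60 * 60))       # ('fingerprint',)
print(required_credentials(72 * 60 * 60))  # ('fingerprint', 'passcode')
```

The legal significance sits entirely in that one branch: which side of the timeout the phone is on at the moment of arrest determines whether the more compellable “strongbox” credential is enough.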
So while these secret court orders to use fingerprints to unlock a smartphone are relatively rare at this point, we probably shouldn’t be surprised if there’s a flurry of similar new court orders now that it’s clear that “strongboxed” smartphones just might gain a “wall safe” in 48 hours or less:
“Even so, the government’s efforts in this instance were not successful, according to court documents. The authorities were unable to access the phone’s contents. The reason is most likely because, if a iPhone that has been fingerprint enabled has not been used for at least 48 hours, both the password and fingerprint are required to unlock it.”
Part of what makes this technology (which adds the “wall safe” password requirement only after a set period of time has passed) interesting is that it creates a situation where authorities could reasonably wonder, at the moment of arrest, just how much time is left before the 48 hours expire and the phone gets extra constitutional protection. Are there 48 hours left before the passcode requirement kicks in, or 48 seconds? Unless you just saw the suspect talking on the phone, that’s an open question. So with cases like this taking place while the “strongbox vs. wall safe” 5th Amendment issue is still heading towards the Supreme Court, it will be interesting to see if this “wall safe on a timer” technology ends up making it constitutionally easier for authorities to demand that suspects with fingerprint-protected iPhones immediately unlock their phones before the passcode requirement gets activated. That would be a bit ironic.