Imagine that you want to tell someone a secret. You put it in a message addressed to only that person. The message travels across a series of crowded public thoroughfares, where, at times, it will be out in the open, entirely visible. It can be intercepted, even duplicated, along the way, at various points, by different parties—unsavory people, government agents, or both.
That’s essentially what happens any time data is sent across the Internet, particularly over open, public networks. So how do you keep your secret? With a code: when data is encrypted, it appears, to anyone without the key to decrypt it, as an unreadable tangle of bits. Though encryption sounds like an activity practiced solely by the utterly paranoid, it’s now extremely common: Google, Microsoft, Facebook, Twitter, banks, and online stores regularly encrypt data, both on their servers and in communications with their users, as does any technology company with a concern for information security.
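That “unreadable tangle of bits” can be seen in miniature with a one-time pad, the simplest cipher of all: XOR the plaintext with a random key of the same length, and only someone holding that key can reverse the operation. A minimal Python sketch, purely illustrative:

```python
import secrets

# One-time-pad toy: XOR with a random key turns readable text into noise.
plaintext = b"my secret"
key = secrets.token_bytes(len(plaintext))  # random key, same length as the message

# XOR each plaintext byte with the corresponding key byte.
ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))

# XOR is its own inverse: applying the key again recovers the original.
decrypted = bytes(c ^ k for c, k in zip(ciphertext, key))
assert decrypted == plaintext
```

Without the key, the ciphertext is indistinguishable from random bytes; with it, decryption is a single pass.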
The recent revelations about the mass surveillance of digital communications by the United States have brought fresh attention to the problem of encryption. According to the Times, the National Security Agency “sifts through the contents of what is apparently most e-mails and other text-based communications that cross the border” to look for search terms about identified targets. The N.S.A. has since revealed that it “touches” roughly 1.6 per cent of all Internet traffic per day, which is more than Google. In a 2008 slide presentation about its Xkeyscore surveillance program, the agency described “using encryption” as an “anomalous event”—cause for further scrutiny—on par with “searching the web for suspicious stuff.” According to the Guardian, the agency’s policies also allow it to retain domestic communications if they are encrypted. In response, Lavabit, the secure e-mail provider that Edward Snowden uses, went dark; its founder said he would rather shut down than “become complicit in crimes against the American people.” Shortly afterward, Silent Circle, which provides a range of secure communications services, announced that it, too, would shut down its e-mail program.
But despite the N.S.A.’s outmoded suspicion of people who encrypt their data, strong encryption schemes are now pervasive, and are essentially automatic in certain contexts, like the transmission of a username and password. Even exceptionally common keys remain largely uncrackable, including by government agencies—though they appear to be gaining some ground. One theory about the N.S.A.’s gargantuan new data center in Bluffdale, Utah, is that it will aid the agency’s efforts to crack currently unbreakable systems.
A starting point for the spread of digital encryption is a 1976 paper called “New Directions in Cryptography,” by Whitfield Diffie and Martin Hellman. (“We stand today on the brink of a revolution in cryptography,” the writers state at the beginning.) The paper publicly introduced what has become known as the Diffie-Hellman key agreement, a method that allows two users to share a secret key over a public channel without any prior communication. Encoding a secret inherently involves at least two secrets, since the code used to protect the original data must itself remain hidden—otherwise, anybody could use the code to read it. It’s “one of the fundamental problems of cryptography,” said Matt Blaze, a well-known cryptography researcher and the director of the University of Pennsylvania’s Distributed Systems Lab.
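The key agreement Diffie and Hellman described rests on modular exponentiation: each party publishes g raised to a secret exponent, modulo a public prime, and both sides arrive at the same shared value without that value ever crossing the channel. A toy Python sketch, with a deliberately undersized prime (real deployments use vetted groups of 2048 bits or more):

```python
import secrets

# Toy Diffie-Hellman key agreement. Parameters are for illustration only.
p = 2**127 - 1   # a prime modulus, far too small for real-world security
g = 3            # public base

a = secrets.randbelow(p - 2) + 2   # Alice's secret exponent
b = secrets.randbelow(p - 2) + 2   # Bob's secret exponent

A = pow(g, a, p)   # Alice sends this over the public channel
B = pow(g, b, p)   # Bob sends this over the public channel

# Each side raises the other's public value to its own secret exponent:
# (g^b)^a = (g^a)^b mod p, so both compute the same shared secret.
k_alice = pow(B, a, p)
k_bob = pow(A, b, p)
assert k_alice == k_bob
```

An eavesdropper sees p, g, A, and B, but recovering the shared secret from those values is the discrete-logarithm problem, believed to be intractable at real key sizes.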
Before the development of computers, virtually all encoding systems relied on a symmetric key structure, in which both the sender and the receiver knew the exact same code—a point of vulnerability, because the code had to be previously, and securely, agreed upon. But in an asymmetric encryption scheme like the one described by Diffie and Hellman, for instance, a user has two keys: a public one that is shared openly to encrypt the message, and a private one that is used to decrypt it. As Peter Maass reported in the Times magazine, this is how Edward Snowden first contacted the documentarian Laura Poitras:
This past January, Laura Poitras received a curious e-mail from an anonymous stranger requesting her public encryption key… She replied to this one and sent her public key—allowing him or her to send an encrypted e-mail that only Poitras could open, with her private key.
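The public/private split that Maass describes can be sketched with textbook RSA, using the classic tiny-prime example (real keys are thousands of bits long and wrap the message in padding; this is a toy):

```python
# Textbook RSA with tiny primes, to show the public/private key split.
p, q = 61, 53
n = p * q                  # 3233, published as part of the public key
phi = (p - 1) * (q - 1)    # 3120, kept secret
e = 17                     # public exponent: (e, n) is the public key
d = pow(e, -1, phi)        # 2753, the private exponent (Python 3.8+)

message = 65                       # a number standing in for the plaintext
ciphertext = pow(message, e, n)    # anyone holding (e, n) can encrypt
recovered = pow(ciphertext, d, n)  # only the holder of d can decrypt
assert recovered == message
```

This is why Poitras could hand her public key to a stranger: it encrypts, but only her private key decrypts.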
When properly implemented, modern cryptographic techniques can make three implicit promises: that encrypted data unquestionably came from a particular sender; that only the intended recipient can read it; and that the data is unaltered. Encryption, ultimately, “raises the cost of surveillance” and makes “dragnet surveillance impossible,” said Chris Soghoian, the A.C.L.U.’s principal technologist. With widespread encryption, surveillance would require the government to individually select a target and compromise his or her computer, either physically or through software.
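The sender-and-integrity promises, in particular, can be illustrated with a message authentication code from Python’s standard library; the shared key here is a hypothetical stand-in for one negotiated by a real protocol:

```python
import hashlib
import hmac

# A MAC binds a message to a key: only a key-holder can produce a valid
# tag, and any alteration of the message changes the tag.
key = b"shared-secret"        # hypothetical pre-agreed key
message = b"meet at noon"

tag = hmac.new(key, message, hashlib.sha256).hexdigest()

# The recipient recomputes the tag over what arrived and compares.
valid = hmac.compare_digest(
    tag, hmac.new(key, message, hashlib.sha256).hexdigest())
tampered = hmac.compare_digest(
    tag, hmac.new(key, b"meet at one", hashlib.sha256).hexdigest())
assert valid and not tampered
```

The comparison uses `hmac.compare_digest` rather than `==` to avoid leaking information through timing, one of many implementation details on which “properly implemented” quietly depends.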
In his book “Crypto,” the technology writer Steven Levy details the government’s attempts to limit the spread of strong encryption during the so-called Crypto Wars of the nineteen-nineties; encrypted communications cannot be read by the Feds, after all, whether they are produced by upstanding American citizens or agents of an enemy state, terrorists, or pedophiles. Paul Rosenzweig, a former Deputy Assistant Secretary for Policy in the Department of Homeland Security and currently a visiting fellow at the Heritage Foundation, said that the efforts failed because, “unlike a piece of hardware you can control from export,” unbreakable encryption is an idea, and the government “can’t stop you from taking an idea out of the United States.” The result, he said, is that “I have now, on my computer, encryption that is, for all intents and purposes, uncrackable by the F.B.I.”
Strong encryption, however, remains difficult for the inexperienced to implement: Glenn Greenwald ignored Snowden’s initial e-mails directing him to use encrypted communications and providing him with instructions on how to do it. “It’s really annoying and complicated, the encryption software,” he told the Times.
E-mail poses a particular usability and technical challenge. Setting up e-mail with P.G.P. (“Pretty Good Privacy”), an encryption program, for instance, is more hassle than most users seem willing to deal with, as Greenwald demonstrated. E-mail “sounds like the simplest application of cryptography imaginable,” said Blaze. “I’m trying to send you a message that you can only read, and that you know definitely came from me, and we have algorithms and protocols that do exactly that. But we still haven’t figured out the basic technical ways to implement them in practice.” A common solution to one of encrypted e-mail’s core usability problems is to have the service provider hold the decryption keys. Silent Circle took this approach—as most e-mail providers, from Google to Microsoft, do—with its service Silent Mail, largely because P.G.P., developed in 1991 by Silent Circle’s founder, Phil Zimmermann, “doesn’t run on an iPhone,” Zimmermann explained.
The problem is that e-mails cannot be encrypted on a mobile device before they’re sent to Silent Circle’s servers, and the company could potentially be pressured to turn over the decryption keys to the government, thus granting it access to users’ e-mails. This happened, in 2007, to Hushmail, a provider of secure e-mail services that ultimately turned over CDs of users’ unencrypted e-mail. (Zimmermann no doubt remembers the incident well; he was on Hushmail’s advisory board.) The closure of Lavabit “reminded us of why we wanted to get rid of” Silent Mail, Zimmermann said.
In 1994, Congress passed the Communications Assistance for Law Enforcement Act, which compels telecommunications providers to reconfigure their technology so that the government can wiretap digital messages and survey them in real time. It is surprisingly friendly to encryption, however, as Soghoian pointed out: the law states that a “telecommunications carrier shall not be responsible for decrypting, or ensuring the government’s ability to decrypt, any communication encrypted by a subscriber or customer, unless the encryption was provided by the carrier and the carrier possesses the information necessary to decrypt the communication.”
Silent Circle’s other products, Silent Phone and Silent Text, which the company continues to offer, are built with precisely this exemption in mind—so the company can avoid ever being compelled to turn over real, decrypted user data to the government. The products utilize Z.R.T.P., a protocol developed by Zimmermann “specifically to not trust the service provider, because I didn’t feel that trusting the phone company was a good idea,” he said. None of the encryption keys—which are generated by a Diffie-Hellman exchange and exchanged securely between the users—are shared with a provider, so it cannot decrypt the content of the messages, even if the government attempted to force it to.
“We don’t have the content, we don’t have the keys to decrypt the content, and we don’t have records of who’s calling who,” said Zimmermann. “The best they could get would be the encrypted media.” If the government could somehow determine who the users were, it could pressure the recipient or sender to decrypt the messages, explained Amie Stepanovich, the director of the Electronic Privacy Information Center’s Domestic Surveillance Project, because, in the U.S., many states only require one-party consent. But legally forcing a user to decrypt their files poses a Fifth Amendment issue that has yet to be resolved; some argue that it would violate one’s right against self-incrimination. In a recent child-pornography case, a federal judge halted an order that would have required the defendant to decrypt his hard drives, though a final ruling has yet to be made on the matter; the F.B.I. just announced that it had cracked the encryption on two of the nine hard drives it possesses, but it did not elaborate on the encryption scheme used on the drives or its method of decryption.
Generally speaking, though, the growth of practically uncrackable encryption has, until recently, posed few challenges for law enforcement. In its 2008 Wiretap Reports to Congress, the Administrative Office of the U.S. Courts noted that encryption was encountered in only two state wiretaps, and that “neither instance prevented officials from obtaining the plain text of the communications.” But the growth of user-friendly, truly secure encryption services may be changing that. In its most recent report, the A.O. noted fifteen encounters with encryption in the course of a wiretap; it was unable to decipher the text in four instances, noting that it was the first time “that jurisdictions have reported that encryption prevented officials from obtaining the plain text of the communications since the A.O. began collecting encryption data in 2001.”
In response, in 2010, the F.B.I. began agitating for a new law that would require companies to decrypt any and all user data in response to wiretap requests. It, or a similarly constructed law, would effectively outlaw a security architecture like Silent Circle’s. Zimmermann, in a strained voice, said that if such regulations came to pass, “I wouldn’t allow it. I’ve worked my whole career on this and on these principles. I absolutely would not allow it.”
A modified form of the F.B.I.’s original proposal, known informally as CALEA II, has gained traction in recent months. It encourages technology companies to build back doors into their services to make them easier to wiretap in real time, though it would not force companies to hold keys for decrypting user content. It remains controversial. Blaze, who, along with twenty other computer scientists, signed a report criticizing the F.B.I.’s plan, said that “if they get what they’re asking for, it’ll have two bad outcomes: it won’t work very well, and all the services with a back door will be inherently more vulnerable to compromise.” The result, emphasized Blaze, is that the “crime-solving mandate is going to end up having a crime-facilitating effect.”
From a policy perspective, the best defense of encryption is China. The government needs to protect itself against Chinese hacking, and it recognizes that corporations, from utilities to financial companies, need to as well. Given those concerns, Soghoian believes that “it would be difficult for the government to engage in a face-on attack on cryptography itself.” The trade-off is that a system that’s “secure against China is secure against the F.B.I.” Zimmermann and others are counting on the government coming to terms with that bargain. “Part of our strategy from the beginning has been that we want to bring about conditions where we have a lot of government customers”—Silent Circle is used by Navy SEALs, Canadian Special Forces, and others—“and, thus, create collateral damage if the government tries to push us to put back doors in. They would be hurting themselves.”
Even ubiquitous, strong encryption would not leave the government without options when it truly needs to collect intelligence on a given target. “All cryptography ultimately depends on a secret that has to be somewhere, on some computer,” Blaze said. And sitting at that computer is a person, with his or her own vulnerabilities. There are still ways to get the secrets.
Original Article
Source: newyorker.com
Author: Matt Buchanan