Yesterday, a federal judge in Los Angeles ordered Apple to help the FBI unlock an iPhone belonging to one of the two terrorists who killed 14 people and seriously injured more than 20 others in San Bernardino, California, in December. But Apple CEO Tim Cook has said no, sparking a legal battle that may determine what rights the government has to the information inside your phone.
The FBI has complained for more than a year that it is crippled by its inability to break into encrypted messages and Internet traffic, leaving it unable to read a growing volume of what criminals and terrorists say online. FBI Director James Comey is probably the country's most outspoken advocate for giving the government a way to decrypt those messages through a mechanism commonly known as a "backdoor." But the consensus of the tech community and privacy advocates is that any backdoor would create a hole in encryption, making it vulnerable not just to government abuse but also to hackers and cyber-criminals.
The fight over the San Bernardino iPhone is slightly different, but closely related and with similar implications for consumer privacy and security. Comey mentioned at a Senate Intelligence Committee hearing last week that the FBI still could not unlock the iPhone of Syed Farook, one of the two San Bernardino shooters. That's because the Apple operating system is now built so that virtually all of the phone's data is encrypted by, in simple terms, the combination of the user's passcode and a "key" stored on the phone itself. Without the passcode, no one, including Apple, can get into that data. So rather than ask Apple to break the encryption itself, Judge Sheri Pym ordered the company to write new software that would allow the FBI to enter an unlimited number of passcodes without triggering the phone's automatic data wipe. If the agency had that software and could use another computer to enter passcodes quickly, it could likely unlock the device within a relatively short period. But Apple refused.
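The mechanics here can be sketched in a few lines of code. The following is a simplified illustration only, not Apple's actual implementation: the key-derivation function, the hard-coded device key, the example passcode, and the ten-attempt limit are all stand-ins chosen to show why the attempt cap, not the encryption math, is what stops a brute-force attack.

```python
import hashlib
import itertools

DEVICE_KEY = b"unique-hardware-key"  # stand-in for the secret fused into the phone's hardware
MAX_ATTEMPTS = 10                    # iOS can erase the data after 10 wrong passcodes

def derive_key(passcode: str) -> bytes:
    """Entangle the passcode with the device-bound key (simplified stand-in
    for Apple's hardware-backed key derivation)."""
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), DEVICE_KEY, 10_000)

true_key = derive_key("4821")  # the user's (unknown) four-digit passcode

def try_unlock(passcode: str, attempts_so_far: int) -> bool:
    """Return True if the guess is correct; simulate the auto-wipe once
    the attempt limit is exceeded."""
    if attempts_so_far >= MAX_ATTEMPTS:
        raise RuntimeError("auto-wipe triggered: data destroyed")
    return derive_key(passcode) == true_key

# With the cap in place, trying all 10,000 four-digit codes is impossible:
# the wipe fires after the tenth wrong guess. The software the court ordered
# Apple to write would, in effect, remove this cap.
for attempts, digits in enumerate(itertools.product("0123456789", repeat=4)):
    guess = "".join(digits)
    try:
        if try_unlock(guess, attempts):
            print("unlocked with", guess)
            break
    except RuntimeError as err:
        print(err)
        break
```

Without the attempt limit, the same loop would exhaust every four-digit passcode in seconds, which is why the order's narrow-sounding request carries such broad security implications.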
The court order says Apple would only have to build software that works specifically on Farook's phone, and points out that the San Bernardino county government, which employed Farook, issued the phone and consented to searches. But Cook said there was no verifiable way to limit things to just one phone. "The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor," wrote Cook in a letter to customers posted overnight. "While the government may argue that its use would be limited to this case, there is no way to guarantee such control." Search company DuckDuckGo sent a message supporting Cook on social media, as did WhatsApp founder Jan Koum.
The larger and perhaps more fundamental issue is a legal one. If the government can order Apple to write new code to open Farook's phone—Cook wrote that the software the government is asking for "does not exist today"—what stops it from ordering the makers of messaging apps to write new code to change their encryption schemes, or demanding similar software from any other company? Privacy advocates say such limits don't appear to exist.
"Just going off the language of yesterday's order, I don't know why the government couldn't seek such [an order]," says Steve Vladeck, a professor of national security law at American University's Washington College of Law. He acknowledges that the government can and does require tech companies to aid investigations, but usually by doing things that are already within their power. But the order to write entirely new software, Vladeck says, may cross an "important line" about what the government can demand from companies—and suggests that the answer is it can get anything it wants.
"Either there is some substantive limit on just how much a court can order a third party like Apple to do, or there isn't," he says. "I happen to think the answer is there's a substantive limit, but I didn't see that in yesterday's order."
Kevin Bankston, the director of the Open Technology Institute at the left-leaning New America Foundation, agrees. "If a court can legally compel Apple to do that, then it likely could also legally compel any other software provider to do the same," he wrote to Mother Jones. "If this precedent gets set it will spell digital disaster for the trustworthiness of everyone’s computers and mobile phones."
The FBI did not respond to a request for comment.
Then there is the concern that the court order could create a dangerous precedent. "The government could extend this breach of privacy and demand that Apple build surveillance software to intercept your messages, access your health records or financial data, track your location, or even access your phone’s microphone or camera without your knowledge," Cook wrote in his letter. Christopher Soghoian, the principal technologist at the American Civil Liberties Union, concurred in a tweet posted on Wednesday morning.
Privacy advocates and security hawks are both gearing up for a fight. In addition to Cook's blunt letter, numerous digital privacy groups condemned the order on Wednesday. One group, Fight for the Future, is organizing rallies outside of Apple stores next Tuesday to support the company's stand and protest against the court ruling.
GOP presidential frontrunner Donald Trump immediately attacked Apple for its stance. "Who do they think they are?" he asked during an interview on Fox News. Trump and other Republican presidential hopefuls have previously called for expanding government surveillance powers. Sens. Richard Burr (R-N.C.) and Dianne Feinstein (D-Calif.), who are the top two members of the Senate Intelligence Committee and outspoken backdoor proponents, both called on Apple to obey the judge. "Under a valid court order, Apple has been asked by the FBI to unlock a government owned cell phone to assist in the investigation of a terror attack that killed 14 Americans," Burr said on Wednesday. "Court orders are not optional and Apple should comply."
Original Article
Source: motherjones.com/
Author: Max J. Rosenthal