Thursday, October 2, 2014

About the iPhone Encryption

There have been lots of news stories about the iPhone 6 encryption, and lots of headlines claiming that it "locks out the NSA." It doesn't.

First, the bad journalism.

Signaling Post-Snowden Era, New iPhone Locks Out N.S.A.
Devoted customers of Apple products these days worry about whether the new iPhone 6 will bend in their jean pockets. The National Security Agency and the nation’s law enforcement agencies have a different concern: that the smartphone is the first of a post-Snowden generation of equipment that will disrupt their investigative abilities.
The phone encrypts emails, photos and contacts based on a complex mathematical algorithm that uses a code created by, and unique to, the phone’s user — and that Apple says it will not possess.
The result, the company is essentially saying, is that if Apple is sent a court order demanding that the contents of an iPhone 6 be provided to intelligence agencies or law enforcement, it will turn over gibberish, along with a note saying that to decode the phone’s emails, contacts and photos, investigators will have to break the code or get the code from the phone’s owner.
Of course, this is how the article ends instead of how it begins:
Mr. Zdziarski said that concerns about Apple’s new encryption to hinder law enforcement seemed overblown. He said there were still plenty of ways for the police to get customer data for investigations. In the example of a kidnapping victim, the police can still request information on call records and geolocation information from phone carriers like AT&T and Verizon Wireless.
“Eliminating the iPhone as one source I don’t think is going to wreck a lot of cases,” he said. “There is such a mountain of other evidence from call logs, email logs, iCloud, Gmail logs. They’re tapping the whole Internet.”

Now some explanation.

The iPhone will automatically encrypt data stored on the device.
On devices running iOS 8, your personal data such as photos, messages (including attachments), email, contacts, call history, iTunes content, notes, and reminders is placed under the protection of your passcode.
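The protection described above follows a standard pattern: the encryption key is derived from the user's passcode with a slow key-derivation function rather than being stored anywhere. A minimal sketch in Python, with illustrative parameters (the real iOS derivation also entangles the passcode with a per-device hardware key, so it cannot be reproduced off-device):

```python
import hashlib
import os

# Sketch of passcode-based key derivation (illustrative parameters).
# A slow KDF like PBKDF2 makes each passcode guess expensive; iOS
# additionally ties the derivation to a hardware key, forcing guesses
# to run on the device itself.
def derive_key(passcode: str, device_salt: bytes, iterations: int = 100_000) -> bytes:
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), device_salt, iterations)

salt = os.urandom(16)             # per-device random salt
key = derive_key("123456", salt)  # 32-byte key, e.g. for AES-256
```

The salt defeats precomputed tables and the iteration count sets the per-guess cost, which is why the length of the passcode matters far more than the choice of cipher.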
Apple has said that it cannot decrypt this data, even when asked by law enforcement or a court, but that is not the end of the story. Depending on the case, the courts can force you to unlock the device yourself. (See, for example, this court order.)
In many cases, the American judicial system doesn’t view an encrypted phone as an insurmountable privacy protection for those accused of a crime. Instead, it’s seen as an obstruction of the evidence-gathering process, and a stubborn defendant or witness can be held in contempt of court and jailed for failing to unlock a phone to provide that evidence.
In some cases, the Fifth Amendment’s protection against self-incrimination may block such demands, but the few cases where suspects have pleaded the Fifth to avoid decrypting a PC—the legal equivalent of a smartphone—have had messy, sometimes contradictory outcomes.
In some cases you can plead the Fifth:
The court ruled that forcing him to surrender his password and decryption keys would be the same as making him provide self-incriminating testimony, and let him off the hook.
But in other cases:
He refused, pleading the Fifth. A judge ruled against him, calling the contents of the computer a “foregone conclusion.” The police didn’t need Boucher’s “testimony” to get the files, in other words—they only needed him to stop obstructing access to them. 
In some situations other evidence can be considered
enough to nullify her Fifth amendment argument. As with Boucher, the judge ruled that she give police access to the files or be held in contempt.
NIST Encryption Standards

A really important, overlooked part seems to be NIST's weakened encryption standards. (Many thanks to Rayne)

Update (November 21, 2014): EFF joins calls for NIST reform.

A reality check on encryption standards based on NIST
Let’s reset all the hype:
There is no smartphone security available on the market we can trust absolutely to keep out the National Security Agency. No password or biometric security can assure the encryption contained in today’s smartphones as long as they are built on current National Institute of Standards and Technology (NIST) standards and/or the Trusted Computing Platform. The NSA has compromised these standards and TCP in several ways, weakening their effectiveness and ultimately allowing a backdoor through them for NSA use, bypassing any superficial security system.
There is nothing keeping the NSA from sharing whatever information they are gleaning from smartphones with other government agencies. Citizens may believe that information gleaned by the NSA ostensibly for counterterrorism may not be legally shared with other government agencies, but legality/illegality of such sharing does not mean it hasn’t and isn’t done. (Remember fusion centers, where government agencies were supposed to be able to share antiterrorism information? Perhaps these are merely window dressing on much broader sharing.)
There is no exception across the best known mobile operating systems to the vulnerability of smartphones to NSA’s domestic spying.
More on NIST from Rayne

On NSA’s Subversion of NIST’s Algorithm

Our security is only as good as the tools we use to protect it, and compromising a widely used cryptography algorithm makes many Internet communications insecure.
Improving the security of cryptographic standards is an issue where the equities overwhelmingly lie on one side of the equation. By increasing the funding for—and thus capabilities in—NIST’s Computer Security Division, Congress can help restore confidence in NIST’s cryptographic standards efforts. This is a win for all.

The NSA, NIST and the AMS

Among the many disturbing aspects of the behavior of the NSA revealed by the Snowden documents, the most controversial one directly relevant to mathematicians was the story of the NSA’s involvement in a flawed NIST cryptography standard.
this is a clearly identifiable case where mathematicians seem to have been involved in using their expertise to subvert the group tasked with producing high quality cryptography.
Matt Green on NIST
In this post I'm going to try to explain the curious story of Dual-EC. While I'll do my best to keep this discussion at a high and non-mathematical level, be forewarned that I'm probably going to fail at least at a couple of points. If you're not in the mood for all that, here's a short summary:
  • In 2005-2006 NIST and NSA released a pseudorandom number generator based on elliptic curve cryptography. They released this standard -- with very little explanation -- both in the US and abroad
  • This RNG has some serious issues with just being a good RNG. The presence of such obvious bugs was mysterious to cryptographers.
  • In 2007 a pair of Microsoft researchers pointed out that these vulnerabilities combined to produce a perfect storm, which -- together with some knowledge that only NIST/NSA might have -- opened a perfect backdoor into the random number generator itself.
  • This backdoor may allow the NSA to break nearly any cryptographic system that uses it. 
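The shape of that backdoor can be shown with a toy analogue. In the sketch below, modular exponentiation stands in for elliptic-curve point multiplication, all numbers are hypothetical, and the 16-bit output truncation of the real generator is omitted; what it preserves is the key idea that whoever chose the two public constants may know a secret relation between them:

```python
# Toy analogue of the Dual-EC backdoor. Real Dual-EC uses two elliptic
# curve points P and Q; the backdoor is a secret d relating them. Here
# exponentiation mod a prime stands in for point multiplication.
p = 2**61 - 1        # a prime modulus (illustrative size)
Q = 5                # public constant
d = 123456789        # the trapdoor: whoever chose the constants knows it
P = pow(Q, d, p)     # the second public constant

def step(state):
    """One generator step: advance the state with P, emit output with Q."""
    new_state = pow(P, state, p)
    output = pow(Q, new_state, p)
    return new_state, output

state = 987654321            # the victim's secret seed
state, out1 = step(state)
state, out2 = step(state)

# The attacker sees only out1, but knows d:
recovered = pow(out1, d, p)              # (Q^s)^d = (Q^d)^s = P^s = next state
assert recovered == state                # internal state fully recovered
assert pow(Q, recovered, p) == out2      # ...so every later output is predictable
```

With honestly generated constants, no such d is known to anyone; the unexplained, fixed constants in the published standard are exactly what made cryptographers suspicious.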

While encrypting data stored on the device is great, this is different from encrypting data in transit, like phone calls and internet activity. Thankfully a new app called Signal from security researcher Moxie Marlinspike is now available for iPhone (its predecessor has been on Android for four years already).
If you’re making a phone call with your iPhone, you used to have two options: Accept the notion that any wiretapper, hacker or spook can listen in on your conversations, or pay for pricey voice encryption software.
Like any new and relatively untested crypto app, users shouldn’t entirely trust Signal’s security until other researchers have had a chance to examine it. Marlinspike admits “there are always unknowns,” such as vulnerabilities in the software of the iPhone that could allow snooping. But in terms of preventing an eavesdropper on the phone’s network from intercepting calls, Signal’s security protections are “probably pretty great,” he says.
After all, the technology behind Signal isn’t exactly new. Marlinspike first took on the problem of smartphone voice encryption four years ago with Redphone, an Android app designed to foil all wiretaps.
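The core idea behind this kind of call encryption is that the two phones agree on a key end-to-end, so the carrier relays only ciphertext. A toy Diffie-Hellman sketch (tiny illustrative parameters; real protocols such as ZRTP use far larger standardized groups and add checks against man-in-the-middle attacks):

```python
import hashlib
import secrets

# Toy Diffie-Hellman exchange (illustrative parameters only; real
# implementations use >=2048-bit standardized groups or elliptic curves).
p = 2**127 - 1   # a prime modulus
g = 3            # public generator

a = secrets.randbelow(p - 3) + 2   # Alice's ephemeral secret
b = secrets.randbelow(p - 3) + 2   # Bob's ephemeral secret

A = pow(g, a, p)   # sent over the network; visible to any wiretap
B = pow(g, b, p)

# Each side combines its own secret with the other's public value.
# A wiretapper who sees only A and B cannot feasibly compute this.
key_alice = hashlib.sha256(str(pow(B, a, p)).encode()).digest()
key_bob = hashlib.sha256(str(pow(A, b, p)).encode()).digest()
assert key_alice == key_bob   # the shared call key, never transmitted
```

Because the key is derived on the handsets and never crosses the wire, tapping the carrier's network yields nothing readable.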
Another interesting question is what effect, if any, June's Supreme Court decision in Riley v. California will have on iPhone searches, but for now I assume the answer is the same: Fifth Amendment protection depends on the situation. I will try to get some more answers. For now, see below.

NYT: Major Ruling Shields Privacy of Cellphones; Supreme Court Says Phones Can’t Be Searched Without a Warrant
“Cellphones have become important tools in facilitating coordination and communication among members of criminal enterprises, and can provide valuable incriminating information about dangerous criminals,” he wrote. “Privacy comes at a cost.”
But other technologies, he said, can make it easier for the police to obtain warrants. Using email and iPads, the chief justice wrote, officers can sometimes have a warrant in hand in 15 minutes.
What must the police do when they want to search a cellphone in connection with an arrest?
“Get a warrant,” Chief Justice Roberts wrote. 
Marcy Wheeler explains that
In real life, it’s likely that cops will integrate cellphone search warrants into their arrest warrant process. And Roberts’ opinion allows police to invoke exigent circumstances to search a phone. But at a minimum, this ruling will prohibit suspicion-less searches of cellphones.
A different part of Sotomayor’s concurrence, arguing that the existing precedent holding that you don’t have a privacy interest in data you’ve given to a third party “is ill suited to the digital age,” has been invoked repeatedly in privacy debates since she wrote it. That’s especially true since the beginning of Edward Snowden’s leaks. Lawsuits against the phone dragnet often cite that passage, arguing that the phone dragnet is precisely the kind of intrusion that far exceeds the intent of old precedent. And the courts have – with the exception of one decision finding the phone dragnet unconstitutional – ruled that until a majority on the Supreme Court endorses this notion, the old precedents hold.
Roberts cited from a different part of Sotomayor’s opinion, discussing how much GPS data on our movements reveals about our personal lives. That appears amid a discussion in which he cites things that make cellphones different: the multiple functions they serve, the different kinds of data we store in the same place, our Web search terms, location and apps that might betray political affiliation, health data or religion. That is, in an opinion joined by all his colleagues, the chief justice repeats Sotomayor’s argument that the sheer volume of this information makes it different.
That by no means says that those challenging the government’s national security surveillance will prevail by pointing to this opinion. Roberts includes an incredibly pregnant footnote, clarifying that “these cases do not implicate the question whether the collection or inspection of aggregated digital information amounts to a search under other circumstances.” Without naming the third-party doctrine explicitly, with his invocation of “search” Roberts makes it clear that’s what he’s discussing.
Here Marcy says Roberts kept it vague on purpose.

So for now, these cases about data on a smartphone or GPS collection are not being used to end NSA collection, which is still being reauthorized every 90 days.

The bill currently getting all the attention to reform the NSA is Senator Leahy's USA Freedom Act, but as Marcy has well documented, the proposed reforms are actually making some problems worse.
The ACLU and EFF normally do great work defending the Fourth Amendment. Both have fought the government’s expansive spying for years. Both have fought hard to require the government obtain a warrant before accessing your computer, cell phone, and location data.
by outsourcing to telecoms, NSA will actually increase the total percentage of Americans’ telephone records that get chained on; sources say it will be more “comprehensive” than the current dragnet and Deputy NSA Director Richard Ledgett agrees the “the actual universe of potential calls that could be queried against is [potentially] dramatically larger.” In addition, the telecoms are unlikely to be able to remove all the noisy numbers like pizza joints — as NSA currently claims to – meaning more people with completely accidental phone ties to suspects will get sucked in. And USA Freedom adopts a standard for data retention — foreign intelligence purpose — that has proven meaningless in the past
But earlier this week, they may have taken action that directly undermines that good work.

So data on the iPhone is now automatically encrypted, but that won't stop police from obtaining warrants or courts from forcing you to unlock the data yourself. Phone calls can now be encrypted for free using the Signal app. All of these are great improvements, but this is still not the end of the story.

Many security experts have focused on iCloud storage.  Micah Lee writes at The Intercept that
despite these nods to privacy-conscious consumers, Apple still strongly encourages all its users to sign up for and use iCloud, the internet syncing and storage service where Apple has the capability to unlock key data like backups, documents, contacts, and calendar information in response to a government demand. iCloud is also used to sync photos, as a slew of celebrities learned in recent weeks when hackers reaped nude photos from the Apple service. (Celebrity iCloud accounts were compromised when hackers answered security questions correctly or tricked victims into giving up their credentials via “phishing” links, Cook has said.)
The most prominent privacy improvement Apple made to its products last week is a new encryption feature built-in to iOS 8.
Since the iPhone 3GS, all iOS devices have supported encrypting personal data such as text messages, photos, emails, contacts, and call history. If you set a passcode it would be used to encrypt some, but not all, of the data on your device. Apple was still able to decrypt some of the data without knowing your passcode.
If law enforcement confiscated your phone and wanted to snoop at its data, all they would have to do is serve Apple a warrant to get a copy of the plaintext data.
The improved encryption in iOS 8 is a great move towards protecting consumer privacy and security. But users should be aware that in most cases it doesn’t protect your iOS device from government snoops.
While Apple does not have the crypto keys that can unlock the data on iOS 8 devices, they do have access to your iCloud backup data.
This is not the first time:
This isn’t the first time that Apple has oversold the security of its products. Shortly after the PRISM revelations were published in The Washington Post and The Guardian, Apple denied that it was part of the program and issued a statement claiming that “conversations which take place over iMessage and FaceTime are protected by end-to-end encryption so no one but the sender and receiver can see or read them. Apple cannot decrypt that data.” But security researchers showed that Apple could indeed eavesdrop on iMessage conversations without the user knowing.
Ars Technica notes that
Apple executives never mentioned the words "iCloud security" during the unveiling of the iPhone 6
In the name of security, we did a little testing using family members as guinea pigs. To demonstrate just how much private information on an iPhone can be currently pulled from iCloud and other sources, we enlisted the help of a pair of software tools from Elcomsoft. These tools are essentially professional-level, forensic software used by law enforcement and other organizations to collect data. But to show that an attacker wouldn’t necessarily need that to gain access to phone data, we also used a pair of simpler “hacks,” attacking a family member’s account (again, with permission) by using only an iPhone and iTunes running on a Windows machine.
As things stand right now, a determined attacker will still find plenty of ways to get to iPhone data. They need to gain physical access to the device, or harvest or crack credentials to do so. But there are ways to do this that won't alert the victim. The weakest links are components of the iCloud service.
Cracking passwords is hard, but not impossible:
We also went after a password-encrypted version of the backup on a local drive using EPPB’s dictionary and brute-force password attacks, cracking the seven-letter password after about two days
Since the iCloud backup is only protected by the iCloud password right now, once someone has obtained that password, everything in that backup is wide open.
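The two-day crack of a seven-letter password quoted above is consistent with simple keyspace arithmetic. The guessing rate below is an assumed round number, not EPPB's actual benchmark:

```python
# Back-of-envelope password-cracking math (assumed rates, for scale).
lowercase_keyspace = 26 ** 7       # seven lowercase letters
guesses_per_second = 50_000        # hypothetical offline guessing rate

days = lowercase_keyspace / guesses_per_second / 86_400
print(f"{lowercase_keyspace:,} candidates, about {days:.1f} days")

# A longer mixed-character password pushes the same attack past any
# realistic timescale:
mixed_keyspace = 62 ** 10          # ten chars, upper + lower + digits
years = mixed_keyspace / guesses_per_second / 86_400 / 365
print(f"{mixed_keyspace:,} candidates, about {years:,.0f} years")
```

The lesson matches the Ars result: a short alphabetic password falls in days, so the iCloud password guarding the whole backup needs to be long and varied.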
And there’s a lot in that backup.
There are a number of things some people might be surprised to find in the iCloud backups. Among the data found were:
  • SQLite databases containing phone call history, SMS and iMessage messages, and voicemail message data (with the number they were from and timestamps for when they were trashed) dating back to the phone's original purchase. So much for deleting call history.
  • A file called “recents” that contained e-mail, Messenger, and SMS addresses with message header data and other information.
  • An “accounts” database with all the e-mail, Twitter, and Apple-associated identity accounts we've ever held. Some details synced over from accounts closed before the target phone was purchased.
  • A file with all “known” Wi-Fi hotspots, with the SSIDs and MAC addresses of every hotspot the phone ever connected to.
  • Images, many believed to be long deleted, in three separate photo folders on each backup. All of the images carried the default EXIF data that Apple’s camera app attaches to them: dates taken, GPS latitude, longitude, and altitude. These images, in our oldest iCloud backup, were part of a much older incremental backup that had not been cleared from the cloud, and were found in a duplicate image folder within the DCIM folder of the backup image.
  • A file containing Apple Maps addresses searched for.
  • Mailbox files for the e-mail accounts used with Apple’s Mail app.
  • An address book database with over 1,000 e-mail addresses, phone numbers, Facebook profile links, and other contact data.
That is just what we found sifting around for a few hours aimlessly. It’s clear that anyone targeted by an iCloud account hack hasn’t just had pictures exposed; their entire digital lives have been laid out on display.
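Those "long deleted" records survive because many apps mark rows as removed rather than erasing them. A hypothetical illustration with an invented schema (the real iOS call-history file is an SQLite database, but its actual table layout differs):

```python
import sqlite3

# Invented schema, for illustration only: a call-history table where
# "deleting" a call merely sets a flag, leaving the row recoverable.
con = sqlite3.connect(":memory:")
con.execute(
    "CREATE TABLE calls (address TEXT, start INTEGER, duration INTEGER, deleted INTEGER)"
)
con.executemany("INSERT INTO calls VALUES (?, ?, ?, ?)", [
    ("+12025550100", 1396000000, 120, 0),
    ("+12125550199", 1398000000, 45, 1),   # "deleted" in the UI
])

# Anyone with the backup file can still read the flagged row:
ghosts = con.execute(
    "SELECT address, duration FROM calls WHERE deleted = 1"
).fetchall()
assert ghosts == [("+12125550199", 45)]
```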
There is also real-time tracking via "Find My iPhone":
Even creepier, the iCloud access also gives the attacker the ability to stalk the victim in real-time by using the Find My iPhone feature. If the phone is turned on and Find My iPhone was configured, the attacker can use the feature just as the owner would (of course, odds are that it’s on the owner’s person). We were able to identify the location of family members in this way as soon as the target phone was turned on. None of this is particularly high-tech. And it’s well within the threshold of pain for a mildly technically literate, very obsessed attacker.
Apple could go a long way toward protecting customer privacy just by adding a second credential to encrypt stored iCloud data. An encryption password could be used to decrypt the backup when downloaded to iTunes or to the device, or it could be used to decrypt the data as it is read by iCloud to stream down to the device. That would at least give backups the same level of protection that they get when stored locally with encryption (already an option in iTunes).
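The suggestion above amounts to client-side encryption under a second credential. A rough sketch of the design (not production cryptography: the hand-rolled keystream below stands in for a real cipher such as AES-GCM):

```python
import hashlib
import os

# Sketch: encrypt a backup under a key derived from a second user
# passphrase before upload, so the stored ciphertext is useless to
# anyone without that passphrase. Illustrative only.
def derive_backup_key(passphrase: str, salt: bytes) -> bytes:
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, 200_000)

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """SHA-256 counter-mode keystream; a stand-in for AES-CTR/GCM."""
    out = bytearray()
    for offset in range(0, len(data), 32):
        block = hashlib.sha256(key + offset.to_bytes(8, "big")).digest()
        out.extend(b ^ k for b, k in zip(data[offset:offset + 32], block))
    return bytes(out)

salt = os.urandom(16)
key = derive_backup_key("correct horse battery staple", salt)
backup = b"contacts, photos, iMessage history ..."
ciphertext = keystream_xor(key, backup)            # what the cloud would store
assert keystream_xor(key, ciphertext) == backup    # only the passphrase holder decrypts
```

Under this design a warrant served on the storage provider yields only ciphertext, exactly the property the local iTunes backup-encryption option already provides.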
This still won't stop the NSA or police:
These measures will not mean that the police, the FBI, or the NSA couldn’t get to your iPhone data if they had a need to. The fixes won't stop a determined attacker from finding other ways to compromise a user’s devices to gain access to information. But these tweaks raise the level of effort required enough to deter casual attacks, and they will hopefully raise people’s awareness to attacks in progress early enough to react.
Mashable has a good selection of security researchers showing how police can still get your data, and there is also a list of reasons not to trust Apple.

For the NSA, 400,000 apps mean 400,000 possibilities for attack. Apps can be used to spy on what users do elsewhere on the phone. The NSA can replay phone calls and, as I noted earlier, the order to collect phone metadata is still being reauthorized every 90 days.
all call detail records or "telephony metadata" created by Verizon for communications (i) between the United States and abroad; or (ii) wholly within the United States, including local telephone calls. This Order does not require Verizon to produce telephony metadata for communications wholly originating and terminating in foreign countries. Telephony metadata includes comprehensive communications routing information, including but not limited to session identifying information (e.g., originating and terminating telephone number, International Mobile Subscriber Identity (IMSI) number, International Mobile station Equipment Identity (IMEI) number, etc.), trunk identifier, telephone calling card numbers, and time and duration of call.
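Collected per call, the fields the order enumerates fit in a small record like the following (field names and values are illustrative, not Verizon's actual schema):

```python
from dataclasses import dataclass
from typing import Optional

# One call detail record as enumerated in the order: routing and
# identity fields, but no call content. Names and values are invented.
@dataclass
class CallDetailRecord:
    originating_number: str
    terminating_number: str
    imsi: str                            # subscriber identity
    imei: str                            # handset identity
    trunk_identifier: str
    calling_card_number: Optional[str]
    start_time: str                      # e.g. ISO-8601 timestamp
    duration_seconds: int

record = CallDetailRecord(
    originating_number="+12025550100",
    terminating_number="+12125550199",
    imsi="310260000000001",
    imei="356938035643809",
    trunk_identifier="TRK-0042",
    calling_card_number=None,
    start_time="2013-04-25T14:32:00Z",
    duration_seconds=183,
)
```

Even without content, a stream of such records maps who talks to whom, when, and for how long, which is why "just metadata" is so revealing.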
