By Patrick Toomey
Direct link to keychaindumper (for those that want to skip the article and get straight to the code)
So, a few weeks ago a wave of articles hit the usual sites about research that came out of the Fraunhofer Institute (yes, the MP3 folks) regarding some issues found in Apple’s Keychain service. The vast majority of the articles, while factually accurate, didn’t quite present the full details of what the researchers found. What the researchers actually found was more nuanced than what was reported. But, before we get to what they actually found, let’s bring everyone up to speed on Apple’s keychain service.
Apple’s keychain service is a library/API provided by Apple that developers can use to store sensitive information on an iOS device “securely” (a similar service is provided in Mac OS X). The idea is that instead of storing sensitive information in plaintext configuration files, developers can leverage the keychain service to have the operating system store sensitive information securely on their behalf. We’ll get into what is meant by “securely” in a minute, but at a high level the keychain encrypts (using a unique per-device key that cannot be exported off of the device) data stored in the keychain database and attempts to restrict which applications can access the stored data.

Each application on an iOS device has a unique “application-identifier” that is cryptographically signed into the application before it is submitted to the Apple App Store. The keychain service restricts which data an application can access based on this identifier. By default, applications can only access data associated with their own application-identifier. Apple realized this was a bit restrictive, so they also created another mechanism that can be used to share data between applications: “keychain-access-groups”. As an example, a developer could release two distinct applications (each with its own application-identifier) and assign each of them a shared access group. When writing/reading data to the keychain a developer can specify which access group to use. By default, when no access group is specified, the application will use its unique application-identifier as the access group (thus limiting access to the application itself). Ok, so that should be all we need to know about the keychain. If you want to dig a little deeper, Apple has a good doc here.
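To make that access model concrete, here is a purely illustrative Python sketch (this is not Apple’s implementation, and the group names are made up) of how an item’s access group could be checked against an app’s entitlement groups, including the wildcard form (e.g. “TEAMID.*”) that shows up in real entitlements:

```python
# Illustrative sketch of keychain access-group matching -- NOT Apple's code.
# An app may read an item if one of its entitlement groups matches the
# item's access group; a trailing ".*" acts as a prefix wildcard.

def group_matches(entitlement: str, item_group: str) -> bool:
    if entitlement.endswith(".*"):
        # "TEAMID.*" matches any group beginning with "TEAMID.",
        # as well as the literal group "TEAMID.*" itself.
        return item_group.startswith(entitlement[:-1]) or item_group == entitlement
    return entitlement == item_group

def can_access(app_groups: list[str], item_group: str) -> bool:
    return any(group_matches(g, item_group) for g in app_groups)

# By default an app's only access group is its own application-identifier.
print(can_access(["ABC123.com.example.app"], "ABC123.com.example.app"))    # True
print(can_access(["ABC123.com.example.app"], "ABC123.com.example.other"))  # False
# A wildcard entitlement matches any group sharing the prefix.
print(can_access(["R96HGCUQ8V.*"], "R96HGCUQ8V.com.vendor.app"))           # True
```

The wildcard behavior is why a single entitlement like the one shown later in the dump output can cover many items at once.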
Ok, so we know the keychain is basically a protected storage facility that the iOS kernel delegates read/write privileges to based on the cryptographic signature of each application. These cryptographic signatures are known as “entitlements” in iOS parlance. Essentially, an application must have the correct entitlement to access a given item in the keychain. So, the most obvious way to go about attacking the keychain is to figure out a way to sign fake entitlements into an application (ok, patching the kernel would be another way to go, but that is a topic for another day). As an example, if we can sign our application with the “apple” access group then we would be able to access any keychain item stored using this access group. Hmmm…well, it just so happens that we can do exactly that with the “ldid” tool that is available in the Cydia repositories once you jailbreak your iOS device. When a user jailbreaks their phone, the portion of the kernel responsible for validating cryptographic signatures is patched so that any signature will validate. So, ldid basically allows you to sign an application using a bogus signature. But, because it is technically signed, a jailbroken device will honor the signature as if it were from Apple itself.
Based on the above description, so long as we can determine all of the access groups that were used to store items in a user’s keychain, we should be able to dump all of them, sign our own application to be a member of all of them using ldid, and then be allowed access to every single keychain item in a user’s keychain. So, how do we go about getting a list of all the access group entitlements we will need? Well, the keychain is nothing more than a SQLite database stored in:
And, it turns out, the access group is stored with each item that is stored in the keychain database. We can get a complete list of these groups with the following query:
SELECT DISTINCT agrp FROM genp
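You can sketch this query against a throwaway database without a jailbroken device in reach; the table and column names (“genp”, “agrp”) are the ones used above, while the rows themselves are invented for illustration:

```python
import sqlite3

# Build an in-memory database mimicking the keychain's generic-password
# table layout described above (table "genp", access-group column "agrp").
# The row data here is mock data, not a real keychain dump.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE genp (acct TEXT, agrp TEXT, data BLOB)")
conn.executemany(
    "INSERT INTO genp VALUES (?, ?, ?)",
    [
        ("wifi-password",  "apple",        b"..."),
        ("exchange",       "apple",        b"..."),
        ("some-app-token", "R96HGCUQ8V.*", b"..."),
    ],
)

# The same query from the article: one row per distinct access group.
groups = sorted(row[0] for row in conn.execute("SELECT DISTINCT agrp FROM genp"))
print(groups)  # ['R96HGCUQ8V.*', 'apple']
```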
Once we have a list of all the access groups we just need to create an XML file that contains all of these groups and then sign our own application with ldid. So, I created a tool that does exactly that called keychain_dumper. You can first get a properly formatted XML document with all the entitlements you will need by doing the following:
./keychain_dumper -e > /var/tmp/entitlements.xml
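The entitlements file is a standard Apple property list carrying a “keychain-access-groups” array. As a rough sketch of that shape (the group values below are made up, and keychain_dumper’s exact output may differ), the same structure can be produced with Python’s plistlib:

```python
import plistlib

# Sketch of an entitlements plist with a "keychain-access-groups" array,
# the key ldid signs into the binary. Group names here are hypothetical;
# in practice they come from the SELECT DISTINCT query shown earlier.
groups = ["apple", "R96HGCUQ8V.*"]
entitlements = {"keychain-access-groups": groups}

xml = plistlib.dumps(entitlements).decode()
print(xml)
```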
You can then sign all of these entitlements into keychain_dumper itself (please note the lack of a space between the flag and the path argument):
ldid -S/var/tmp/entitlements.xml keychain_dumper
After that, you can dump all of the entries within the keychain:
If all of the above worked you will see numerous entries that look similar to the following:
Entitlement Group: R96HGCUQ8V.*
Keychain Data: SenSiTive_PassWorD_Here
Ok, so what does any of this have to do with what was being reported on a few weeks ago? We basically just showed that you can in fact dump all of the keychain items using a jailbroken iOS device. Here is where the discussion is more nuanced than what was reported. The steps we took above will only dump the entire keychain on devices that have no PIN set or are currently unlocked. If you set a PIN on your device, lock the device, and rerun the above steps, you will find that some keychain data items are returned, while others are not. You will find a number of entries now look like this:
Entitlement Group: R96HGCUQ8V.*
Keychain Data: <Not Accessible>
This fundamental point was either glossed over or simply ignored in every single article I happened to come across (I’m sure at least one person will find the article that does mention this point :-)). This is an important point, as it completely reframes the discussion. As it was reported, the point seemed to be to show how insecure iOS is. In reality the point should have been to show how security is all about trading off various factors (security, convenience, etc). This point was not lost on Apple, and the keychain allows developers to choose the appropriate level of security for their application. Stealing a small section from the keychain document from Apple, they allow six levels of access for a given keychain item:
kSecAttrAccessibleWhenUnlocked
kSecAttrAccessibleAfterFirstUnlock
kSecAttrAccessibleAlways
kSecAttrAccessibleWhenUnlockedThisDeviceOnly
kSecAttrAccessibleAfterFirstUnlockThisDeviceOnly
kSecAttrAccessibleAlwaysThisDeviceOnly
The names are pretty self-descriptive, but the main thing to focus on is the “WhenUnlocked” accessibility constants. If a developer chooses the “WhenUnlocked” constant then the keychain item is encrypted using a cryptographic key that is created using the user’s PIN as well as the per-device key mentioned above. In other words, if a device is locked, the cryptographic key material does not exist on the phone to decrypt the related keychain item. Thus, when the device is locked, keychain_dumper, despite having the correct entitlements, does not have the ability to access keychain items stored using the “WhenUnlocked” constant. We won’t talk about the “ThisDeviceOnly” constant, but it is basically the most strict security constant available for a keychain item, as it prevents the items from being backed up through iTunes (see the Apple docs for more detail).
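Apple doesn’t spell out the exact derivation here, but the core idea (no PIN available means no key material) can be illustrated with a sketch; PBKDF2 below is my stand-in, not Apple’s actual scheme, and the per-device secret is invented:

```python
import hashlib

# Illustration only -- this is NOT Apple's key-derivation scheme.
# The point: the key protecting a "WhenUnlocked" item is derived from
# BOTH the user's PIN and a per-device secret, so on a locked device
# (PIN unavailable) the key simply does not exist in memory or on disk.
DEVICE_KEY = b"hypothetical-per-device-secret"  # never leaves the device

def item_key(pin: str) -> bytes:
    # Mix the PIN with the device secret via a key-derivation function.
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), DEVICE_KEY, 10_000)

correct = item_key("1234")
# A wrong (or absent) PIN yields different key material, so the
# ciphertext of a "WhenUnlocked" item stays undecryptable.
print(item_key("0000") == correct)  # False
```

This is why keychain_dumper, even with every entitlement signed in, prints &lt;Not Accessible&gt; for these items on a locked device: entitlements control authorization, but the decryption key itself is missing.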
If a developer does not specify an accessibility constant, a keychain item will use “kSecAttrAccessibleWhenUnlocked”, which makes the item available only when the device is unlocked. In other words, applications that store items in the keychain using the default security settings would not have been leaked using the approach used by Fraunhofer and/or keychain_dumper (I assume we are both just using the Keychain API as it is documented). That said, quite a few items appear to be set with “kSecAttrAccessibleAlways”. Such items include wireless access point passwords, MS Exchange passwords, etc. So, what was Apple thinking; why does Apple let developers choose among all these options? Well, let’s use some pretty typical use cases to think about it. A user boots their phone and they expect their device to connect to their wireless access point without intervention. I guess that requires that iOS be able to retrieve their access point’s password regardless of whether the device is locked or not. How about MS Exchange? Let’s say I lost my iPhone on the subway this morning. Once I get to work I let the system administrator know and they proceed to initiate a remote wipe of my Exchange data. Oh, right, my device would have to be able to log in to the Exchange server, even when locked, for that to work. So, Apple is left in the position of having to balance the privacy of users’ data with a number of use cases where less privacy is potentially worthwhile. We can probably go through each keychain item and debate whether Apple chose the right accessibility constant for each service, but I think the main point still stands.
Wow…that turned out to be way longer than I thought it would be. Anyway, if you want to grab the code for keychain_dumper to reproduce the above steps yourself you can grab the code on github. I’ve included the source as well as a binary just in case you don’t want/have the developer tools on your machine. Hopefully this tool will be useful for security professionals who are trying to evaluate whether an application has chosen the appropriate accessibility parameters during blackbox assessments. Oh, and if you want to read the original paper by Fraunhofer you can find that here.