“Researchers steal iPhone passwords in 6 minutes”…true…but not the whole story

By Patrick Toomey

Direct link to keychain_dumper (for those who want to skip the article and get straight to the code)

So, a few weeks ago a wave of articles hit the usual sites about research out of the Fraunhofer Institute (yes, the MP3 folks) regarding some issues found in Apple’s keychain service.  The vast majority of the articles, while factually accurate, didn’t quite present the full details of what the researchers found.  What they actually found was more nuanced than what was reported.  But, before we get to that, let’s bring everyone up to speed on Apple’s keychain service.

Apple’s keychain service is a library/API that developers can use to store sensitive information on an iOS device “securely” (a similar service is provided in Mac OS X).  The idea is that instead of storing sensitive information in plaintext configuration files, developers can have the operating system store it securely on their behalf.  We’ll get into what is meant by “securely” in a minute, but at a high level the keychain encrypts the data stored in its database (using a unique per-device key that cannot be exported off the device) and attempts to restrict which applications can access the stored data.

Each application on an iOS device has a unique “application-identifier” that is cryptographically signed into the application before it is submitted to the Apple App Store.  The keychain service restricts which data an application can access based on this identifier; by default, applications can only access data associated with their own application-identifier.  Apple realized this was a bit restrictive, so they also created a mechanism to share data between applications using “keychain-access-groups”.  As an example, a developer could release two distinct applications (each with its own application-identifier) and assign both of them a shared access group.  When writing/reading data to the keychain a developer can specify which access group to use.  By default, when no access group is specified, the application uses its unique application-identifier as the access group (thus limiting access to the application itself).  That should be all we need to know about the keychain; if you want to dig a little deeper, Apple has a good doc here.
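
To make the access-group model concrete, here is a toy sketch in Python. The class and every name in it are invented for illustration; this is a model of the behavior described above, not Apple’s implementation:

```python
# Toy model of the keychain's access-group check: each stored item is
# tagged with an access group, and an app may read an item only if one
# of the groups it is entitled to matches.

class Keychain:
    def __init__(self):
        self._items = {}  # (service, account) -> (access_group, secret)

    def add(self, app_groups, service, account, secret, access_group=None):
        # Default: no explicit group means the app's own identifier
        # (its first group) is used, limiting access to the app itself.
        group = access_group or app_groups[0]
        self._items[(service, account)] = (group, secret)

    def get(self, app_groups, service, account):
        group, secret = self._items[(service, account)]
        if group in app_groups:
            return secret
        raise PermissionError("app lacks entitlement for group " + group)

kc = Keychain()
kc.add(["com.example.appA"], "Dropbox", "remote", "s3cret")
kc.add(["com.example.appB", "shared.group"], "IMAP", "me", "hunter2",
       access_group="shared.group")

# The owning app can read its own item.
assert kc.get(["com.example.appA"], "Dropbox", "remote") == "s3cret"
# A second app entitled to "shared.group" can read the shared item...
assert kc.get(["com.example.appC", "shared.group"], "IMAP", "me") == "hunter2"
# ...but not appA's private item.
try:
    kc.get(["com.example.appC"], "Dropbox", "remote")
except PermissionError:
    pass
```

The attack described next amounts to convincing the device that our app belongs to every group at once.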

Ok, so we know the keychain is basically a protected storage facility to which the iOS kernel delegates read/write privileges based on the cryptographic signature of each application.  These cryptographic signatures are known as “entitlements” in iOS parlance.  Essentially, an application must have the correct entitlement to access a given item in the keychain.  So, the most obvious way to attack the keychain is to figure out a way to sign fake entitlements into an application (patching the kernel would be another way to go, but that is a topic for another day).  As an example, if we can sign our application with the “apple” access group, then we would be able to access any keychain item stored using that access group.  Hmmm…well, it just so happens that we can do exactly that with the “ldid” tool that is available in the Cydia repositories once you jailbreak your iOS device.  When a user jailbreaks their phone, the portion of the kernel responsible for validating cryptographic signatures is patched so that any signature will validate, so ldid basically allows you to sign an application using a bogus signature.  But, because it is technically signed, a jailbroken device will honor the signature as if it were from Apple itself.

Based on the above description, so long as we can determine all of the access groups that were used to store items in a user’s keychain, we should be able to sign our own application as a member of all of them using ldid and then be allowed access to every single keychain item in the user’s keychain.  So, how do we go about getting a list of all the access group entitlements we will need?  Well, the keychain is nothing more than a SQLite database stored in:

/var/Keychains/keychain-2.db

And, it turns out, the access group is stored with each item in the keychain database (agrp is the access group column in the generic password table, genp).  We can get a complete list of these groups with the following query:

SELECT DISTINCT agrp FROM genp;

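As a sanity check, this enumeration step can be sketched against a fabricated stand-in database in Python. The genp table and agrp column mirror the real keychain schema’s naming, but every row below is made up:

```python
import sqlite3

# Fabricated stand-in for keychain-2.db: a genp table whose agrp column
# holds each item's access group (mirroring the real schema's naming).
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE genp (svce TEXT, acct TEXT, agrp TEXT, data BLOB)")
db.executemany("INSERT INTO genp VALUES (?, ?, ?, ?)", [
    ("Dropbox",    "remote", "R96HGCUQ8V.*", b"..."),
    ("AirPort",    "HomeAP", "apple",        b"..."),
    ("MobileMail", "me",     "apple",        b"..."),
])
# DISTINCT collapses the duplicate "apple" rows into one group.
groups = [row[0] for row in db.execute("SELECT DISTINCT agrp FROM genp")]
print(sorted(groups))  # → ['R96HGCUQ8V.*', 'apple']
```

The resulting list is exactly the set of access-group entitlements we need to sign into our own binary.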
Once we have a list of all the access groups we just need to create an XML file that contains all of these groups and then sign our own application with ldid.  So, I created a tool that does exactly that called keychain_dumper.  You can first get a properly formatted XML document with all the entitlements you will need by doing the following:

./keychain_dumper -e > /var/tmp/entitlements.xml
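
The file this produces is a standard entitlements property list along these lines (the access group names shown here are illustrative, not a real dump):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>keychain-access-groups</key>
    <array>
        <string>apple</string>
        <string>R96HGCUQ8V.*</string>
    </array>
</dict>
</plist>
```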

You can then sign all of these entitlements into keychain_dumper itself (please note the lack of a space between the flag and the path argument):

ldid -S/var/tmp/entitlements.xml keychain_dumper

After that, you can dump all of the entries within the keychain by running the tool with no arguments:

./keychain_dumper

If all of the above worked you will see numerous entries that look similar to the following:

Service: Dropbox
Account: remote
Entitlement Group: R96HGCUQ8V.*
Label: Generic
Field: data
Keychain Data: SenSiTive_PassWorD_Here

Ok, so what does any of this have to do with what was being reported on a few weeks ago?  We basically just showed that you can in fact dump all of the keychain items using a jailbroken iOS device.  Here is where the discussion is more nuanced than what was reported.  The steps we took above will only dump the entire keychain on devices that have no PIN set or are currently unlocked.  If you set a PIN on your device, lock the device, and rerun the above steps, you will find that some keychain data items are returned, while others are not.  You will find a number of entries now look like this:

Service: Dropbox
Account: remote
Entitlement Group: R96HGCUQ8V.*
Label: Generic
Field: data
Keychain Data: <Not Accessible>

This fundamental point was either glossed over or simply ignored in every single article I happened to come across (I’m sure at least one person will find the article that does mention this point :-)).  This is an important point, as it completely reframes the discussion.  The way it was reported, the point seems to be to show how insecure iOS is.  In reality the point should have been to show how security is all about trading off various factors (security, convenience, etc.).  This point was not lost on Apple, and the keychain allows developers to choose the appropriate level of security for their application.  Stealing a small section from Apple’s keychain documentation, there are six levels of access for a given keychain item:

CFTypeRef kSecAttrAccessibleWhenUnlocked;
CFTypeRef kSecAttrAccessibleAfterFirstUnlock;
CFTypeRef kSecAttrAccessibleAlways;
CFTypeRef kSecAttrAccessibleWhenUnlockedThisDeviceOnly;
CFTypeRef kSecAttrAccessibleAfterFirstUnlockThisDeviceOnly;
CFTypeRef kSecAttrAccessibleAlwaysThisDeviceOnly;

The names are pretty self-descriptive, but the main thing to focus on is the “WhenUnlocked” accessibility constants.  If a developer chooses a “WhenUnlocked” constant, then the keychain item is encrypted using a cryptographic key derived from the user’s PIN as well as the per-device key mentioned above.  In other words, if a device is locked, the cryptographic key material needed to decrypt the related keychain item does not exist on the phone.  Thus, when the device is locked, keychain_dumper, despite having the correct entitlements, cannot access keychain items stored using a “WhenUnlocked” constant.  We won’t talk much about the “ThisDeviceOnly” constants, but they are basically the strictest security options available for a keychain item, as they prevent the items from being backed up through iTunes (see the Apple docs for more detail).
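
The locked-device behavior can be modeled in a few lines of Python. This is a sketch of the semantics only, not Apple’s key-derivation code, and the names are mine:

```python
# Which protection classes are readable in a given device state?
# "WhenUnlocked" items need the PIN-derived key, which only exists
# while the device is unlocked; "Always" items need only the device key.

ALWAYS, AFTER_FIRST_UNLOCK, WHEN_UNLOCKED = range(3)

def readable(protection, device_locked, unlocked_since_boot):
    if protection == ALWAYS:
        return True
    if protection == AFTER_FIRST_UNLOCK:
        # Key is cached in memory after the first unlock following boot.
        return unlocked_since_boot
    return not device_locked  # WHEN_UNLOCKED

# A locked (but previously unlocked) phone, as in the dump above:
assert readable(ALWAYS, device_locked=True, unlocked_since_boot=True)
assert readable(AFTER_FIRST_UNLOCK, True, True)
assert not readable(WHEN_UNLOCKED, True, True)   # "<Not Accessible>"
```

This is why the same tool, with the same entitlements, returns some secrets and `<Not Accessible>` for others once a PIN is set and the device is locked.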

If a developer does not specify an accessibility constant, a keychain item defaults to “kSecAttrAccessibleWhenUnlocked”, which makes the item available only when the device is unlocked.  In other words, applications that store items in the keychain using the default security settings would not have been exposed by the approach used by Fraunhofer and/or keychain_dumper (I assume we are both just using the keychain API as it is documented).  That said, quite a few items appear to be set with “kSecAttrAccessibleAlways”, including wireless access point passwords, MS Exchange passwords, etc.  So, what was Apple thinking; why does Apple let developers choose among all these options?  Well, let’s use some pretty typical use cases to think about it.  A user boots their phone and expects the device to connect to their wireless access point without intervention.  That requires that iOS be able to retrieve the access point’s password regardless of whether the device is locked or not.  How about MS Exchange?  Let’s say I lost my iPhone on the subway this morning.  Once I get to work I let the system administrator know, and they proceed to initiate a remote wipe of my Exchange data.  Oh, right, my device would have to be able to log in to the Exchange server, even when locked, for that to work.  So, Apple is left in the position of having to balance the privacy of users’ data against a number of use cases where less privacy is potentially worthwhile.  We can probably go through each keychain item and debate whether Apple chose the right accessibility constant for each service, but I think the main point still stands.

Wow…that turned out to be way longer than I thought it would be.  Anyway, if you want to reproduce the above steps yourself, you can grab the code for keychain_dumper on GitHub.  I’ve included the source as well as a binary, just in case you don’t want/have the developer tools on your machine.  Hopefully this tool will be useful for security professionals who are trying to evaluate whether an application has chosen the appropriate accessibility parameters during blackbox assessments.  Oh, and if you want to read the original paper by Fraunhofer, you can find that here.

18 thoughts on "“Researchers steal iPhone passwords in 6 minutes”…true…but not the whole story"

  1. The PIN adds only a trivial amount of entropy, though – if you could access the per-device key, then bruteforcing through the possible PINs won’t take long.

    • This is a good point, and is the exact question I’d asked myself as I started looking into this. The iPhone does have a setting that tells the phone to erase itself after X failed PIN attempts (I think X is 10). In the worst case, Apple might be doing the PIN validation/key generation completely in the kernel. If that is the case, then yes, theoretically, one could patch the kernel so that you get an unlimited number of attempts, and thus your point about minimal entropy would come into play. However, and this is total conjecture, Apple could do something that would be much more difficult to crack. The fact that each device has a per-device non-exportable key implies that this key lives in hardware somewhere (I am also unaware of anyone having gained access to this key, though please point me in the right direction if anyone knows otherwise). One could imagine a construction whereby the PIN validation takes place in hardware as well. If the failed PIN attempt count is controlled in hardware, and even the iOS kernel is at the whim of the hardware for PIN validation (as well as possibly subsequent key generation), then it would be extremely difficult, even with a jailbroken phone, to subvert the limit on PIN attempts (despite the low entropy). Now, I have no idea if it is implemented that way at all; someone else may very well have already answered this question. But, if this isn’t the way it is currently implemented, it would be kind of cool if they implemented it in the future.

  2. Nice work indeed.

    I’ve tested your keychain_dumper tool on iOS 4.3.2 (whilst locked and unlocked) and I’ve been unable to dump any email passwords for accounts I’ve set up (i.e. MS Exchange) – although I’ve managed to retrieve VPN and WiFi credentials.

    Just wondering if you could shed any light on the possibility that something has changed in 4.3.2 – i.e. are passwords such as those for MS Exchange accounts no longer stored in the keychain? Thanks very much.

    • I’m assuming you mean 4.3.3. I’ll try to steal a few minutes and check this for you. I would be surprised if anything has changed, but I’ll definitely check it out when I get a chance. Hope the tool has been useful for you.

      • Hmm, well, the only version of iOS I’ve tested on so far is 4.3.2. I’d be interested to see if you’re able to grab email passwords because as I said above, I’ve not been able to.

        What iOS version did you originally test this on, mate? Thanks for your time, Patrick

      • Actually, come to think of it, the last version of iOS I tested this on was 4.3.1. I’ll update the blog entry for 4.3.3 results.

      • Ok, so I just realized that in my original release I was only dumping the kSecClassGenericPassword table from the keychain. I updated the tool to also dump the kSecClassInternetPassword table in the keychain. This was an extremely quick copy/paste update, so let me know if you run into any issues. This seems to now dump my gmail account from my phone. I don’t have an exchange account currently setup on my phone, so let me know if this works/doesn’t work for you. Thanks again for the heads up.

      • Ahh right, thanks very much. I know this might be asking a bit much but would it be possible for you to upload an updated binary please? Thanks again for your time and effort

      • Ok, I just committed the binary to Github. Let me know if you have any issues

  3. Great work! Does the job on iOS 4.3.2.
    Gmail account, Exchange, other apps… I have a question though; I read in Apple’s doc that to share a keychain object an app had to use the constant “kSecAttrAccessGroup” – does this mean that all those apps are using this constant? Oh, and have you had any confirmation that the Fraunhofer researchers did it the same way for their article?

    • So,
      the “kSecAttrAccessGroup” is an optional key/value pair that lets you share an element in the keychain with another application. However, this attribute is entirely opt-in. In other words, as a developer, you would define a named access group, and then all applications that also chose that named access group would share that data. The relevant section from the apple docs (found here) is as follows:

      If you want the new keychain item to be shared among multiple applications, include the kSecAttrAccessGroup key in the attributes dictionary. The value of this key must be the name of a keychain access group to which all of the programs that will share this item belong.

      When you use Xcode to create an application, Xcode adds an application-identifier entitlement to the application bundle. Keychain Services uses this entitlement to grant the application access to its own keychain items. You can also add a keychain-access-groups entitlement to the application and, in the entitlement property list file, specify an array of keychain access groups to which the application belongs. The property list file can have any name you like (for example, keychain-access-groups.plist). The Xcode build variable CODE_SIGN_ENTITLEMENTS should contain the SRCROOT relative path to the entitlement property list file. The property list file itself should be a dictionary with a top-level key called keychain-access-groups whose value is an array of strings. If you add such a property-list file to the application bundle, then the access group corresponding to the application-identifier entitlement is treated as the last element in the access groups array. If you do not include the kSecAttrAccessGroup key in the attributes dictionary when you call the SecItemAdd function to add an item to the keychain, the function uses the first access group in the array by default. If there is no kSecAttrAccessGroup key in the attributes dictionary and there is no keychain-access-groups entitlement in the application bundle, then the access group of a newly created item is the value of the application-identifier entitlement.
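
The resolution order described in that quoted paragraph can be sketched as follows (a toy Python model, not Apple’s code; the function name and arguments are mine):

```python
# Access-group resolution per the quoted docs: an explicit
# kSecAttrAccessGroup wins; otherwise the first group in the
# keychain-access-groups entitlement; otherwise the app's own
# application-identifier entitlement.

def resolve_access_group(explicit, entitlement_groups, app_identifier):
    if explicit is not None:
        return explicit
    if entitlement_groups:
        return entitlement_groups[0]
    return app_identifier

assert resolve_access_group("shared.team", ["g1", "g2"], "com.foo") == "shared.team"
assert resolve_access_group(None, ["g1", "g2"], "com.foo") == "g1"
assert resolve_access_group(None, [], "com.foo") == "com.foo"
```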

      So, if you don’t explicitly specify an access group, you will automatically inherit the access group associated with the entitlement of your signed application. Each application is signed with a unique entitlement (e.g. Apple applications are signed with the entitlement group “apple”). If you take a look at the code, you can see that we first dump all the unique entitlements with the following SQL query against the keychain-2.db file:

      SELECT DISTINCT agrp FROM genp;

      Once we have all of the distinct entitlements we then sign our application using ldid. We now have all of the necessary entitlements to query the keychain.

      With regard to the Fraunhofer group, I have no way to confirm what they used, but I would be willing to bet they used the exact same approach.

  4. Great work – does the job with iOS 4.3.3 jailbroken. I compiled the source with SDK 4.3.

    Just wondering: I am playing around with custom recovery ramdisks, booting an iPhone (no jailbreak) via USB tunnel and then having access to most of the files – even if the iPhone is not jailbroken and has a passcode lock on.

    I am trying to find out if keychain_dumper would work in this environment as well – without ldid. I have been able to run ./bruteforce this way to get the 4-digit passcode.

    When trying to run
    ./keychain_dumper -e > /var/tmp/entitlements.xml
    I get
    dyld: Library not loaded: /System/Library/Frameworks/UIKit.framework/UIKit

    Trace/BPT trap: 5

    The reason is probably that I am booting from the ramdisk (running on my Mac). Mounting the root or user partition of the iPhone would be done via
    mount_hfs /dev/disk0s1 /mnt1
    mount_hfs /dev/disk0s2s1 /mnt2

    I have full access to the files then, but that alone isn’t of any help.

    Would be great if you could share some insight – I am just curious and not really deep into that.

    • Hey,
      Are you using the tools recently released from HITB 2011 Amsterdam (http://code.google.com/p/iphone-dataprotection/)? If so, I’d be curious what your experience has been. If not, what tools are you using? I have been too occupied to take a look at the HITB tools, but I have been meaning to get familiar with them when I get a chance. I was interested to see that brute forcing the 4-digit code is possible; it basically confirms that hardware is being used to generate the derived key, but that it is the kernel that is enforcing the “10 failed attempts and we wipe the flash” protection.

      With regard to your specific question, I am actually not sure. It looks like you are getting an error related to dynamically linking with the UIKit framework, which actually kind of surprised me, as I don’t think I need to be linking with UIKit (edit…I just checked, and I am linking against UIKit…I’m fairly sure that can be removed). It sounds like you are not able to link to this framework from the ramdisk. I’ll have to go down the same path you have before being able to comment more meaningfully/helpfully. In the meantime you could try recompiling, removing any unnecessary dependencies, to see if that helps. Let me know if you figure it out.

      • Hey,
        you are right. I have been using custom recovery ramdisks for a while. I had ported msftguy’s tools (http://msftguy.blogspot.com/2010_11_01_archive.html) to work in my Mac environment and go with iOS 4.3.x. I followed HITB and got that press release from the Russian company Elcomsoft a few days ago. Because it looked pretty much like the solution Jean Sigwald showed at HITB and published at Google Code, I gave it a shot.
        The binaries compile so-so, but at least bruteforce works. It takes 27 minutes on my iPhone 4 if the code is 9999 ;-). It just starts at 0000 – really cool, and it confirms that hardware is being used to generate the derived key.
        I did not use Jean’s ramdisk because it’s on Windows; I wanted to stay with my environment. I had been loading the bruteforce binary into my homebrew ramdisk. Most people don’t see that even with a complex non-4-digit passcode the root and user partitions will boot, and the iPhone will decrypt almost all of the data, ready to copy.
        That way I ran ./bruteforce – no ldid necessary, no jailbroken iPhone. It only works with a 4-digit simple code of course, and it’s pretty slow at around 375 keys/min because it needs to use the iPhone ;-).
        So for almost all data the “10 failed attempts and we wipe the flash” protection is just front-door protection, because it does not come into action at all with a custom recovery ramdisk.

        If you want to give it a try and have a Mac I can put my CRD enviroment on my Dropbox for you to play with.

        I’ll play a bit with your keychain_dumper and try to find out why it won’t run from the custom recovery ramdisk. It’s a different story when running on a jailbroken iPhone, I’d guess. Paths may be different – don’t know, but I will check.

        Sorry for my bad english :((

  5. Pingback: iPhone Forensics |  InfoSec Institute – IT Training and Information Security Resources

  6. Pingback: iPhone Forensics – on iOS 5 « SECURITYLEARN

  7. Pingback: iPhone Forensics - on iOS 5
