Pentests: Unit Tests for Security?

In software and systems development… scratch that. Whenever you make any complex device or system, it’s best practice to test the parts, and then test the completed system. Testing the parts is often referred to as Unit Testing; the system, well, System Testing.

For many reasons, clients ask for penetration testing of some small unit or group. Typically they also have a set of rules of engagement (ROE) that further limit the testing. At this point this is getting awfully close to QA acceptance testing (here’s your test plan, stick to it).

So my question is, if vulnerability assessments are routine validation, and penetration tests are essentially Unit Tests of particular sections of the environment, where’s the System Test? Wait, scratch that too. The system test is the one you get for free; at least until the incident response team shows up.

Memorial Day Thank You

Welcome back from Memorial Day, and for those in service, their family, friends, and loved ones: thank you for your sacrifice.

Memorial Day always marks a couple of things for me personally. A time for remembrance. A time to re-educate myself on the sacrifice others have made for the idea of a strong and free country. It also marks, at least in my mind, the start of the security community’s spin-up toward events in late July. I think it’s good every once in a while to pause and take stock of how what we do in Information Security fits into those same ideals of a strong and free country. How can we do our part better?

And to learn about some of those who did their part, I encourage you to read

Symbiotic Relationships in Security

In his keynote at Shmoocon, Peiter Zatko brought up a pretty cool example of applying game theory principles to security. The takeaway was “if you think the game is illogical, you’re probably missing something”.

In this case the game involved the new feedstock of AV companies: botnets. His observation: the botnet writers had implemented, as part of their protection scheme, a simple XOR cipher that was pretty obvious. This cipher had roughly a ten-day lifecycle from release to detection to AV signature release. When the ten days were up, the botnet writers would alter the cipher and the cycle would start again.
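
The actual cipher isn’t published in the talk, but a minimal sketch of a single-byte XOR scheme (key and payload invented here) shows why a ten-day detection cycle is generous; the “cipher” is symmetric and trivially reversed:

```python
def xor_cipher(data: bytes, key: int) -> bytes:
    """Single-byte XOR: the same call obfuscates and de-obfuscates."""
    return bytes(b ^ key for b in data)

# Hypothetical bot traffic and key, purely for illustration.
payload = b"JOIN #c2channel"
obfuscated = xor_cipher(payload, 0x5A)

# XOR is its own inverse, so one known-plaintext guess recovers everything.
assert xor_cipher(obfuscated, 0x5A) == payload
```

Swapping the key is a one-line change for the botnet author, which is exactly the economics Zatko described.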

At one point, the cipher was changed to a stronger AES version. After ten days, nothing. So what happened? According to Zatko, the weaker cipher system was put back. Nuts, right?

The theory is that it takes a few minutes to change the cipher, and ten days to get caught. That’s a pretty good investment, and the AV companies are happy because they get to say they’re releasing sigs all the time. If you take away the low-hanging fruit, maybe the AV companies will figure out how to block the bot in a more fundamental way. Something that takes more than five minutes to fix.

So in the end it’s in both parties’ interest to keep the cycle alive. AV isn’t providing a solution, and everyone is making money prolonging the problem. The ironic part is the AV client is probably happier too. I mean, look at all these signatures I’ve applied this month; what a good investment.

I guess the bright side is that guys at DARPA like Mudge are digging into this, taking it seriously, and most importantly, being given the latitude to talk about what they’re seeing.

(Red)Herring v. United States Revisited

By N. Puffer

It’s just about two years since the Supreme Court decided Herring, so I figured I’d take a look back and see what impact, if any, it’s had. At the time of the decision, several people in the infosec community were worried by Justice Ginsburg’s dissent, which pointed out the dangers of removing any penalty for a lack of integrity in systems used by law enforcement.

To recap the story so far …

In 1981 a few cops in California were tracking down some drug dealers. While watching the people come and go they identified a couple of other people, got the appropriate warrants, and arrested them. The problem was the court had pencil-whipped the warrant; the police hadn’t actually met the rigor needed for a search. A lot of lawyers did a lot of talking, and we ended up with United States v. Leon (1984) saying that if the police were acting in good faith (in this case they were), then the exclusionary rule doesn’t apply. Sounds scary, and Orwell would love the coincidence in dates. What the courts really seem to be saying is that the justice system is run by people, and has long acknowledged that people are fallible (hence the appeals process). If good work comes from an honest mistake, society shouldn’t be punished by letting a drug dealer go free.

Of course, detractors may say that ignorance of the law is no excuse for citizens (“I didn’t know the speed limit, officer”), and that members of the justice system should be held to a higher standard when it comes to a person’s freedom. Fair points. Feel free to discuss among yourselves.

Fast forward to 2004, the age of networked policing, and Alabama. Mr. Herring goes to the Coffee County police station to pick up an impounded vehicle. As part of a routine warrant check, neighboring Dale County tells a Coffee County investigator that there’s an outstanding warrant for Mr. Herring. The vehicle is searched, weapons and meth are found, hilarity does not ensue. The issue? Turns out Dale County had made a mistake in their data entry. The warrant had been recalled months prior to the incident, but the system of reference (the database) was incorrect. Part of the process of warrant notification includes pulling the actual paper warrant (system of record) and faxing it. When it was discovered that the paper warrant didn’t exist a check was performed and Coffee County was notified. Elapsed time to correct the mistake, 15 minutes. Time served by Mr. Herring, 27 months.

Five years of legal workings later, and a 5-4 ruling of the Supreme Court upheld Mr. Herring’s conviction. The central supporting opinion seemed to reach back to Leon. However, the dissenting opinion submitted by Justice Ginsburg touched on what caused the watchdogs to perk up. As mentioned above, from an information management point of view, the police actions by Dale County were fundamentally flawed. Specifically, a system of reference was used to trigger a critical action, even though procedurally, a system of record needed to be consulted. Given the fact that the mistake only caused a 15 minute delay, it’s reasonable to assume that there would have been no tangible impact to law enforcement if both systems were checked, but that wasn’t really the point.
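
To put the procedural flaw in information-management terms: the database hit (system of reference) should only ever trigger a check of the paper warrant (system of record), never the arrest itself. A toy sketch, with all names and records invented for illustration:

```python
# System of reference: the warrant database (fast, but only a copy).
warrant_index = {"herring": "outstanding"}  # stale entry; the warrant was recalled

# System of record: the actual paper warrants on file.
paper_warrants = set()  # the recalled warrant no longer exists here

def warrant_is_actionable(name: str) -> bool:
    """A database hit alone is a trigger to check the record, not to act."""
    if warrant_index.get(name) != "outstanding":
        return False
    # The fifteen-minute step that was skipped: pull the paper warrant.
    return name in paper_warrants

# Index says yes, record says no: procedurally, no basis to act.
assert warrant_is_actionable("herring") is False
```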

The issue seemed to be, as decided, do errors in information systems (court records) extend good faith exceptions to the exclusionary rule? From section ‘A’ of Justice Ginsburg’s dissent, “Is it not altogether obvious that the Department could take further precautions to ensure the integrity of its database? The Sheriff’s Department is in a position to remedy the situation and might well do so if the exclusionary rule is there to remove the incentive to do otherwise.”

So, everyone agrees that there was a mistake in the Dale County records. It’s also agreed that the mistakes were negligent (per Chief Justice Roberts’s opinion), and the courts ruled that even though the system was flawed, it wasn’t enough to exclude the results of the system. Furthermore, according to Justice Ginsburg, there’s no incentive to fix the problem. And now we’re back to the beginning: police don’t need to ensure that systems are accurate, much less secure, as long as they aren’t complicit. There’s no motivation to ensure system integrity. In fact, there’s a motivation not to know how bad the systems are; if you knew, you might have to fix them.

Yet in the past two years there’s no evidence that law enforcement is purposely letting their systems atrophy to game the courts. A search of citations brings up McDonald v. City of Chicago, which has a tangential citation of Herring in a right-to-bear-arms case. US v. Farias-Gonzales also comes up. This is a case concerning unreasonable search and seizure, but the most technological part of the case is that a portable fingerprint scanner was used.

People v. Branner also comes up, and yet again the issue is with people and not systems; in this case, cops working from outdated knowledge of judicial findings. And you can keep searching: Montejo v. Louisiana, People v. Lopez; all dealing with people or straightforward citations.

People v. Washebek, filed in November 2010, comes closer. Here the prosecution successfully argued that Herring rejected the distinction between law-enforcement error and errors in court records. The facts of the case concerned a search based on probation status that was incorrect; a mistake in probationary record-keeping.

So in two years, that’s a single similar filing, and nothing about widespread flaws in a law enforcement system. There are, of course, other writings about this ruling. Some feel this is just the inevitable march of the court towards killing the exclusionary rule altogether. Others feel these are the necessary and correct interpretations of the Constitution, meant to keep us secure through the actions of police. In either case it seems clear that there wasn’t an overarching trend towards purposeful degradation of integrity, or promotion of ignorance, with regard to the security of critical law enforcement systems.

But why not? Perhaps it’s because there’s another motivator involved. The police don’t just interact with the courts as a consumer of data to enforce laws; they also place information into those same systems. Lack of integrity works both ways in most cases, and it is naturally in the best interest of the police to have a system that accurately represents the real world. I can’t imagine a cop would be happy if the paperwork they filed to finish off some good police work vanished, or appeared to have vanished, to the prosecution.

And as far as security? Well the same forces likely apply. While it may benefit police to occasionally get a pass based on errors, this doesn’t seem to outweigh the risks of having a system that can be manipulated. So in the end, while Justice Ginsburg makes an insightful point, there may be additional sides to the story that the courts were not asked to consider.


During peer review I was asked “so what?” for the rest of the world that’s not in law enforcement. Fair point. On one hand, this was an interesting case in the context of digital forensics and the legal system. The author also likes to consider issues that impact overarching trends in information security, especially when they impact our freedom. However, that is admittedly self-indulgent, and this isn’t a legal blog. If you wanted to abstract a theme to corporate motivations, I’d ask, “Are you considering the value of information relative to ensuring its integrity?” More specifically, in what situation would you reasonably stand to benefit from a lack of integrity in your information systems? If there’s interest in expanding here, leave a note in the comments and we can follow up…

Reality Check

By Nat Puffer

If I have to sit in on one more meeting where a security consultant bitches about how dumb people are, quotes a t-shirt they saw at DefCon, or makes some vague statement on how businesses just don’t “get it,” I’m going to quietly get up, walk over to them, and throw their laptop out the closest window. If by chance there are no windows in their mother’s basement / bar / secret lair, said laptop will meet an untimely end against the first semi-rigid surface I see.

Here’s a little breakdown of the reality of the world, Nietzsche-style. Penetration testers, computer security consultants, appsec gurus, wifi wizards: we are the Destroyers. Few if any of us are Builders. I’m not even going to bother telling you why this is. If you don’t get it, buy me a beer and we can have a pleasant conversation about it. All you need to know at this juncture is that we’re massively outnumbered, and that’s a good thing. Most people want to build things; create; discover. It seems to be in their nature. A few of us look to see how things work; to discover the puzzle in them by picking them apart rather than building our own. It’s in our nature.

If you can accept that then you should be able to accept this. Security will *never* be in the nature of creative people who concentrate on making things. Asking them to change is pointless. It’s our job to figure out how to let them be creative and, where necessary, keep the results safe. This will be inefficient, difficult, and iterative. That’s the business you’re in.

Think I’m nuts? The next time you’re in the bookstore killing time on the way to your client, take a look at the computer section. If it’s anything like the one in San Francisco, you may notice that the number of books about how to create and use software looks something like this:

Computer Section

And that the sub-set of books on computer security looks something like this:

Security Section

Lighting Clouds – Even Taser has a ‘Cloud’ Offering

By: Nat Puffer

I’m not sure if it’s serendipity or if SXSW was particularly rowdy this year, but I ran across several people on TWiT talking about Taser’s new offering to support their AXON system. If you’re interested in the original interview with Jason Droege, check it out here. This story piqued my interest since this particular offering is the convergence of so many emergent areas. It’s almost like Taser decided to take on every possible hot topic all at once. But a bit of background first.

For a while Taser has been looking for ways to take the windfall from non-lethal weaponry and branch out. One of their solutions almost seems too obvious. The AXON system is a combination of a head-mounted camera, buffering system, and tactical computer. The general idea is that the benefits of the dashcam for officers in the field have been proven, so why not have that capability for an officer away from their car? The added benefit is POV coverage. What better way to judge after the fact whether a tasing was justified than if you can see and hear what the officer saw in full HD?

If you have questions about police privacy, or citizens’ privacy, I definitely think there’s healthy debate to be had. Just not here. I’m more interested in the technological and legal ramifications of where this video goes.

As mentioned on the AXON site, in order to use the system you need to be able to upload the video to Taser’s cloud. Reports earlier this year indicate that this back-end offering will be made available through partnerships with Cisco and Equinix. The general flow of data is from the headset to the recorder, where buffering and hashing take place, then to the docking station, and up to the cloud. From there two copies of the video are placed in two datacenters, and things get interesting.

From the site, you can see what appears to be a licensed Google Maps API with nodes for all the uploaded video that meets a set of criteria. Based on the online demo from the site, you can also see deployments and actions; things like “SWAT deployed” with an address. You can also review clips, create subclips, and export them. All very cool considering this is remote HD video.

So what are the hot topics that Taser is taking on? First of all, they’ve built the cloud themselves. So all the issues with multi-tenancy, security, auditing, availability, redundancy, etc. are issues they need to solve.

Storage and latency issues? These aren’t tweets they’re storing. This is full HD video that needs to be taken in and played back from remote systems. The playback seems to be Flash-based, but still, there has to be an expectation that a fair amount of data is going to be moved around. From the demo you’re looking at a 2x to 3x storage requirement for each file, since you store it in two places and then again for playback. Not to mention all the logging that needs to be done to satisfy chain of custody and integrity.
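
Taser hasn’t published how the hashing and logging work, but a minimal sketch of hash-at-ingest, verify-at-playback (field names and IDs invented here) shows the basic chain-of-custody mechanic:

```python
import hashlib
import time

def ingest(video_bytes: bytes, officer_id: str) -> dict:
    """Hash the clip on ingest and record who uploaded it, and when."""
    return {
        "officer": officer_id,
        "sha256": hashlib.sha256(video_bytes).hexdigest(),
        "ingested_at": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
    }

def verify(video_bytes: bytes, log_entry: dict) -> bool:
    """On playback or export, recompute the hash against the ingest log."""
    return hashlib.sha256(video_bytes).hexdigest() == log_entry["sha256"]

clip = b"\x00\x01pretend this is HD video"
entry = ingest(clip, "unit-42")

assert verify(clip, entry)             # untouched clip verifies
assert not verify(clip + b"x", entry)  # any alteration breaks the chain
```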

Storing critical data? Check. This is video for trial that may make the difference between guilt and innocence in a jury’s eyes. In addition, the logistical data about current operations would seem very sensitive. Furthermore, the video is GPS-tagged. Want to see the patrol routes of cops in your city? Let me compile that for you.
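
The patrol-route point is worth making concrete. If per-clip GPS metadata ever leaks, reassembling routes is a few lines of sorting and grouping; the unit names and coordinates below are invented:

```python
from collections import defaultdict

# Hypothetical leaked clip metadata: unit, timestamp, coordinates.
clips = [
    {"unit": "patrol-7", "ts": 2, "lat": 37.78, "lon": -122.41},
    {"unit": "patrol-9", "ts": 1, "lat": 37.75, "lon": -122.44},
    {"unit": "patrol-7", "ts": 1, "lat": 37.77, "lon": -122.42},
]

routes = defaultdict(list)
for clip in sorted(clips, key=lambda c: c["ts"]):
    routes[clip["unit"]].append((clip["lat"], clip["lon"]))

# Each unit's movement over time falls straight out of the metadata.
assert routes["patrol-7"] == [(37.77, -122.42), (37.78, -122.41)]
```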

Legal issues? And then some. This is evidence after all. So you need to make sure that you’re not only abiding by Federal statutes, but all local requirements for each jurisdiction as well.

I hope they’re open with how things go, because I think they’re going to be the use case for many issues in the community going forward. How are they going to prove they’re securing the data? SAS 70 Type II reviews? How will they handle growth? Segment the service by paired datacenters? What happens when someone terminates service? Do you get 50 Blu-ray discs of data? How will they handle misuse and attacks against the system?

I’d love to see Taser take this combination of risks and start looking at better ways to test themselves. Full-scope, open-ended pentesting of the entire system would be a great start. Combined with a real-time risk model, they might be able to provide precedent not only for cloud issues, but for emerging risk issues as well.

Configure all the Systems

By: Nat Puffer

On a recent internal penetration test, two basic issues were identified when reviewing automated scan results. The first was a series of web servers with odd IP allocations. No vulnerabilities were reported for the web servers, but they were in an IP block on a segment that was primarily network infrastructure. In addition, the operating system was listed as ‘EthernetBoard OkiLAN 8100e’. A bit of Google time with the manufacturer information and it was clear that this was a management interface for a fibre channel card; in this case, a set enabling a SAN.

Thirty seconds with the card manual gave up the following:

A couple of seconds later and I was in a position to reconfigure and restart the SAN fibre channel cards. Hilarity would not follow.

A few minutes later the second issue of this type was revealed. The punchline of this one was username ‘apc’ password ‘apc’, and the ability to turn off power to a set of servers. This information is also easily obtainable with a quick search.

Is any of this new? No. The fact that this issue is so old is actually what I found shocking. In reality, combos like root::Password and vendor::vendor are among the first tried when a new interface is found. The only reason I’ve highlighted the ability to look up this information is to demonstrate that it’s available to anyone researching the device.
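
As a sketch of why vendor defaults fall first: the whole “attack” amounts to building an Authorization header from a handful of published combos and trying each against the management interface. The credential list below mixes the combos mentioned above with a common third; no live requests are made here:

```python
import base64

# Published default combos; the third entry is a generic example.
DEFAULT_CREDS = [("root", "Password"), ("apc", "apc"), ("admin", "admin")]

def basic_auth_header(user: str, password: str) -> str:
    """Build the HTTP Basic Authorization header for one combo."""
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    return f"Basic {token}"

# In a real test, each header would be sent with GET / and checked for a 200.
for user, password in DEFAULT_CREDS:
    print(user, basic_auth_header(user, password))

assert basic_auth_header("apc", "apc") == "Basic YXBjOmFwYw=="
```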

As a pentester, I made the client aware and moved on. Relative to the other items we found (Domain Admin was enjoyed by all), it’s likely that the readout won’t warrant more than a passing comment. In addition, when the inventory of web servers is performed, do you think “FibreChannel Card 01” will appear on the list? Right. So when the internal audit comes looking for appropriate hardening and configuration, what’s the likelihood this is making it onto the list? Right again.

However, these issues scream “Disgruntled employee, please come play with me!” I don’t want to be a harbinger of FUD, but while the chances are minimal, the risk is there. It would take a few minutes to disable the HTTP interface or change the password (set Password xxx, pg 107 of the manual, if you were wondering). This is basic due diligence, and an hour spent learning how to protect the device seems a sound investment.

The biggest issue, however, is knowing to look for this in the first place. When I asked the client about these issues, the answer was predictable: “We had no idea those interfaces were even there.”