Not the end of malware

Monday, 13 Dec 2004

Dru Nelson talks about how we could actually put an end to viruses. He sketches out the obvious concept involving the much-hyped NX bit and mandatory certification of all code.

I don’t buy it.

For starters, compulsory certificates are a nest full of snakes.

First of all, if certificates are centrally controlled, with high barriers to obtaining one, then the certificate authority can be abused as a tool for censorship. It is not DRM, as Nelson reassures us – but it suffers the same problems regardless. A rose by any other name… On the other hand, if the infrastructure is highly decentralized and anyone can easily issue certificates, how do we tell which issuers are trustworthy? It is no trouble for a miscreant to create one for his malicious code: back to square one.

Secondly, it is a commonly overlooked problem that the online certificate revocation lists frequently called for in such concepts – and sure enough, outlined in Nelson’s article as well – are delicious, nigh irresistible targets for an attacker. Imagine that you own a software company. Somehow, a cracker manages to subvert whatever mechanism distributes certificates and certificate revocations. Not only can they sign their own malicious code with your key; they can also issue revocations for the certificates on all the applications you wrote. Without having to find an exploit in any of your code, the cracker has made all of your customers’ computers refuse to execute any of your applications ever again.

Oops.

I’m sure your customers will not be amused. Obviously, it is not a smart move to create systems with intentional single points of failure.
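
To make that single point of failure concrete, here is a minimal sketch in C – all types and names are made up, nothing here is taken from Nelson’s proposal – of the check a certificate-gated loader would have to perform before running any binary. The revocation lookup is the interesting part: whoever controls that list, or the channel it arrives through, decides what the machine will run.

    #include <stdbool.h>
    #include <string.h>

    /* Hypothetical types, for illustration only. */
    typedef struct {
        char issuer[64];
        char subject[64];
        unsigned char sig[256];
    } Cert;

    typedef struct {
        const char *revoked_subjects[1024];
        size_t count;
    } RevocationList;

    /* Stand-in for real cryptographic verification. */
    static bool signature_valid(const Cert *cert)
    {
        (void)cert;
        return true;
    }

    /* A certificate-gated loader would have to run a check like this before
     * executing ANY binary.  Note the consequence: whoever can insert entries
     * into `rl` -- legitimately or by subverting its distribution channel --
     * decides what the machine is allowed to run at all. */
    static bool may_execute(const Cert *cert, const RevocationList *rl)
    {
        if (!signature_valid(cert))
            return false;                       /* unsigned or forged: deny */

        for (size_t i = 0; i < rl->count; i++)
            if (strcmp(rl->revoked_subjects[i], cert->subject) == 0)
                return false;                   /* revoked: deny, everywhere, forever */

        return true;
    }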

Now you may wonder how an attacker would manage the feat of cracking such a single point of failure in a world where only trusted code can be executed, anyway. After all, NX bits prevent overflows from having any impact, forming the foundation for the security provided by the certificates. So what is wrong with this picture?

The problem is that buffer overflows are not the only way to inject code. The bad boys don’t have a single point of failure, so to speak. Stack overflow attacks may be ubiquitous now, but that is because finding such vulnerabilities is a highly formulaic process. An exploit can be created by mechanical examination of code at as low a level as its machine code, and requires very little knowledge of the victim application. However, this ubiquity doesn’t imply there are no other means of injecting foreign code into applications – in fact there are many. Some are plain old application-specific bugs, which take more effort to find and exploit because you actually have to read the source and find edge cases it doesn’t account for that may open an opportunity for code injection. But other types of vulnerabilities actually form classes much like the venerable buffer overflow, though their exploitation is harder to generalise. An already popular category would be integer overflows.
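
To give an idea of what such a class looks like in practice, here is a deliberately simplified C fragment – the function is made up, but the pattern is the textbook one: a size calculation wraps around, the allocation comes out too small, and the copy that follows overflows it without a single stack frame being smashed.

    #include <stdlib.h>
    #include <string.h>

    /* Hypothetical record-parsing routine, reduced to the bare pattern.
     * `count` comes straight from untrusted input, e.g. a file header. */
    void *read_records(const unsigned char *data, size_t count)
    {
        /* With a 32-bit size_t, a large enough count makes count * 16 wrap
         * around to a tiny number, so the allocation comes out far too small... */
        unsigned char *buf = malloc(count * 16);
        if (buf == NULL)
            return NULL;

        /* ...while this loop still iterates `count` times and writes well past
         * the end of `buf`.  No stack is smashed anywhere; an NX-protected
         * stack never even enters the picture. */
        for (size_t i = 0; i < count; i++)
            memcpy(buf + i * 16, data + i * 16, 16);

        return buf;
    }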

But NX is not even a silver bullet for buffer overflows. One might conceivably craft an overflow that leaves the entry point of a system call or application function in place of the return address, along with parameters for it, chosen to achieve some malicious end. In this case, no foreign code is ever executed; the flow of control in the program is simply hijacked. This is undoubtedly a much harder attack than just putting your own executable code on the victim application’s stack, but it goes to show that buffer overflows remain a problem regardless of NX. It tickles my sense of irony to think about crafting such a return stack attack so that it ends up invoking a system call to allow execution on the very piece of memory just overwritten by the attack, and then returning into it… To be fair, Dru Nelson does mention two-stack architectures like Forth machines, where the return addresses live on a distinct stack, making stack buffer overflows an actual non-issue. I’ll get back to that in a minute.
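
For the sake of illustration, here is a minimal sketch of the kind of function such an attack targets; the names are made up, and the trailing comment describes the idea conceptually rather than giving a working exploit.

    #include <string.h>

    /* Hypothetical vulnerable function: `input` is attacker-controlled and may
     * be longer than the buffer. */
    void handle_request(const char *input)
    {
        char buf[64];
        strcpy(buf, input);   /* classic unchecked copy onto the stack */
    }

    /* Conceptually, the overflowing copy can replace the saved return address
     * with the entry point of a function that already exists in the process --
     * say, the libc wrapper around mprotect(2) -- and lay out that function's
     * arguments right behind it on the stack.  When handle_request returns,
     * control "returns" into that function with attacker-chosen arguments,
     * for instance marking the overwritten stack region executable again.
     * Up to that point not a single byte of foreign code has run, so NX has
     * nothing to veto; it only becomes relevant after the attack has already
     * bought itself an executable region to return into. */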

At this point you might be wondering how that house of cards toppled so easily. Ironically, Nelson gave the reason himself but failed to see the forest for the trees. His own central argument is:

The previous model for handling security on the internet was to react to the threats. If you got anti-virus software, you had to keep it “up to date”. If there was a vulnerability in your OS, you had to check for updates from your OS vendor. All of these are symptomatic treatments rather than cures.

Here’s the secret: NX and certificates are just more of the same – symptomatic treatments, albeit on a much higher level than anything we have seen so far.

The only possible way to build real security is an approach known as “deny by default”. The idea is to make it impossible to do something potentially harmful in the first place, granting exceptions only where you need them to establish a functioning system. The two-stack approach mentioned in passing by Nelson follows this rule: it is impossible by design to overwrite crucial areas of memory. There are instructions available to copy between the stacks, but you need to be executing code to make use of them in the first place – impossible just by dumping strings on the data stack. Similarly, the Bell-LaPadula security model, revolving around the idea that it should never be possible to climb from a low trust level of operation to a higher one (only vice versa), shows how to apply “deny by default” to OS design. (I am aware that the Bell-LaPadula model is neither practically useful nor, in its basic form, free of contradictions. However, it is the pioneering formalization of this policy concept and the simplest possible form of it. Research has since been conducted to correct flaws in the initial model, though we’re far from done yet.)
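
To make “deny by default” concrete, here is a minimal C sketch of a Bell-LaPadula-style access check in its textbook “no read up, no write down” form; the levels and names are invented for illustration. The point is structural: any request not explicitly allowed falls through to a denial.

    #include <stdbool.h>

    /* Hypothetical clearance/classification levels, for illustration. */
    typedef enum { LEVEL_PUBLIC, LEVEL_CONFIDENTIAL, LEVEL_SECRET } Level;
    typedef enum { ACCESS_READ, ACCESS_WRITE } Access;

    /* Bell-LaPadula in its textbook form: a subject may not read objects above
     * its own level ("no read up") and may not write to objects below it
     * ("no write down").  The structural point is that every request which is
     * not explicitly permitted falls through to the same answer: deny. */
    bool access_allowed(Level subject, Level object, Access how)
    {
        switch (how) {
        case ACCESS_READ:
            if (object <= subject)          /* reading at or below your own level */
                return true;
            break;
        case ACCESS_WRITE:
            if (object >= subject)          /* writing at or above your own level */
                return true;
            break;
        }
        return false;                       /* everything else: deny by default */
    }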

In contrast, NX is a symptomatic treatment for a system where code on the stack is executable by design, attempting to restrict it from being so by policy. Similarly, the proposed use of certificates is a symptomatic treatment for systems where any code can climb trust levels by design, attempting to restrict it from being so by policy.

And that won’t work.