30 April, 2008

NoVASec: Memory Forensics

Richard Bejtlich arranged a NoVASec meeting on memory forensics for Thursday, April 24. Aaron Walters of Volatile Systems was the scheduled speaker. George Garner of GMG Systems, Inc., also showed up, so we were lucky enough to get two speakers for the price of one. (If you aren't aware, NoVASec is actually free.) Aaron primarily talked about performing forensics and analysis on memory dumps, and afterwards Richard asked George to come up from the audience and talk about the challenges of actually acquiring those dumps.

Both Aaron and George were very knowledgeable and had a lot of interesting things to discuss. In fact, most of us didn't leave until after 22:00, so there was a good two and a half hours of technical discussion. It wouldn't do them justice for me to try to recap their talks, but I will mention a couple of brief thoughts I jotted down while listening. If I'm getting anything wrong here, someone please pipe up and let me know.

The first is that I saw some parallels between points Aaron mentioned and Network Security Monitoring. Aaron stated that a live response on a system requires some trust of the system's operating system, is obtrusive, and is unverifiable. Dumping the RAM and performing the analysis on a trusted system helps mitigate these problems, though I don't think he meant it solves them completely. Similarly, in NSM we use information gathered by our most trustworthy systems, network sensors that allow limited access, rather than trusting what we find on the host. In both forensics and NSM, steps are taken to increase the trustworthiness and verifiability of the information gathered.

Second, Aaron and George both seemed to agree that acquiring memory contents is not easy. Not only can it be difficult, but even a successful acquisition has issues. George pointed out that if you don't isolate the system, an attacker could be altering the system or its memory as you acquire it. He also pointed out that dumping memory is really sampling rather than imaging, because the RAM contents are always changing, even on a system that has been isolated from the network. One memory dump is just one sample of what resided in memory at a given time, so more sampling increases the reliability of the evidence obtained. Gathering evidence from multiple sources, for instance hard drive forensics, memory forensics and NSM, also increases the probability that the evidence will be accurate and verifiable.
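
This sampling problem is also part of why acquisition tools commonly hash a dump at capture time: the hash lets you verify that one specific sample later, even though re-acquiring will never reproduce an identical image. Here is a minimal sketch of the idea in Python, using repeated reads of Linux's /proc/stat purely as a stand-in for volatile memory:

    import hashlib
    import time

    def acquire_sample(path="/proc/stat"):
        """Read a constantly changing source; a stand-in for dumping RAM."""
        with open(path, "rb") as f:
            return f.read()

    # Two acquisitions moments apart are two different samples, even
    # though nothing obvious happened on the box in between.
    first = acquire_sample()
    time.sleep(1)
    second = acquire_sample()

    print("first  sample:", hashlib.sha256(first).hexdigest())
    print("second sample:", hashlib.sha256(second).hexdigest())

    # The hashes will almost certainly differ, but hashing each sample
    # at capture time still lets you verify that exact sample later.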

There was also some discussion of PCI and video devices as they relate to both exploiting systems and memory forensics. Acquiring memory can be an issue on systems using PAE, since reading from the address space mapped to PCI devices can crash the system. On the exploit side, the GPU and RAM on video cards can be used to help facilitate attacks, as can certain PCI devices. There is a lot of interesting work going on in this area, and George even mentioned that he has been working on tools for acquiring the contents of memory from video cards.
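
On Linux, you can see part of the problem spelled out in /proc/iomem, which lists which physical address ranges are real RAM and which are mapped to devices. A rough sketch that separates the two (it assumes only the standard "System RAM" label, and you need root to see real addresses):

    # List physical address ranges a memory imager should copy
    # ("System RAM") versus ranges it should skip (device mappings,
    # where blind reads can hang or crash the machine).
    ram, devices = [], []

    with open("/proc/iomem") as f:
        for line in f:
            if line.startswith(" "):   # skip indented child entries
                continue
            # Top-level lines look like: "00100000-bfedffff : System RAM"
            span, name = line.strip().split(" : ", 1)
            (ram if name == "System RAM" else devices).append((span, name))

    print("Safe to image:")
    for span, name in ram:
        print(" ", span, name)
    print("Skip:")
    for span, name in devices:
        print(" ", span, name)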

It was an excellent meeting.

27 April, 2008

Defcon 16 Race to Zero

There have been articles about Defcon's Race to Zero since it was announced. I first read about it on the Daily Dave mailing list when the announcement was posted a couple of days ago. Apparently, some vendors and media outlets are unhappy and are criticizing the competition. While this is not surprising, it strikes me as pointless to complain about a competition that simply demonstrates what can be done, and already is being done, in the wild.

From the Race to Zero site:

The event involves contestants being given a sample set of viruses and malcode to modify and upload through the contest portal. The portal passes the modified samples through a number of antivirus engines and determines if the sample is a known threat. The first team or individual to pass their sample past all antivirus engines undetected wins that round. Each round increases in complexity as the contest progresses.

Anyone who has submitted real malware samples to a service like VirusTotal already knows how pitiful and inconsistent anti-virus software is at detecting malware, particularly malware that is new or newly modified. There is a reason we see so many variants of the same malware, and it's not because anti-virus is so effective that malware authors have to completely rewrite their code.

The site also lists the points the contest hopes to prove:
  1. Reverse engineering and code analysis is fun.
  2. Not all antivirus is equal, some products are far easier to circumvent than others. Poorly performing antivirus vendors should be called out.
  3. The majority of the signature-based antivirus products can be easily circumvented with a minimal amount of effort.
  4. The time taken to modify a piece of known malware to circumvent a good proportion of scanners is disproportionate to the costs of antivirus protection and the losses resulting from the trust placed in it.
  5. Signature-based antivirus is dead, people need to look to heuristic, statistical and behaviour based techniques to identify emerging threats
  6. Antivirus is just part of the larger picture, you need to look at controlling your endpoint devcies [sic] with patching, firewalling and sound security policies to remain virus free.

Although I have only limited and basic experience reverse engineering malware, it does seem fun and interesting. I also completely agree that vendors need to be called out.
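
The underlying weakness is easy to demonstrate. The toy scanner below flags anything containing a made-up byte pattern, the way a naive signature engine would, and a single-byte change to the "sample" is enough to slip past it. Everything here is invented for illustration; no real malware or real signature is involved:

    # Toy demonstration of why exact byte-pattern signatures are brittle.
    SIGNATURE = b"EVIL_PAYLOAD v1"

    def scanner_detects(sample: bytes) -> bool:
        """A naive signature engine: flag anything containing the pattern."""
        return SIGNATURE in sample

    original = b"...header..." + SIGNATURE + b"...rest of file..."
    print(scanner_detects(original))   # True: the known "malware" is caught

    # Change one byte and the behavior could be identical,
    # but the exact-match signature no longer fires.
    variant = original.replace(b"v1", b"v2")
    print(scanner_detects(variant))    # False: the "variant" goes undetected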

Heuristic, statistical and behavior-based techniques may indeed help, but point number six seems equally important. I don't really know what the best solution is, but hopefully some vendors will eventually realize that their methods and models need to change to become more proactive instead of reactive.

09 April, 2008

PADS signatures, NSMWiki, OpenPacket

I added a few PADS (Passive Asset Detection System) signatures to the NSMWiki. Anyone else who has some should definitely contribute, since the standard signature set is fairly small and has huge potential for improvement. I'm sure any other useful contributions to NSMWiki are appreciated as well.
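
For anyone who hasn't looked at them, PADS signatures are one-line entries in the pads-signature-list file. If memory serves, each entry has three comma-separated fields: the service name, a v/vendor/version/info/ template filled in from the regex capture groups, and a PCRE matched against the observed traffic. The entries below are illustrative rather than copied from the distribution, so check the shipped signature file for the exact syntax:

    # <service>,<version template>,<pcre>
    # $1, $2, ... in the template are replaced by the regex captures.
    ssh,v/OpenSSH/$2/Protocol $1/,^SSH-([\d.]+)-OpenSSH[_-]([\w.]+)
    smtp,v/Sendmail/$1//,^220 .* ESMTP Sendmail ([\w.\/]+)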

Richard Bejtlich posted about OpenPacket being online. I think the idea is great, and a strong community of people has signed on to help him with various aspects of the site.

OpenPacket.org is a Web site whose mission is to provide a centralized repository of network traffic traces for researchers, analysts, and other members of the digital security community.

For anyone just starting out in digital security or looking to get into the field, I strongly encourage you to participate in the security community as a whole. The ways to participate are too numerous for me to list, but there is definitely a lot to be learned from others who are more experienced, less experienced, or who just have different types of experience. Just reading blogs, news, mailing lists and other sites can be enlightening, and once you get your feet wet, you may find yourself contributing in short order.