Thursday, July 16, 2009

2009 SANS Forensics Summit Recap: Day Two

In my previous post I recapped day one of the 2009 SANS Forensics Summit. In this post, I'll continue with coverage of day two, but first I have to say that I did cut out for a few hours during day two to have lunch with my friend mubix from Room362.com, so I apologize in advance for not being able to comment on things I didn't see.

Ovie Carroll, Director of the Cybercrime Lab at the U.S. Department of Justice's Computer Crime and Intellectual Property Section, started off day two. Carroll is co-host of the Cyberspeak podcast and, like Richard Bejtlich, gives a great presentation accompanied by an entertaining slide deck. One of the things I really liked about Carroll's presentation was that he took time to update it with information that had been presented the previous day. There weren't a bunch of updates, but it was nice to see that he found the previous day's content as valuable as I did, and that he took the time to make the updates at all demonstrated how much he cares about the subject matter.

Carroll spoke about trends in and the future of forensics from a law enforcement perspective. One of the key takeaways from Carroll's talk was that there is a mountain of work facing law enforcement and they are having difficulty keeping up. He mentioned that it was not uncommon for some agencies to have systems in their possession for 18 months before they get a look at them. Having worked for defense attorneys (prosecutors never call me) for a number of years now, I haven't seen delays quite that long, but I don't doubt it for some agencies.

Clearly a number of factors contribute to the delay. One is that law enforcement is interested in analyzing computers even in traditional crimes, because they have found so much good evidence on people's hard drives. Two, there simply aren't enough people doing this work; between the lack of qualified personnel and budget constraints, the problem likely won't go away, ever. Lastly, and no less importantly, there's just a ton of information being produced each year in this digital age. Carroll said that in 2008 more content was produced online than humanity produced in traditional forms (paper and ink) over the previous 5,000 years. Sure, not all of that data is relevant to case work, but some of it is, and it takes time to analyze what's relevant.

Carroll has been advocating a phased approach for a while now, and he repeated the call during his talk. Law enforcement agencies should triage: build enough of a case, without completely analyzing every system, to get suspects to plea bargain and thus clear out some of the case load, at least for the milder offenders. This is something I've told students as well: yes, we want to analyze every piece of evidence we collect, unless of course we can build a strong case without doing all that comprehensive work and short-circuit the process through a plea bargain.

One more thing about Carroll: he's funny. You want that in a morning speaker.

Following Carroll, Chris Kelly, Managing Attorney for the Cybercrime Division of the Massachusetts Attorney General's Office, addressed the audience. Kelly had some great stories about really stupid criminals (the ones who get caught generally are), but one guy rose above the rest by snapping a picture of himself with the victim's cell phone while in the act of robbing that victim's home. Good times.

Kelly started out talking about how much things have changed in the cybercrime world. We've gone from phone phreakers, defacements and obnoxious worms to organized criminal networks, terrorism and traditional crimes that involve computers as sources of evidence. As an example of the latter, consider a case in my area where a college professor was convicted of killing his ex-wife. One piece of evidence found on his home computer was search history about ways to kill people. He claimed he was doing research for a novel. Along these lines, Kelly brought up the case of Neil Entwistle who killed his wife and daughter. In his search history were queries about how to kill people.

Kelly also spoke about some of the training they are offering to law enforcement including the need for first responders to stop pulling the plug and to perform collection of volatile evidence. He played a hilarious clip from CSI of cell phone forensic analysis that had everyone in the room laughing.

At this point, I'm sorry to say, I had to cut out, but a user panel assembled to discuss aspects of forensics in law enforcement. The panel was to have included Carroll, Kelly, Andrew Bonillo, Special Agent/Computer Forensic Examiner at the U.S. Secret Service; Richard Brittson, retired detective, New York City Police Department; Jennifer Kolde, Computer Scientist with the FBI San Diego Division's National Security Cyber Squad; Cindy Murphy, detective, City of Madison, WI Police Department; Ken Privette, Special Agent in Charge of Digital Evidence Services, United States Postal Service Office of Inspector General; Paul J. Vitchock, Special Agent, Federal Bureau of Investigation, Washington Field Office; and Elizabeth Whitney, Forensic Computer Examiner, City-County Bureau of Identification, Raleigh, NC.

I apologize if I missed anyone on the list; because I missed the panel, I'm going off of the agenda, so some of these folks may not have been there and others may have sat on the panel in their place. I have looked over some of the presentations that were given, and I'm sure I missed some great content. As someone who frequently works opposite law enforcement, I wish I could have caught this panel.

After lunch, Dr. Doug White, Director of the FANS Lab at Roger Williams University, spoke about several topics related to forensics and the courtroom, including some cases where admissibility of evidence came into play. I got a little lost at one point while White was speaking about this. His slides referred to U.S. v. Richardson, 583 F. Supp. 2d 694 (W.D. Pa. 2008), with a sub-bullet referring to a hacker defense, but the only thing I can find about the case online indicates that there were scoping issues with a warrant rather than a hacker defense.

White brought up another interesting case, U.S. v. Carter, 549 F. Supp. 2d (D. Nev. 2008), discussed here, where the IP address of a suspect's system was deemed circumstantial evidence and could not, on its own, tie an individual to the crime. The lesson for investigators: get as much supporting evidence as you can.

White talked about the Adam Walsh Act, which limits defense attorneys and experts to "reasonable access" to the evidence in cases involving the exploitation of children. Reasonable access generally means at the law enforcement agency during normal business hours. This is a well-intentioned law that has cost me some business, but if it prevents children from suffering at the hands of incompetent practitioners who lose hard drives or otherwise leak evidence, then it's a good thing.

One great recommendation White made was to spend downtime coming up with simple ways to explain complex topics. There are lots of things those of us in tech take for granted, like IP addressing and NAT, but explaining them to non-technical folks can be difficult. Spending time to write clear, easy-to-understand explanations that can be quickly added to the appendix of a report saves time. It's like developers reusing code.

Following White's talk, Craig Ball, trial lawyer and forensic expert, touched on the same idea during his lightning talk as part of the panel on challenges in the courtroom. Ball is a wonderful presenter. He struck me as a very intelligent, thoughtful and friendly gentleman (he's a lawyer?!). Ball had a great slide deck loaded with graphics, including some simple animations he uses to explain complex topics, like how a hard drive works, in simple ways to members of the jury.

Ball also mentioned using visualization software to turn timelines into nice-looking charts. I believe he said he uses a product called Time Map, but a quick search reveals there are quite a few similar products on the market. Check out Ball's website, where he has loads of materials available for free. I would love to see Ball at work in the courtroom, and I hope to catch him giving a longer presentation at some point in the future.

Also on the panel with Ball were White; Gary Kessler, Associate Professor of Computer and Digital Forensics and Director of the Center for Digital Investigations at Champlain College; Bret Padres, Director of Digital Forensics for Stroz Friedberg and co-host of the Cyberspeak podcast; and Larry Daniel, principal examiner for Guardian Digital Forensics. I may have missed someone. Dave Kleiman was on the agenda, but his slides aren't on the conference CD and I can't remember him being on the panel. That's not to say that, if he was on the panel, he had nothing noteworthy to say; rather, it's a reflection of my own poor memory and the fact that I'm writing this more than a week after the event.

Kessler's question asked him to rank, in order of importance, the qualities an investigator should have, and to explain his ranking. I liked Kessler's answer because he took the list given to him (analysis skills, acquisition skills, data recovery skills, report writing skills, which Kessler expanded to communication skills, law enforcement background, computer science background, problem solving skills, integrity and caution) and added his own qualities of curiosity, technical astuteness and tenacity.

Kessler's overall number one choice was integrity, something I happen to agree with. His least important qualities were a computer science background, followed by a law enforcement background. For those of us in the field who lack a computer science degree and a law enforcement background, it's easy to agree with Kessler putting those at the bottom of the list. His second most important quality was technical astuteness. Oddly enough, I know some folks with computer science degrees who have difficulty with technology outside their narrow field of specialization.

Kessler made a point about good examiners that I've heard repeated by others in the field. So much of the job is about being tenacious. Hard problems are hard and many times there are no quick wins; the examiner who sticks with it and works through the adversity is the one you want working for you.

At this point, I had to cut out and catch a flight. All in all, this was the greatest incident response and forensics focused conference I've attended. If you work in the field, you should try to attend next year. This is not a normal SANS event; it's really a single-track conference bookended by the training SANS is known for.

I hope to see you there next year.

Tuesday, July 14, 2009

2009 SANS Forensics Summit Recap: Day One


I had the great pleasure of attending and participating as a panelist in the 2009 SANS What Works Summit in Forensics and Incident Response. I covered my presentation in a previous post, but wanted to share my thoughts on some other aspects of this great event.

The Summit was a two day event, with day one focusing mostly on the more technical aspects of forensics and incident response. Day two's focus was more on the legal side of things, though Eoghan Casey of cmdlabs did give an excellent technical talk on mobile device forensics on day two.

Day one kicked off with a keynote address by Richard Bejtlich. This is the second year in a row Bejtlich has addressed the attendees and from what I gather, his talk this year was sort of a continuation of his talk from last year. If you read Bejtlich's blog, you know he's a critical thinker and has made some valuable contributions to the field. He knows how to put together an engaging talk and a good slide deck to go with it.

I did have a slight "uh oh" moment during Bejtlich's address when he mentioned his concept of creating a National Digital Security Board. I have been reading Bejtlich's blog for years, but apparently missed that entry entirely. The "uh oh" was because that was pretty closely related to the theme of my panel presentation. In a nutshell, my take was that incident responders ought to be much more open about what we're dealing with in the same way that the National Transportation Safety Board publishes over 2000 reports each year regarding transportation failures.

Following Bejtlich was an excellent talk by Kris Harms called "Evil or Not? Rapid Confirmation of Compromised Hosts Via Live Incident Response". I appreciated Harms' talk because it was basically a talk from the trenches, very nuts and bolts, covering a list of different techniques and tools that incident responders can use to quickly assess a potentially compromised system.

Harms covered many of the tools commonly used by incident responders, but I picked up a few new tactics. One was his use of Sysinternals Autoruns and its ability to sort out signed and unsigned code. Certainly a criminal could go to the trouble of signing malicious code, and many companies produce legitimate code that's unsigned, but one of the things incident responders, like forensic examiners, need to do is quickly reduce the amount of data to parse, and this is one possible technique.
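As a rough illustration of that data-reduction step, the console version of Autoruns (autorunsc) can emit CSV and verify signatures, and verified entries carry a signer value beginning with "(Verified)". The sketch below filters such output down to the unverified entries; the exact column names are an assumption on my part, so check the header row your version actually produces.

```python
import csv
import io

def unsigned_entries(autoruns_csv_text, signer_field="Signer"):
    """Return autorun entries whose code-signing check did not verify.

    Assumes CSV output in the style of autorunsc, where verified
    entries have a signer value starting with "(Verified)". Column
    names vary by version, hence the signer_field parameter.
    """
    reader = csv.DictReader(io.StringIO(autoruns_csv_text))
    return [row for row in reader
            if not row.get(signer_field, "").startswith("(Verified)")]

# Synthetic sample in the assumed format, not real autorunsc output.
sample = """Entry,Image Path,Signer
GoodSvc,c:\\windows\\system32\\good.exe,(Verified) Microsoft Windows
OddSvc,c:\\temp\\odd.exe,(Not verified) Unknown Publisher
"""
suspects = unsigned_entries(sample)
```

The point is not the parsing itself but the triage mindset: start with the entries nobody has vouched for.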

Harms also spoke about the Advanced Persistent Threat, something we should all be giving more attention. APTs frequently make use of rootkit technologies to hide themselves. Harms gave some examples of using Handle, another Sysinternals tool most IR folks have used; it was interesting to note that Handle can be used to work backwards from open files to process IDs, and that rootkits usually aren't able to hide themselves from this backwards approach.
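The "backwards" direction is worth internalizing: instead of starting from a process list (which a rootkit may have filtered), you start from an artifact the attacker can't easily hide and ask which processes hold it. On Windows you'd run Handle itself; the Linux sketch below shows the same idea using /proc, purely to make the direction of the lookup concrete.

```python
import os

def pids_with_file_open(target_path):
    """Map an open file back to the processes holding it open.

    This is the same file-to-process direction Handle supports on
    Windows; here it's sketched with Linux /proc, where each process
    exposes its open file descriptors as symlinks under /proc/PID/fd.
    """
    target = os.path.realpath(target_path)
    owners = []
    for pid in filter(str.isdigit, os.listdir("/proc")):
        fd_dir = "/proc/%s/fd" % pid
        try:
            for fd in os.listdir(fd_dir):
                if os.path.realpath(os.path.join(fd_dir, fd)) == target:
                    owners.append(int(pid))
                    break
        except OSError:  # process exited, or we lack permission
            continue
    return owners
```

Run as root, this enumerates every process with the target file open, whether or not that process shows up in a (potentially subverted) task list.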

Harms' talk alone would benefit any incident responder, but the Summit was just getting started. After Harms, a panel of incident responders took the stage, including Harlan Carvey, Harms, Chris Pogue, Ken Bradley and myself. Each member of the panel gave a lightning-style talk answering a question of his choosing. The general consensus of the group was that the best tool for incident responders is still the gray matter between one's ears. Following the panelists' presentations, members of the audience had a chance to ask questions. This format was followed for all the panels during the Summit, and it's a great opportunity for practitioners to pick the brains of leading experts.

Following lunch, Carvey took the stage again and talked registry forensics. When it comes to the Windows registry, Carvey has done more for the Windows IR and forensics community than any other individual. If you are an incident responder or digital investigator and haven't picked up a copy of his book, you really should purchase one, or watch the SANS Forensics Blog, where we'll be giving away a few copies courtesy of Syngress Publishing. Carvey has written some great tools for pulling useful information from the registry and has made them freely available on his web site. One thing he said during his talk, and that is repeated in his book, is that the Windows registry is a log file. Given that keys have last-write time stamps, this is true and can be very useful for making a case. Carvey's a great speaker; if you have a chance to see him talk, don't pass it up.
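Those last-write time stamps are stored as Windows FILETIME values: counts of 100-nanosecond intervals since January 1, 1601 (UTC). Converting them to something readable is the first step in treating the registry as a log, and the conversion is simple enough to sketch:

```python
from datetime import datetime, timedelta

# Windows FILETIME counts 100-nanosecond ticks since 1601-01-01 UTC.
# Registry key last-write times use this format, so converting it is
# a staple of registry timeline work.
EPOCH_1601 = datetime(1601, 1, 1)

def filetime_to_datetime(filetime):
    # 10 ticks of 100 ns each make one microsecond.
    return EPOCH_1601 + timedelta(microseconds=filetime // 10)

# 116444736000000000 ticks after 1601 is the Unix epoch, 1970-01-01.
unix_epoch = filetime_to_datetime(116444736000000000)
```

Sort a hive's keys by these converted values and you effectively have a timeline of when each key was last touched, which is exactly the "registry as log file" idea.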

Following Carvey was another panel discussion, this one on essential forensics tools and techniques. Jesse Kornblum spoke about dealing with foreign languages in malware. Kornblum has an amazing mind and has made many great contributions to the field; hearing him speak was another of the many highlights of the Summit. Troy Larson answered the question, "What forensic tool needs to be created that doesn't exist yet?" His answer was "a tool to perform intelligent network imaging of volume shadow copies." If you don't know, volume shadow copies are bit-level diffs of the clusters on your Windows Vista and later volumes. Obviously, there's a wealth of useful data in there, but as yet, getting at it is a labor-intensive process, and sadly many practitioners don't even bother.

Also on the panel was Mark McKinnon of RedWolf Computer Forensics, author of numerous forensics tools including Skype Log Parser, CSC Parser (for offline files) as well as a number of parsers for a variety of browsers. McKinnon answered the question "What are 2-3 major challenges that investigators now face or will face in the near future?" His answer was the astounding amount of new software and hardware that is flooding the market including the latest smart phones, gaming consoles, Google Wave, etc.

Jess Garcia from One eSecurity spoke about using different tools and different approaches depending on the type of case being worked. On one of his slides he mentioned cases involving cloud providers. I can just imagine the headaches that's going to present in the future.

At the end of the panel, Rob Lee asked the panelists what their favorite forensics tool was or what they used most often, and I believe every one of them said X-Ways Forensics and WinHex.

Next, Jamie Butler and Peter Silberman from Mandiant spoke about memory forensics and ran through some demos. On the day of their talk, they also released new versions of Memoryze and Audit Viewer. These two are whip smart and it was great to see their work in action.

The writing has been on the wall for a few years now that collecting memory dumps could replace a number of more traditional live response steps, and with the advances these tools bring, there should no longer be any doubt that collecting memory should be the first step in any incident response. There are bits of information you can get from memory that you can't get any other way, such as time stamps for socket connections, to say nothing of memory-resident malware. Memory analysis is the future, and the future is here now (though it may not be evenly distributed, as has been said).

Even if you're dealing with a system that doesn't currently have good analysis tools available for its memory dumps, don't underestimate the ability of geniuses like Butler and Silberman to create tools that may one day help your case and in the meantime, there's still scads of information you can glean from a simple strings search.
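A strings search over a dump is about as simple as analysis gets, which is exactly why it's worth doing when nothing better exists for the platform. The sketch below mimics the Unix `strings` tool: pull out every run of printable ASCII of some minimum length and eyeball the results for URLs, file paths, command lines and the like.

```python
import re

def ascii_strings(data, min_len=4):
    """Extract printable ASCII runs from a binary blob, like the Unix
    `strings` tool -- often the quickest first pass over a memory dump
    when no structured analysis tool exists for the platform.
    """
    # Printable ASCII spans 0x20 (space) through 0x7e (tilde).
    pattern = rb"[\x20-\x7e]{%d,}" % min_len
    return [m.decode("ascii") for m in re.findall(pattern, data)]

# A made-up fragment of "memory" with two recoverable strings.
dump = b"\x00\x01MZ\x90http://evil.example/c2\x00\xffcmd.exe /c\x00"
found = ascii_strings(dump)
```

In practice you'd also scan for UTF-16LE strings (ASCII bytes interleaved with nulls), since so much Windows-internal text is stored that way.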

Following Butler and Silberman, Brendan Dolan-Gavitt, a graduate student at Georgia Tech and a contributor to Volatility, talked about and demoed some of his work parsing registry structures from memory dumps.

At that point, my brain was pretty full, so I checked out for a bit and went to dinner, but made it back in time to catch the live recording of Cyberspeak. It was fun to watch the show, and there was some great discussion between Ovie Carroll, Larson and Craig Ball. I wish the members of the audience participating in the discussion could have been mic'd, because there were lots of smart comments.

All in all, it was an amazing day. This was only my second time in Washington, D.C. (my other visit was for Shmoocon), and I considered cutting out to go do some sightseeing, until I got there and realized there was going to be some world-class content that no one in their right mind would want to miss.

I know it's the intention of the organizers to post as many of the presentations as possible, but as of this writing the files aren't available. Watch the SANS Forensics Blog for an announcement once the presentations are posted.

I'll post my day two recap in the next few days.

Tuesday, July 7, 2009

SANS Forensics Summit

Rob Lee invited me to participate on the Incident Response panel at the SANS Forensics Summit. The panel consisted of some very well known and well respected experts in the field like Harlan Carvey, Kris Harms, Chris Pogue and Ken Bradley. Needless to say, it was a real privilege for me to be on the panel with these guys.

As panel members, we were each tasked with answering a question about incident response. Rob raised the questions, but gave us the option to answer a question of our own choosing based on the theme of the Summit. In a nutshell, the theme was: given all the great advances that have been made in incident response and forensics, what are the new essential techniques, tools and methods that incident handlers and forensic investigators should be using in their work?

Never one to take what's given to me without twisting it a bit, I took liberties with the theme in an effort to convey what I think is one of the powerful new ideas in information security based on Adam Shostack's and Andrew Stewart's book, The New School of Information Security.

My question then was basically this: incident response has advanced greatly over the last decade, largely out of necessity, because information security operations is in pretty bad shape. As evidence, consider T.J. Maxx's loss of 94 million credit card numbers in 2007, T-Mobile's loss of 17 million records in 2008, the untold millions of records lost by Heartland Payment Systems and, of course, the countless smaller failures each year that don't get much attention. Given all of that, what should incident handlers be doing to help improve information security operations overall?

I had five minutes to answer this question. That's not much time and some of my argument was lost due to time so I wanted to publish the slides for my talk here so folks could download them and take a look at the presenter notes and hopefully get a feel for where I was coming from.

As for my answer, as I mentioned in my talk, I strongly believe that information security is like the person who lost their keys on a darkened street and was searching for them when a stranger came by and offered to help. After several minutes of looking and finding nothing, the stranger asked, "Are you sure you lost them here?" And the person responded, "Actually, I lost them up the street, but the light is better here."

Too many information security operations teams are spending valuable cycles on the wrong things and it's not necessarily their fault. If you believe Shostack and Stewart, it's because we don't have adequate data to approach our tasks in a more scientific way. Info sec is currently being practiced as more of an art than a science and until we start gathering good metrics about failures, we may continue to focus on the wrong things.

So, I put the charge out for incident response firms to be more open about the failures they are seeing and to follow Verizon's lead and in fact, exceed it. We need more details about some of the most sophisticated and successful attacks. We need to know exactly how our defenses are failing. Data breach notification laws are well and good, but they generally give us very little insight into what went wrong.

And with that intro, here's the presentation. You'll want to open it in a new window and click the option to view the presenter notes on the Actions menu, otherwise it's mostly old photos:
