Thursday, July 16, 2009

2009 SANS Forensics Summit Recap: Day Two

In my previous post I recapped day one of the 2009 SANS Forensics Summit. In this post, I'll continue with coverage of day two, but first I should say that I cut out for a few hours during day two to have lunch with my friend mubix from Room362.com, so I apologize in advance for not being able to comment on the sessions I missed.

Ovie Carroll, Director of the Cybercrime Lab at the U.S. Department of Justice's Computer Crime and Intellectual Property Section, started off day two. Carroll is co-host of the Cyberspeak podcast and, like Richard Bejtlich, gives a great presentation accompanied by an entertaining slide deck. One of the things I really liked about Carroll's presentation was that he took time to update it with information that had been presented the previous day. There weren't many updates, but it was nice to see that he found the previous day's content as valuable as I did, and the fact that he took the time to make the updates at all showed how much he cares about the subject matter.

Carroll spoke about trends in and the future of forensics from a law enforcement perspective. One of the key takeaways from Carroll's talk was that there is a mountain of work facing law enforcement and they are having difficulty keeping up. He mentioned that it is not uncommon for some agencies to have systems in their possession for 18 months before they get a look at them. Having worked for defense attorneys (prosecutors never call me) for a number of years now, I haven't seen delays quite that long, but I don't doubt it for some agencies.

Clearly there are a number of factors contributing to the delay. First, law enforcement is interested in analyzing computers even in traditional crimes because they have found so much good evidence on people's hard drives. Second, there simply aren't enough people doing this work; between the lack of qualified personnel and budget constraints, the problem likely won't go away, ever. Lastly, and no less importantly, there's just a ton of information being produced each year in this digital age. Carroll said that in 2008 more content was produced online than humanity produced in traditional forms (paper and ink) over the last 5000 years. Sure, not all of that data is relevant to case work, but some of it is, and it takes time to analyze what's relevant.

Carroll has been advocating a phased approach for a while now and he repeated the call during his talk. Law enforcement agencies should take a triage approach and try to build enough of a case, without completely analyzing every system, to get suspects to plea bargain and thus clear out some of the caseload, at least for the milder offenders. This is something I've told students as well: yes, we want to analyze every piece of evidence we collect, unless of course we can build a strong case without all that comprehensive work and short-circuit the process through a plea bargain.

One more thing about Carroll: he's funny. You want that in a morning speaker.

Following Carroll, Chris Kelly, Managing Attorney for the Cybercrime Division of the Massachusetts Attorney General's Office, addressed the audience. Kelly had some great stories about really stupid criminals; the ones who get caught generally are, but one guy rose above the rest by snapping a picture of himself with someone's cell phone while he was in the act of robbing that someone's home. Good times.

Kelly started out talking about how much things have changed in the cybercrime world. We've gone from phone phreakers, defacements and obnoxious worms to organized criminal networks, terrorism and traditional crimes that involve computers as sources of evidence. As an example of the latter, consider a case in my area where a college professor was convicted of killing his ex-wife. One piece of evidence found on his home computer was search history about ways to kill people. He claimed he was doing research for a novel. Along these lines, Kelly brought up the case of Neil Entwistle, who killed his wife and daughter. In his search history were queries about how to kill people.

Kelly also spoke about some of the training they are offering to law enforcement including the need for first responders to stop pulling the plug and to perform collection of volatile evidence. He played a hilarious clip from CSI of cell phone forensic analysis that had everyone in the room laughing.

At this point, I'm sorry to say, I had to cut out, but a user panel assembled to discuss aspects of forensics in law enforcement. The panel was to have included Carroll, Kelly, Andrew Bonillo, Special Agent/Computer Forensic Examiner at the U.S. Secret Service; Richard Brittson, retired detective, New York City Police Department; Jennifer Kolde, Computer Scientist with the FBI San Diego Division's National Security Cyber Squad; Cindy Murphy, detective, City of Madison, WI Police Department; Ken Privette, Special Agent in Charge of Digital Evidence Services, United States Postal Service Office of Inspector General; Paul J. Vitchock, Special Agent, Federal Bureau of Investigation, Washington Field Office; and Elizabeth Whitney, Forensic Computer Examiner, City-County Bureau of Identification, Raleigh, NC.

I apologize if I missed anyone on the list. Because I missed the panel, I'm going off the agenda, so some of these folks may not have been there and others may have been on the panel in their place. I have looked over some of the presentations that were given and I'm sure I missed some great content; as someone who frequently works opposite law enforcement, I wish I could have caught this panel.

After lunch, Dr. Doug White, Director of the FANS Lab at Roger Williams University, spoke about several different topics related to forensics and the courtroom, including some cases where admissibility of evidence came into play. I got a little lost at one point while White was speaking about this. His slides referred to U.S. v. Richardson, 583 F. Supp. 2d 694 (W.D. Pa. 2008), with the sub-bullet referring to a hacker defense, but the only thing I can find about the case online indicates that there were scoping issues with a warrant rather than a hacker defense.

White brought up another interesting case, U.S. v. Carter, 549 F. Supp. 2d (D. Nev. 2008), discussed here, where the IP address of a suspect's system was deemed circumstantial evidence and could not be used to tie an individual to the crime. Lesson for investigators: get as much supporting evidence as you can.

White talked about the Adam Walsh Act, which limits defense attorneys and experts to "reasonable access" to the evidence in cases that involve the exploitation of children. Reasonable access generally means at the law enforcement agency during normal business hours. This is a well-intentioned law that has cost me some business, but if it prevents children from suffering at the hands of incompetent practitioners who lose hard drives or otherwise leak evidence, then it's a good thing.

One great recommendation White made was to spend downtime coming up with simple ways to explain complex topics. There are lots of things those of us in tech take for granted, like IP addressing and NAT, but when we have to explain them to non-technical folks it can be difficult. Spending time to write clear, easy-to-understand explanations that can be quickly added to the appendix of a report saves time. It's like developers reusing code.

Following White's talk, Craig Ball, trial lawyer and forensic expert, touched on the same idea during his lightning talk as part of the panel on challenges in the courtroom. Ball is a wonderful presenter. He struck me as a very intelligent, thoughtful and friendly gentleman (he's a lawyer?!). Ball had a great slide deck loaded with graphics, including some simple animations he uses to explain complex topics, like how a hard drive works, in simple ways to members of the jury.

Ball also mentioned using visualization software for turning timelines into nice-looking charts. I believe he said he uses a product called TimeMap, but a quick search reveals there are quite a few different products on the market. Check out Ball's website, where he has loads of materials available for free. I would love to see Ball at work in the courtroom. I hope to catch him giving a longer presentation at some point in the future.

Also on the panel with Ball were White; Gary Kessler, Associate Professor of Computer and Digital Forensics and Director of the Center for Digital Investigation at Champlain College; Bret Padres, Director of Digital Forensics for Stroz Friedberg and co-host of the Cyberspeak podcast; and Larry Daniel, principal examiner for Guardian Digital Forensics. I may have missed someone. Dave Kleiman was on the agenda, but his slides aren't on the conference CD and I can't remember him being on the panel. This is not to say that if he was on the panel, he didn't have anything noteworthy to say; rather, it's a reflection on my own poor memory and the fact that I'm writing this more than a week after the event.

Kessler's question was to rank, in order of importance, the qualities an investigator should have and to explain his ranking. I liked Kessler's answer because he took the list given to him (analysis skills, acquisition skills, data recovery skills, report writing skills, which Kessler expanded to communication skills, law enforcement background, computer science background, problem solving skills, integrity and caution) and added his own qualities of curiosity, technical astuteness and tenacity.

Kessler's overall number one choice was integrity, something I happen to agree with. His least important qualities were a computer science background followed by a law enforcement background. For those of us in the field who lack a computer science degree and a law enforcement background, it's easy to agree with Kessler putting those at the bottom of the list. His second most important quality was technical astuteness. Oddly enough, I know some folks with computer science degrees who have difficulty with technology outside their narrow field of specialization.

Kessler made a point about good examiners that I've heard repeated by others in the field: so much of the job is about being tenacious. Hard problems are hard and many times there are no quick wins; the examiner who sticks with it and works through the adversity is the one you want working for you.

At this point, I had to cut out and catch a flight. All in all, this was the greatest incident response and forensics focused conference I've attended. If you work in the field, you should try to attend next year. This is not a normal SANS event; it's really a single-track conference bookended by the training that SANS is known for.

I hope to see you there next year.

Tuesday, July 14, 2009

2009 SANS Forensics Summit Recap: Day One


I had the great pleasure of attending and participating as a panelist in the 2009 SANS What Works Summit in Forensics and Incident Response. I covered my presentation in a previous post, but wanted to share my thoughts on some other aspects of this great event.

The Summit was a two day event, with day one focusing mostly on the more technical aspects of forensics and incident response. Day two's focus was more on the legal side of things, though Eoghan Casey of cmdlabs did give an excellent technical talk on mobile device forensics on day two.

Day one kicked off with a keynote address by Richard Bejtlich. This is the second year in a row Bejtlich has addressed the attendees and from what I gather, his talk this year was sort of a continuation of his talk from last year. If you read Bejtlich's blog, you know he's a critical thinker and has made some valuable contributions to the field. He knows how to put together an engaging talk and a good slide deck to go with it.

I did have a slight "uh oh" moment during Bejtlich's address when he mentioned his concept of creating a National Digital Security Board. I have been reading Bejtlich's blog for years, but apparently missed that entry entirely. The "uh oh" was because that was pretty closely related to the theme of my panel presentation. In a nutshell, my take was that incident responders ought to be much more open about what we're dealing with in the same way that the National Transportation Safety Board publishes over 2000 reports each year regarding transportation failures.

Following Bejtlich was an excellent talk by Kris Harms called "Evil or Not? Rapid Confirmation of Compromised Hosts Via Live Incident Response". I appreciated Harms' talk because it was basically a talk from the trenches, very nuts and bolts, covering a list of different techniques and tools that incident responders can use to quickly assess a potentially compromised system.

Harms covered many of the tools commonly used by incident responders, but I picked up a few new tactics. One of them was his use of Sysinternals Autoruns and its capacity to sort out signed and unsigned code. Certainly a criminal could go to the trouble of creating signed code, and many companies produce legit code that's unsigned, but one of the things incident responders, like forensic examiners, need to do is quickly reduce the amount of data to parse, and this is one possible technique.

Harms also spoke about the Advanced Persistent Threat, something we should all be giving more attention. APTs frequently make use of rootkit technologies to hide themselves. Harms gave some examples of using Handle, again a Sysinternals tool that most IR folks have used, but it was interesting to note that Handle can be used to work backwards from open files to process IDs and that rootkits usually aren't able to hide themselves from this backwards approach.
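That backwards mapping is easy to approximate on a live system. This is not Harms' tooling, just a minimal sketch using Python's psutil package (the file name fragment you search for is up to you); Handle from Sysinternals does the same job natively on Windows.

# Minimal sketch: work backwards from an open file to the processes holding it,
# similar in spirit to searching with Sysinternals Handle. Requires the psutil
# package; run with admin rights for the most complete view.
import sys
import psutil

def procs_with_open_file(fragment):
    """Yield (pid, name, path) for every process with an open file whose
    path contains the given fragment (case-insensitive)."""
    for proc in psutil.process_iter(['pid', 'name']):
        try:
            for f in proc.open_files():
                if fragment.lower() in f.path.lower():
                    yield proc.info['pid'], proc.info['name'], f.path
        except (psutil.AccessDenied, psutil.NoSuchProcess):
            continue  # some processes won't let us peek; skip them

if __name__ == '__main__':
    for pid, name, path in procs_with_open_file(sys.argv[1]):
        print(pid, name, path, sep='\t')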

Harms' talk alone would benefit any incident responder, but the Summit was just getting started. After Harms, a panel of incident responders took the stage, including Harlan Carvey, Harms, Chris Pogue, Ken Bradley and myself. Each member of the panel gave a lightning-style talk answering a question of their choosing. The general consensus of the group was that the best tool for incident responders is still the gray matter between one's ears. Following the panelists' presentations, members of the audience had a chance to ask questions. This format was followed for all the panels during the Summit. It's a great opportunity for practitioners to pick the brains of leading experts.

Following lunch, Carvey took the stage again and talked registry forensics. When it comes to the Windows registry, Carvey has done more for the Windows IR and forensics community than any other individual. If you are an incident responder or digital investigator and haven't picked up a copy of his book, you really should purchase one, or watch the SANS Forensics Blog where we'll be giving away a few copies courtesy of Syngress Publishing. Carvey has written some great tools for pulling useful information from the registry and has made them freely available from his web site. One thing he said during his talk, and that is repeated in his book, is that the Windows registry is a log file. Given that keys have last write timestamps, this is true and can be very useful for making a case. Carvey's a great speaker; if you have a chance to see him talk, don't pass it up.
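To make the "registry is a log file" idea concrete, here's a minimal sketch using Python's standard winreg module on a live Windows box that walks a key and prints last write times. The key path is just an example, not something Carvey specifically demonstrated.

# Minimal sketch: print the last write time of a registry key and its subkeys,
# illustrating why the registry can be read like a log. The key path below is
# only an example.
import winreg
from datetime import datetime, timedelta

WINDOWS_EPOCH = datetime(1601, 1, 1)  # FILETIME timestamps count from here

def walk(hive, path):
    with winreg.OpenKey(hive, path) as key:
        subkey_count, _, filetime = winreg.QueryInfoKey(key)
        last_write = WINDOWS_EPOCH + timedelta(microseconds=filetime / 10)
        print(f"{last_write} UTC\t{path}")
        for i in range(subkey_count):
            walk(hive, path + "\\" + winreg.EnumKey(key, i))

walk(winreg.HKEY_CURRENT_USER,
     r"Software\Microsoft\Windows\CurrentVersion\Run")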

Following Carvey was another panel discussion on essential forensics tools and techniques. Jesse Kornblum spoke about dealing with foreign languages in malware. Kornblum has an amazing mind and has made many great contributions to the field. Hearing him speak was another among the many highlights of the Summit. Troy Larson answered the question, "What forensic tool needs to be created that doesn't exist yet?" His answer was "a tool to perform intelligent network imaging of volume shadow copies." If you don't know, volume shadow copies are block-level diffs of the clusters on your Windows Vista and later volumes. Obviously there's a wealth of useful data in there, but as of yet, getting at the data is a labor-intensive process, and sadly many practitioners don't even bother.

Also on the panel was Mark McKinnon of RedWolf Computer Forensics, author of numerous forensics tools including Skype Log Parser, CSC Parser (for offline files) and a number of parsers for a variety of browsers. McKinnon answered the question, "What are 2-3 major challenges that investigators now face or will face in the near future?" His answer was the astounding amount of new software and hardware flooding the market, including the latest smart phones, gaming consoles, Google Wave, etc.

Jess Garcia from One eSecurity spoke about using different tools and different approaches depending on the type of case being worked. On one of his slides he mentioned cases involving cloud providers. I can just imagine the headaches that's going to present in the future.

At the end of the panel, Rob Lee asked the panelists what their favorite forensics tool was or what they used most often, and I believe every one of them said X-Ways Forensics and WinHex.

Next, Jamie Butler and Peter Silberman from Mandiant spoke about memory forensics and ran through some demos. On the day of their talk, they also released new versions of Memoryze and Audit Viewer. These two are whip smart and it was great to see their work in action.

The writing has been on the wall for a few years now that collecting memory dumps could replace a bunch of more traditional live response steps, and with the advances these tools bring, there should no longer be any doubt that collecting memory should be the first step in any incident response. There are bits of information you can get from memory that you can't get anywhere else, such as timestamps for socket connections, to say nothing of memory-resident malware. Memory analysis is the future, and the future is here now (though it may not be evenly distributed, as has been said).

Even if you're dealing with a system that doesn't currently have good analysis tools available for its memory dumps, don't underestimate the ability of geniuses like Butler and Silberman to create tools that may one day help your case. In the meantime, there are still scads of information you can glean from a simple strings search.
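Even that strings pass is easy to script. Here's a minimal sketch that pulls runs of printable ASCII out of a memory image; the file name is hypothetical, and for very large images you'd want to read in chunks rather than all at once.

# Minimal sketch: extract printable ASCII strings (4+ characters) from a memory
# image, roughly what the classic strings utility does. "memory.img" is a
# hypothetical file name; large images should be processed in chunks.
import re

PRINTABLE_RUN = re.compile(rb"[ -~]{4,}")  # runs of printable ASCII bytes

with open("memory.img", "rb") as f:
    data = f.read()

for match in PRINTABLE_RUN.finditer(data):
    print(match.group().decode("ascii"))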

Following Butler and Silberman, Brendan Dolan-Gavitt, a graduate student at Georgia Tech and a contributor to Volatility, talked about and demoed some of his work parsing registry structures from memory dumps.

At that point, my brain was pretty full, so I checked out for a bit and went to dinner, but made it back in time to catch the live recording of Cyberspeak. It was fun to watch the show, and there was some great discussion between Ovie Carroll, Larson and Craig Ball. I wish the members of the audience participating in the discussion could have been mic'd, because there were lots of smart comments.

All in all, it was an amazing day. This was only my second time in Washington, D.C., my other visit being for Shmoocon, and I had considered cutting out to do some sight-seeing, until I got there and realized there was going to be some world-class content that no one in their right mind would want to miss.

I know it's the intention of the organizers to post as many of the presentations as possible, but as of this writing the files aren't available. Watch the SANS Forensics Blog for an announcement once the presentations are posted.

I'll post my day two recap in the next few days.

Tuesday, July 7, 2009

SANS Forensics Summit

Rob Lee invited me to participate on the Incident Response panel at the SANS Forensics Summit. The panel consisted of some very well known and well respected experts in the field like Harlan Carvey, Kris Harms, Chris Pogue and Ken Bradley. Needless to say, it was a real privilege for me to be on the panel with these guys.

As panel members, we were each tasked with answering a question about incident response. Rob raised the questions, but gave us the option to answer a question of our own choosing based on the theme of the Summit. In a nutshell, the theme of the Summit was this: given all the great advances that have been made in incident response and forensics, what are the new essential techniques, tools and methods that incident handlers and forensic investigators should be using in their work?

Never one to take what's given to me without twisting it a bit, I took liberties with the theme in an effort to convey what I think is one of the powerful new ideas in information security based on Adam Shostack's and Andrew Stewart's book, The New School of Information Security.

My question, then, was basically this: incident response has advanced greatly over the last decade, largely out of necessity, because information security operations is in pretty bad shape. As evidence, consider T.J. Maxx's loss of 94 million credit card numbers in 2007, T-Mobile's loss of 17 million records in 2008, the untold millions of records lost by Heartland Payment Systems and, of course, the countless smaller failures each year that don't get much attention. Given all of that, what should incident handlers be doing to help improve information security operations overall?

I had five minutes to answer this question. That's not much time, and some of my argument was lost because of it, so I wanted to publish the slides for my talk here so folks could download them, take a look at the presenter notes and hopefully get a feel for where I was coming from.

As for my answer, as I mentioned in my talk, I strongly believe that information security is like the person who lost their keys on a darkened street and was searching for them when a stranger came by and offered to help. After several minutes of looking and finding nothing, the stranger asked, "Are you sure you lost them here?" And the person responded, "Actually, I lost them up the street, but the light is better here."

Too many information security operations teams are spending valuable cycles on the wrong things and it's not necessarily their fault. If you believe Shostack and Stewart, it's because we don't have adequate data to approach our tasks in a more scientific way. Info sec is currently being practiced as more of an art than a science and until we start gathering good metrics about failures, we may continue to focus on the wrong things.

So, I put the charge out for incident response firms to be more open about the failures they are seeing and to follow Verizon's lead and in fact, exceed it. We need more details about some of the most sophisticated and successful attacks. We need to know exactly how our defenses are failing. Data breach notification laws are well and good, but they generally give us very little insight into what went wrong.

And with that intro, here's the presentation. You'll want to open it in a new window and click the option to view the presenter notes on the Actions menu, otherwise it's mostly old photos:

Tuesday, June 30, 2009

How quickly we forget

Date: Thu, 8 Jan 2009 08:26:45 -0800 (PST)
From: Rob Lee
Reply-To: Rob Lee
Subject: Re: [GCFA] Compiling evidence boils down to a matter of time
To: Dave Hull
Cc: GCFA

Done. That takes care of Windows 7 and Windows Server 2008. Can you verify it can adjust all four timestamps or just a few of them? We can then add that to our list of known default programs. Also, can you document how it is used and what traces are left in its use?

What type of beer do you like and what is the next SANS conference you will be at?

--Rob


________________________________
From: Dave Hull
To: Rob Lee
Cc: GCFA
Sent: Thursday, January 8, 2009 11:11:17 AM
Subject: Re: [GCFA] [HTCC] Compiling evidence boils down to a matter of time

Interesting thread. Windows 7 and Windows Server 2008 ship with
Powershell. Powershell can be used to modify timestamps. See this
entry on my blog for more info:

http://trustedsignal.blogspot.com/search/label/timestamps

Where's my six pack? ;)

--
Dave Hull
Trusted Signal
CISSP, GCFA, GCIH, GREM, SSP-MPA, CHFI
Public key: http://trustedsignal.com/pubkey.txt
Fingerprint: 4B2B F3AD A9C2 B4E1 CBDF B86F D360 D00F C18D C71B

"Great minds discuss ideas; Average minds discuss events; Small minds
discuss people." -- Eleanor Roosevelt


Wednesday, June 24, 2009

From New School of Information Security to Incident Response

The SANS Forensics and Incident Response Summit is just around the corner. Judging by the agenda, it's going to be the best event for forensics and IR professionals in 2009.

Of course, I'm biased. Rob Lee, SANS' lead author for the forensics track, invited me to be a panelist for the Summit several months ago. He posted a list of questions that we should be prepared to answer during the incident response panel and gave us the option to come up with our own question based on the Summit's theme.

In a nutshell, the theme of the Summit is that over the last decade forensics and incident response have advanced greatly due to new tools and techniques. What are the new essential tools and methods that incident responders must have or use?

Again, I'm paraphrasing the theme.

From there, I'll be jumping back 100 years to look at a then-emerging high-tech field and some highlights (or rather low points) from its first 50 to 60 years, to see what lessons it might offer us and how those lessons relate to Adam Shostack's and Andrew Stewart's book, The New School of Information Security. Oh, and I've got five minutes to do it, so I'm gonna talk fast.

Aside from those five minutes, the Summit is going to be filled with legends in the field(s) and I'm really looking forward to hearing what they have to say.

The Summit is in two weeks and it's going to be amazing. Here's the registration link. Come and join us.

Tuesday, May 12, 2009

Teaching SANS 508 in San Antonio

If you're in the San Antonio area and have a need for forensics and incident response training, check out the SANS Security 508: Computer Forensics, Investigation and Response course that I will be teaching during the first week of June.

I have been working in incident response and computer forensics for about five years. It is an exciting and challenging field that changes rapidly. I have been reviewing Rob Lee's latest course revision and am amazed at the amount of new material in the course since I taught it last summer.

If you're not in San Antonio, but have a desire to take this training, SANS has launched a "Forensics Tour" that will be bringing 508 to several locations over the coming year. Check the Community Events link over at the forensics.sans.org site for more details.

If you're in San Antonio and want to meet up while I'm there, leave a comment. I'm going to try to attend the San Antonio Hackers gathering the week that I'm there.

Tuesday, April 21, 2009

Application Security Checklist

After the 2009 CWE/SANS Top 25 Most Dangerous Programming Errors was released, I started adapting it into a checklist that developers could use during the software development life cycle to facilitate the development of more secure code.

I had reviewed the Top 25 document when it was first released and thought it was pretty good; however, after really diving into it while creating the checklist, I came to appreciate it even more. It's more than a list of the top 25 errors: the document includes guidance for different phases of the software development life cycle, from the design phase through implementation and testing. Some of the entries include code samples clearly demonstrating the errors. It's very comprehensive and any developer would be well served by studying it.

In my seven years as a full time developer, I never worked in an environment that relied on checklists, so I did some searching and happened across an excellent blog post by Brian St. Pierre on the subject.

When creating a checklist, it's important that it be structured such that questions are "yes/no" with affirmative answers indicating a secure situation. Negative answers indicate a problem. This permits reviewers to quickly scan the list for negative answers and follow up on only those issues.

Of course checklists don't guarantee secure software. Developers can lie, or misunderstand, etc. But I've seen firsthand there is value in having developers put their names on a checklist and go through it. Obviously the hope is that they will fully understand the issues and answer truthfully, but just by reading the document, they may learn about the issues and improve their code.

With that, here's the checklist. Please let me know what you think and feel free to use it as you see fit. I owe special thanks to Steve Christey of MITRE for allowing this derivative work.

Sunday, April 12, 2009

Fuzzy Wuzzy WebScarab

I've spent the better part of the last year working on application security issues, including a major effort to implement a secure development life-cycle. I split my days between reviewing source code, application pen testing and developing materials for the SDL initiative.

My primary tools are WebScarab, Burp Suite, RatProxy, a commercial static code analysis tool, a variety of fuzzers, InGuardians' Samurai, Cygwin and some other minor tools.

WebScarab includes a fuzzer plugin that accepts regular expressions or file input for fuzz values and allows you to specify the inputs that you want to fuzz. As an example, let's look at how this works against OWASP's WebGoat, a deliberately vulnerable web application designed for training purposes.

You can use the fuzzer in WebScarab to fuzz POST or GET request parameters. In this example, we'll look at fuzzing a simple POST request. The following screenshot shows the page in WebGoat that we'll be using for this example:

This page has a web form that takes a single search term as input. Never mind the instructions on the WebGoat page; we're not working the exercise for this example, I'm merely using it to demonstrate WebScarab's fuzzer plugin.

When we enter a search term and submit it to the web server, we can view the request that is sent to the server using WebScarab as shown below:

The top portion of the WebScarab screen shows our request. In the middle pane you can see that the search parameter, called "Username" for some reason, is being submitted with a value of "lesson".

From the Summary tab in WebScarab, we can right-click on this particular POST request and select "Use as fuzz template" as you can see in the next screenshot:


Once you've selected the "Use as fuzz template" option, switch over to the Fuzzer tab in WebScarab. In the Fuzzer tab, you'll see you can fuzz more than just the POST parameters. Take a look at the next screen shot to see what I'm talking about:

Notice we can fuzz cookies, GET and POST parameters. For this example, we're going to try the Username parameter. The first thing to do is pick our fuzzing Source. We do this by clicking the "Sources" button which brings up the following window:

Notice you can select a file or enter a regular expression as a fuzz source. Using regular expressions is great for instances when you've got a numeric ID in an input and you want to try a range of other values. Say, for instance, you submitted a page in the app you're testing and you saw a UID=0013301 in the request. You could create a RegEx entry of [0-9][0-9][0-9][0-9][0-9][0-9][0-9], give it a description (e.g. "0000000 - 9999999"), click Add, then Close, and use that regular expression to submit requests to the web server using all possible seven digit UID values. Yes, that's going to take some time.
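As an aside, if you ever want to script that kind of numeric sweep outside WebScarab, here's a minimal sketch using Python's requests library. The URL and parameter name are made up for illustration, not taken from WebGoat.

# Minimal sketch: brute-force a seven digit numeric parameter, the same idea as
# the [0-9]x7 regex fuzz source. The URL and parameter name are hypothetical.
import requests

URL = "http://target.example/app/lookup"

for uid in range(10_000_000):          # 0000000 through 9999999
    resp = requests.post(URL, data={"UID": f"{uid:07d}"})
    # Log status and size; outliers are worth a closer look.
    print(f"{uid:07d} {resp.status_code} {len(resp.content)}")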

In this example, however, we're not dealing with numeric input, so this doesn't really apply. Let's choose a file for input. A good file to use for this is All_attack.txt from www.neurofuzz.com. You'll want to run this file through awk to strip out the descriptions of the attacks contained in the file. Here's what the file looks like without being stripped:

A:::Meta-Character Injection
TRUE:::Meta-Character Injection
FALSE:::Meta-Character Injection
0:::Meta-Character Injection(Integer)
00:::Meta-Character Injection(Integer)
1:::Meta-Character Injection(Integer)
-1:::Meta-Character Injection(Integer)
1.0:::Meta-Character Injection(Integer)


It's nice to see what the different types of attacks are, but that's a lot of extraneous data to be carrying around during a pen test. No worries, fire up the following awk command:

awk -F':::' '{print $1}' All_attack.txt > all_attack

You'll now be left with a file that has one attack per line, without the description of the attack. Adding this to your Sources in WebScarab is fairly intuitive, so I won't cover it here. Once you've added the source, you can select it as shown in the following screenshot:

After selecting the fuzz source, click the Start button at the bottom of the window and WebScarab will start POSTing using your regular expressions or lines from your file as inputs. Yes, you can fuzz more than one field at a time; you can also set the priority so that you can use different fuzz inputs for different fields, and you can specify multiple sources.

What you won't see in WebScarab is the response back from the server. For that, specify a save location for your WebScarab session data, open a terminal window in the "conversations" directory where the request and response data is saved, and start parsing through the requests and responses using your favorite tools and techniques (grep is your friend) to see what inputs have what effect on the responses from the server. One quick way to isolate the interesting ones is to look at file sizes for the responses. If you see big variations, you may have something interesting going on.
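Here's a minimal sketch of that size check. It assumes responses are saved in the conversations directory as files with names ending in "-response"; adjust the glob pattern if your WebScarab version names them differently.

# Minimal sketch: sort saved responses by size so outliers stand out.
# Assumes files named like "42-response" in ./conversations; adjust the
# pattern to match how your WebScarab version saves them.
from pathlib import Path

responses = sorted(Path("conversations").glob("*-response"),
                   key=lambda p: p.stat().st_size)

for path in responses:
    print(f"{path.stat().st_size:>10}  {path.name}")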

Sometimes WebScarab gets confused about the number of items it needs to fuzz. For example, the All_attack.txt file contains 362 lines as of this writing. But sometimes the fuzz template will submit a single request and quit. When that happens, reset the fuzz source in the drop down window for that parameter, click in the parameter field and then reselect the fuzz source. I've found that usually fixes the problem.

There are of course other fuzzing utilities available, some standalone, some not. Burp Suite Pro has a built-in fuzzer as well, and it may very well be superior to the one in WebScarab. I have more experience with WebScarab, so that's what I chose to use for this post. After I get some time in with Burp Suite Pro (gotta buy the Pro version), I'll post an entry on how it works.

Tuesday, April 7, 2009

SANS Forensics and Incident Response Summit 2009

I'm psyched to attend (and participate in) the SANS Forensics and Incident Response Summit 2009.

I've been editing and contributing to the SANS Forensics Blog since its inception in 2008, which has kept me in frequent contact with Rob Lee, Principal Consultant for MANDIANT, SANS Faculty Fellow and lead author for SANS' forensics track. As a result, I've known about the Summit for some time and knew there were some good things brewing. Needless to say, I was thrilled to be asked to participate.

I'll post additional information at a later date. It's late, I'm tired and need to be up way too early in the morning.

Wednesday, April 1, 2009

Computer Forensics Course in Kansas City

If you live in the Kansas City area and would like to learn or increase your existing knowledge of computer forensics, I'll be teaching SANS' Security 508: Computer Forensics, Investigation and Response at the KU Med Center, two hours a week for 10 weeks beginning May 14 through July 16.

I have taken many information security courses from a variety of vendors and SANS is by far my favorite. Their courses are jam-packed with useful information that even experienced professionals will be able to immediately apply, and 508 may be one of the most densely packed courses in the SANS curriculum.

Here's a list of some of the items covered:
  • NTFS, FAT32/16 and Ext2/3 file systems in depth
  • Acquisition and analysis of memory for responding to live systems
  • Acquisition of disk images, local and across the network
  • Timeline acquisition and analysis
  • A look at the different layers of information on a disk drive
  • Registry analysis
  • Application footprinting
  • A review of legal aspects relating to forensics and investigations
  • A comprehensive framework of the forensics process
  • And of course, much more
In addition, the course comes with the SANS Investigative Forensic Toolkit, a VMware image with all the best open source tools you need to conduct forensic investigations, and of course the course books.

It's a great course. You're sure to learn loads of useful techniques and meet other info sec professionals from your area, and covering the material over 10 weeks gives you a great chance to digest it over time. If you're interested, I encourage you to head over to the SANS web site and register. If you have any questions, please don't hesitate to contact me. And if you are interested but can't take the course, check out the blog (see below) and join the community.

As for myself, I've been conducting incident response and forensic investigations for more than five years. I'm a regular contributor to and editor of the SANS Forensics Blog. I've taught this course before and received high marks from the students. I'll bend over backwards to make myself available outside of class time and invite all students to contact me with questions any time, even after the course is over. I want you to be successful, to learn and to have fun and I won't be satisfied unless those objectives are met.

Friday, March 13, 2009

I must remove this log from my eye



Yes, apparently, I'm selling snake oil. I wish I could sell more of it or at least find out where all the payments are going.

If you don't know what this is about, well, this is the kind of "beat down" one gets for speaking one's mind on the internet. More details can be found over at Mubix's Room362.com and here.

Mubix is a great guy and a big contributor to the information security community. He is the one behind The Podcaster's Meetups and single-handedly brought us the fire talks at this year's Shmoocon.

I had reservations about contacting Rob to begin with and if I had it to do over again, would choose a different path.

Thursday, February 19, 2009

Steven Branigan has a great blog post about how to pick a pen tester. Branigan's point is that the trustworthiness of the pen testing team should be the number one criterion. I agree; I applaud his post, and it prodded me to follow up with this post about something that's been bugging me for the last few years.

I'm amazed at some of the highly regarded people in the information security arena who have publicly bragged about exploiting systems that they clearly did not have permission to exploit. They usually consider these exploits to be harmless pranks on systems of little import, but even the least important systems (kiosks seem to be a popular target) are still maintained by someone in IT somewhere, and when those systems are compromised, they still have to be pulled, re-imaged and redeployed. This takes time and costs money, there's an opportunity cost, and these costs are passed along to consumers.

As info sec professionals it is our job to defend information systems. During the course of our work, we learn how systems can be compromised. We have a responsibility to use that knowledge for good. If we take that knowledge and use it for pranks or to show off, then we become the very thing we are fighting against. We become a house divided.

It has been said many times before that the difference between a white hat and a black hat is permission. If you don't have permission and you're going around messing with vulnerable systems so that you can brag to your friends about it, then you've crossed the line and it gives a bad name to the entire industry. If you wanna play 733t hax0r and break into things, then you don't belong in the ranks of information security professionals.

All of this is not to say there is no need for legit vulnerability research, but there is a difference between original research to discover new vulnerabilities, or even non-original research in a controlled environment where you are working with permission to learn a new exploit, and the types of "bob stories" I'm referring to above. I'll even grant that there are some "bob stories" that demonstrate a vulnerability without going so far as to exploit the system in question, and I have no issue with that. But if your "bob story" involves exploitation without permission, do yourself a favor and keep it to yourself, or better yet, think twice and exercise a little self-control and a little professional ethics, and don't cross the line no matter how harmless you think your actions may be.

I'll get off my soap box now.

Sunday, February 8, 2009

Shmoocon remix

Last Friday my alarm woke me up at 3:30 a.m. I rolled out of bed, got dressed and hit the road for destination Shmoocon 2009. This is the first hacker con I've been to since a former employer sent me to Black Hat back in 2006.

Shmoocon exceeded my expectations. Here's a roundup of my experience at the con. No, this list of talks attended is not in chronological order; it's been recompiled from my notes, the program and a post-con report I put together for my employer.

I attended a talk by Arshan Dabirsiaghi and Jason Li about AntiSamy. It's an OWASP project that offers an API for validating "rich" user input for sites that allow users to input HTML and CSS but don't want to be vulnerable to XSS.

They briefly mentioned a tool that they were also releasing called Scrubbr. DBAs can point Scrubbr at MS SQL or MySQL databases and it will find fields that contain persistent XSS.

Julia Wolf of FireEye spoke about the company's work to bring down the Srizbi botnet. It was obvious from her slides that they had done some really great analysis of the malware and the inner workings of the botnet, very cool stuff.

Shane Lawson gave a hilarious presentation about defeating Kwikset's SmartKey. This one made me want a lock pick set to call my own.

I also introduced myself to Paul Asadoorian and Larry Pesce of PaulDotCom fame. Larry and David Lauer of the Security Justice Podcast showed off their ShmooBall Launchers.

After the day's talks were over, I headed over to the Podcasters Meetup. It was a great time and lots of prizes were given out. I introduced myself to Mubix; he did a great job arranging the meetup and the after-hours talks. I hope this can become a regular part of Shmoocon.

Following the fire talks, HacDC hosted a great after-hours party around two miles from the hotel. I'm a regular walker; I try to get out for at least a three-mile walk every day, so I knew walking to the party would be a snap. Navigating the whacked-out streets of DC, however, was a bit of an adventure. I live in a part of the country where the streets are on a nice east/west, north/south grid.

The party was in an old church and featured an open bar in exchange for donations at the door. Of course the highlight for me was meeting CrucialCarl, gattaca, rybolov and myrcurial at the bash. Rybolov brought his didgeridoo and flabongo to the party.

Walking home from the party was far more adventurous than walking to the party. I tried to walk back without the aid of my iPhone. Of course I missed a turn and ended up going a few blocks too far to the south, but I quickly got back on course and found the hotel without incident. To walk a city is to know a city.

On Saturday I sat through another great list of talks. Enno Rey and Daniel Mende of ERNW gave a pair of great talks. The first was on novel ways to build botnets. One technique involved finding and taking advantage of all the ignorant fools who have SNMP devices unprotected on the internet with private community strings exposed.

Their second talk of the day was on attacking internet backbone protocols. It's amazing how the same kinds of security issues that plague software development also affect core internet protocols and they are all based on misplaced assumptions about trust.

Shawn Moyer and Nathan Hamiel gave a hilarious talk about problems they've found in social networking sites. These two obviously love what they do, are good at their work and could likely make a second career in stand-up.

There was a lot of hallway talk and online bitching about Jay Beale and his vaporware, Middler (yes, it's out now, so stop complaining). His talk was good and it was awesome to see this tool finally released, even if it is a bit rough around the edges. I have downloaded the code and have been working through it.

Prior to Jay's talk, I finally met hevnsnt and surbo from i-hacked. We had a nice lunch at the Indian restaurant a block from the hotel. Great couple of guys, and I hope I can get more involved in all the stuff they are doing in the Kansas City area. If you haven't purchased a raffle ticket to help them cover the rent for their hacker space and possibly win yourself a MacBook, well, what are you waiting for? Go do it now!

I attended a few other talks, but am too exhausted now to keep going. I met a bunch of interesting and wicked smart folks that I've only known virtually and saw some old friends including Ed Skoudis and Kevin Johnson of InGuardians. I introduced myself to Johnny Long early on Sunday morning when he was setting up. He's everything everyone has ever told me he is, approachable, friendly and seems genuinely interested in what others have to say. I hope to get more involved with Hackers for Charity.

I paid my own way to Shmoocon and used vacation to cover my time off. I took good notes, and when I returned to work, I wrote up a little two-page report for my boss of all the things I learned or thought might help out our office. As a result, she rejected my time sheet and told me to replace my vacation days with non-project training. It would have been nice to get reimbursed for my travel and lodging, but I wanted to go to Shmoocon for myself, I'm glad I did, and I'm looking forward to next year.
