Saturday, October 25, 2008

Computer Forensics, Investigation and Response


I'm excited. For 10 weeks this summer I was privileged to teach SANS Security 508: Computer Forensics, Investigation and Response via the Mentor program. It is one of my favorite SANS courses for its depth and the extensive hands-on exercises. Unlike other forensics courses that teach specific tools without getting into what's going on behind the scenes, this course pulls back the curtain with an in-depth look at different file systems and how they store and organize data on disk.

Once we've covered the foundational materials, we introduce a comprehensive methodology that covers all the important aspects of conducting a successful investigation. There's even a day focused on legal issues.

If there's one problem with the course, it's the sheer volume of information to be digested. One nice thing about covering it over 10 weeks, as opposed to in six days, is that you get more time to take it all in, try things out, absorb the content and experiment with the tools and concepts.

I'm excited because I get to do it again starting in January. Full details are available at http://www.sans.org/mentor/details.php?nid=14464. If you live in the Kansas City area and are interested, please check it out. If you know someone else who may benefit from this, please spread the word.

Saturday, October 11, 2008

As seen on Twitter

# mdowd @alexsotirov The first year online voting is allowed, a cartoon character is going to win by 43 billion votes about 4 hours ago

# mdowd @alexsotirov Because I'm not interested in the common good! about 4 hours ago

# alexsotirov Why is it that all these source code audits of voting machines are done by university professors instead of Mark Dowd? about 4 hours ago

Tuesday, October 7, 2008

SANS' Web App Pen Testing In Depth Day Four

Day four rocked. It was exploit day, and sure, all the stuff leading up to exploitation is important, but there's nothing like the joy derived from breaking stuff. The exercises were more complicated, which I think is good, and, though I may be wrong about this, I believe there were more exercises on day four.

I played around with some interesting tools that I hadn't used before, BeEF for one, and Kevin talked about some tools and ideas that he and his colleagues at InGuardians are developing. What a great bunch of minds at that company. I aspire to be like the folks there and to work with a similar group of people.

The class wrapped up with an overview of the materials and the process. I'm excited for Kevin going forward. He's put together a good course and it's only going to get better when the six-day version comes out.

I stand by my earlier statements about Kevin. He's a great teacher and, judging from the two nights I had the good fortune to have dinner with him and hang out for a bit, he's a quality human being. I've had some brilliant instructors over the years who knew it, and the result was that they were not very approachable. Kevin is a fantastic instructor who, two days in a row, invited anyone from the class to join him for dinner.

If you're looking to get started in web app pen testing, or you've been doing it for a little while and aren't sure about your methods, I strongly recommend this course. I had some experience with web app pen testing prior to taking the course, so the first couple of days were mostly review with a few new nuggets here and there, but day three and especially day four really broke some new ground for me. The entire course also validated my own methods, and as an aspiring instructor it was great to watch Kevin teach the class. He's a natural, and I am looking forward to seeing him again at a future con. I hope I can make it to Shmoo in February and maybe catch him there if he attends.

Sunday, October 5, 2008

Day Three of SANS' 542: Web App Pen Testing In Depth

Day three was a run-through of discovery. Discovery is the process of finding vulnerabilities in the web application while stopping short, for the most part, of actually exploiting them.

By the way, it's really late and I should be asleep, so I'm keeping this post short. Or I intend to.

We looked at information leakage; this is one of the things I find most often in my own testing. Developers allow their applications to throw errors back to the user, and those errors leak information about the implementation of the application, such as the OS, the backend database, or other components of the system. Yes, this is bad. You should return generic error messages, preferably something with the right look and feel for your application. Many developers I work with on a regular basis simply redirect the user back to the start of the application when the app throws an exception. That is not as bad as leaking information, but it sucks for usability. Don't do it that way.
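To make the idea concrete, here's a minimal sketch (my own illustration, not something from the course) of the generic-error-page approach, written as plain Python WSGI middleware around a hypothetical app: log the full exception server-side and hand the user a bland page that reveals nothing about the stack.

    import logging
    import sys
    import traceback

    logging.basicConfig(filename="app-errors.log", level=logging.ERROR)

    GENERIC_PAGE = (b"<html><body><h1>Something went wrong.</h1>"
                    b"<p>Please try again later.</p></body></html>")

    class GenericErrorMiddleware:
        """Wrap any WSGI app so unhandled exceptions never reach the browser."""

        def __init__(self, app):
            self.app = app

        def __call__(self, environ, start_response):
            try:
                return self.app(environ, start_response)
            except Exception:
                # Full detail goes to the server log, not to the user.
                logging.error(traceback.format_exc())
                start_response("500 Internal Server Error",
                               [("Content-Type", "text/html")],
                               sys.exc_info())
                return [GENERIC_PAGE]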

We looked at username harvesting. This is something I find quite a bit in my work, and it's a difficult problem to overcome, as you'll discover if you spend much time thinking about ways to mitigate it. If you have an app where people can register for a new account, it's hard to prevent username harvesting for obvious reasons. Password resets, security questions and the like are another area where username harvesting is pretty common, but there it's generally more preventable. Account registration and creation is the biggie.

I brought this up with Kevin and he had an excellent suggestion. Don't prevent it. Detect it and block the attack. I'll be writing this up as a recommendation in the future.
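Here's a rough, hypothetical sketch of that detect-and-block idea (again my own illustration, not Kevin's implementation): watch how many distinct usernames a single source IP probes within a short window, and cut it off once the pattern looks like enumeration rather than someone mistyping their own name.

    import time
    from collections import defaultdict

    WINDOW_SECONDS = 300       # look at the last five minutes
    MAX_DISTINCT_NAMES = 10    # more than this from one IP smells like harvesting

    _attempts = defaultdict(list)   # source_ip -> [(timestamp, username), ...]
    _blocked = set()

    def record_lookup(source_ip, username):
        """Call this wherever the app checks whether a username exists
        (login, registration, password reset). Returns True if the
        source IP should be blocked."""
        if source_ip in _blocked:
            return True
        now = time.time()
        _attempts[source_ip].append((now, username))
        # Drop entries that have aged out of the window.
        _attempts[source_ip] = [(t, u) for t, u in _attempts[source_ip]
                                if now - t <= WINDOW_SECONDS]
        distinct_names = {u for _, u in _attempts[source_ip]}
        if len(distinct_names) > MAX_DISTINCT_NAMES:
            _blocked.add(source_ip)
            return True
        return False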

We looked at fuzzing applications using the Burp Suite and talked about Absinthe. I wish there had been an exercise for Absinthe. I have it installed on my pen testing box, but haven't used it yet.

Greasemonkey was introduced. I love Greasemonkey, though I've never used it for pen testing. I find it really useful for adding functionality to web interfaces. It rocks.

The last part of the day was a review of some of the newer developments on the web: namely, the Web Services Description Language (WSDL), Universal Description, Discovery and Integration (UDDI), the Simple Object Access Protocol (SOAP), AJAX and JSON. Frankly, I wish we could have spent an entire day on these areas alone.

You've all heard of Web 2.0. AJAX and JSON are two of the core components that drive Web 2.0, but many larger enterprises are only now beginning to roll them out, so many web app pen testers don't have much experience with them, including yours truly. I could use more info on these technologies and have it on my list to find out as much as I can in the coming weeks.
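As a trivial, made-up example of why this matters: a Web 2.0 app's AJAX endpoints often return raw JSON that the UI never fully displays, and those background calls deserve the same scrutiny as regular pages. The URL and fields below are purely hypothetical.

    import json
    import urllib.request

    # Hypothetical AJAX endpoint spotted while proxying the app.
    url = "http://app.example.com/ajax/userinfo?id=1001"

    with urllib.request.urlopen(url) as response:
        data = json.loads(response.read().decode("utf-8"))

    # Look for fields the UI never shows: internal IDs, roles, email
    # addresses, or data belonging to other users (try changing the id).
    for key, value in data.items():
        print(key, "->", value)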

All in all, another good day. Again, I wish the course had more hands-on exercises, and as I've mentioned previously, I know that's coming in the six-day version of the course. In fact, I had dinner with Kevin Johnson, the author of the course, and a couple of other students, and Kevin talked about how day six of the course is going to be a full-on web app pen test exercise from start to finish. If it includes all the whiz-bang Web 2.0 aspects, that will be really beneficial.

Now if you'll excuse me, I should have been asleep a couple hours ago. Good night now!

Saturday, October 4, 2008

Day Two: SANS Network Security 2008

Today was day two of SANS' Web App Penetration Testing In-Depth. The focus of day two was information gathering.

Scouting your client is an essential part of the process, and the course presents methods and tools for accomplishing this. Google is your friend. Johnny Long's Google Hacking For Penetration Testers will be a trusted companion. Some less well-known, but highly useful, methods were covered as well. If you've studied pen testing, you may already be familiar with some of these. If it seems I'm being vague, that's because I am. I respect the work that Kevin has put into developing the course and I'm not going to give it all away.

I do wish the course had more hands-on work with some of the tools that are presented. I know from talking to Kevin that the course is going to be expanded. Perhaps the expanded version will include more exercises with some of these tools.

After gathering useful info about the company, we drill down and gather details about the technologies that are being used. The usual tools for gathering information about systems were covered, but there were a few I had not used, including RSnake's Fierce.
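For readers who haven't seen Fierce: it does DNS reconnaissance against a target domain. The toy snippet below is not Fierce, just a bare-bones illustration of the underlying idea, guessing likely hostnames and seeing which ones resolve (the domain and names are placeholders).

    import socket

    domain = "example.com"    # placeholder target
    candidates = ["www", "mail", "vpn", "dev", "test", "intranet", "staging"]

    for name in candidates:
        host = "%s.%s" % (name, domain)
        try:
            addr = socket.gethostbyname(host)
            print(host, "->", addr)
        except socket.gaierror:
            pass    # doesn't resolve, move on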

There was a discussion of some issues relating to web application server architecture, some of the caveats that can throw off a pen tester, and ways to work around those obstacles.

We continued to drill down, from the servers to the apps on those servers and ways to gather useful information about those applications. Most of the tools discussed were ones I'd had experience with before. One exception was OWASP's DirBuster. I'd seen it before, but never bothered to try it out. Now I have and will incorporate it into my testing.
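DirBuster automates exactly this kind of guessing at scale, with big wordlists and multiple threads. As a rough illustration of the concept (again my own toy example, not the tool itself), the snippet below requests a handful of likely paths against a placeholder host and reports anything that doesn't come back as a 404.

    import urllib.request
    import urllib.error

    base = "http://app.example.com"    # placeholder target
    wordlist = ["admin", "backup", "old", "test", "logs", "config"]

    for word in wordlist:
        url = "%s/%s/" % (base, word)
        try:
            with urllib.request.urlopen(url) as resp:
                print(resp.status, url)
        except urllib.error.HTTPError as e:
            if e.code != 404:
                # 401s and 403s are often as interesting as 200s.
                print(e.code, url)
        except urllib.error.URLError:
            pass    # host unreachable, timeout, etc.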

All in all, another good day in class and I'm really looking forward to tomorrow when we enter the discovery phase where we will uncover weaknesses in the app.

Friday, October 3, 2008

SANS Network Security 2008

I don't want to come off like a fanboy, but I've been taking training in the information technology arena for more than a dozen years and from a variety of different sources, and SANS is better than any other organization I've trained with. In the interest of full disclosure, I have participated in SANS' Mentor program, but I am not an employee, nor do I have any affiliation beyond that.

I am currently in Las Vegas at Network Security 2008 attending Web Application Penetration Testing In-Depth developed and taught by Kevin Johnson of InGuardians, developer of BASE, Samurai and many other Open Source projects.

I wasn't sure I should take the course. I've been doing web app pen tests for a while. By no means am I an expert and I don't claim to know all there is to know, but I wasn't sure I would get enough from the course to make it worth my while. I'd say on a scale of one to five, five being an expert, I'm probably almost a four. You should know that one of my many flaws is that I consistently underestimate my abilities.

Day one didn't teach me very many new things about web application pen testing, but there were a few nuggets. However, based on day one, I am confident that over the next three days I will pick up many great insights that will make me more effective.

Johnson has put a tremendous effort into the course materials and he may be the best instructor I've ever had. He has a very friendly and knowledgeable approach. He's clearly a subject matter expert, but he has the right amount of self-effacing humor.

Based on what I've seen thus far, being in this course is going to have two great benefits. I will learn to be a better web app pen tester and will learn how to improve my teaching skills.

Wednesday, October 1, 2008

Feldman's product or idea maturity model

Photo courtesy of cybreton at flickr.com http://www.flickr.com/photos/cybreton/
I was listening to Gary McGraw's Silver Bullet Security Podcast Show 002 where Dan Geer was the victim.

If you haven't listened to the Silver Bullet Podcast, it is a series of interviews with information security luminaries. I find most of the guests to be fascinating and Geer was obviously no exception. He had many great things to say.

One of the things Geer mentioned during the show was that he used to attend talks by Stu Feldman, a computer scientist forged in the bowels of Bell Labs (what an amazing place that must have been to work). Feldman is the creator of the make utility and is currently the VP of Engineering at Google.

Apparently Feldman had this concept for evaluating the maturity or quality of an idea, concept or product. After the show, I searched for it online and couldn't find it, so I thought I'd share it here in hopes that maybe Google will index it and others could find it later and know who to attribute it to.

Feldman's model has five levels, so they can be counted on one hand. Here it is:

1. You have a good idea.
2. You can make it work.
3. You convince a gullible friend to try it.
4. People stop asking why you're doing it.
5. People start asking others why they aren't doing it.

That's it: Feldman's method for evaluating the maturity of a product or idea. With apologies to Mr. Feldman for publishing this without his permission.
