Challenges faced by automated web application security assessment tools By Robert Auger (11/11/2006)
There are many challenges facing web application security scanners that are widely known within the industry, but may not be so obvious to someone evaluating a product. For starters, if you think you can just download, install, and run a product against any site and get a report outlining all of its risks, you'd probably be wrong. These sorts of tools ship configured as well as possible for most sites, but may not be configured 'out of the box' in the best way for YOUR site. In defense of these vendors, they can't possibly know every single situation that can occur, since site behaviors aren't strictly defined by industry standards and are often improvised for each site's unique needs. Good tools should allow you to configure various options so that you can properly adjust them for your site. Below are some of the top issues these products face which may hinder you from performing an automated assessment against your own site.
Session State Management
Cookies and other state tracking mechanisms (such as session identifiers embedded in a URL) are used to track who you are and what you're doing on a site. This is by far one of the most difficult problems facing anyone auditing a web-based application, and it is particularly difficult for vendors because developers implement session tracking in their own way. A common problem automated products face is simply staying 'logged into' a website: when you send an attack against an application parameter, you may end up invalidating your session token, which can cause you to become logged out. Another problem arises when multiple requests sharing a session token are sent at the same time. They often invalidate the session during the attack phase, and you end up losing attack requests or resending them. Typically the best approach to this problem is to throttle these products down and ensure only one request is sent at a time. The disadvantage of one request at a time is how long the scan then takes. If you have the patience and need to scan a site that requires logging in, this may be your best bet.
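Even at one request at a time, the scanner still needs to notice when an attack has killed its session so it can re-authenticate before continuing. A minimal sketch of that logout-detection heuristic, assuming invented status codes and page markers (a real scanner would make both configurable per site):

```python
# Sketch: notice when an attack request has invalidated the scanner's session,
# so the scan can pause and re-authenticate before continuing.
# The status codes and logout markers below are assumptions -- a real scanner
# would make both configurable for the target site.

LOGOUT_MARKERS = ("please log in", "session expired", 'name="login"')

def session_dropped(status: int, body: str) -> bool:
    """Heuristic check that a response indicates we've been logged out."""
    if status in (401, 403):
        return True
    lowered = body.lower()
    return any(marker in lowered for marker in LOGOUT_MARKERS)
```

When this fires mid-scan, the tool replays its recorded login step and resends the attack request instead of silently losing it.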
Required Page Flows
Some websites require a visitor to traverse the site in a certain order before allowing them to perform a function. A good example of this is the 'checking out' feature of a website when placing an order. Crawlers, on the other hand, simply fetch a page, identify the links in it, and fetch those in turn; they have no concept of filling the shopping cart before attempting to check out. The OWASP folks have released a vulnerable website package known as WebGoat which demonstrates some of these logical-order flaws. At this time, websites implementing these sorts of behaviors are best left to the humans.
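One partial workaround is to record the required sequence by hand and have the scanner replay it before attacking a later step. A minimal sketch of that idea, where the HTTP methods, paths, and parameters are invented examples:

```python
# Sketch: a hand-recorded page flow the scanner replays in order before
# attacking a later step. The methods, paths, and parameters are invented.

FLOW = [
    ("POST", "/login",    {"user": "test", "pass": "secret"}),
    ("POST", "/cart/add", {"item": "42", "qty": "1"}),
    ("GET",  "/checkout", {}),
]

def steps_before(flow, target_path):
    """Return the prerequisite steps to replay before the target is reachable."""
    for index, (_method, path, _params) in enumerate(flow):
        if path == target_path:
            return flow[:index]
    return []  # target isn't part of a recorded flow; no prerequisites known
```

This still requires a human to record the flow in the first place, which is exactly the limitation described above.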
URL Structure and Parameter Identification
Certain websites contain simple URLs such as http://host/foo.php?id=1 while others contain URLs that seem to reach the bounds of one's imagination. One of the biggest challenges you can face while auditing a web-based application is identifying parameters to perform attacks against. Since there isn't any enforcement of what URLs should look like, people do whatever they like. One oddball example is the use of custom delimiters, also known as parameter separators: a URL of index?name=foo&age=12 could be converted to index?name=fooQWEage=12, where QWE is the custom delimiter. Some applications may also modify their URL structure based on the value of a single parameter. For example, '/index?func=View' may perform one function, but if changed to '/index?func=Sell' it may require an additional parameter named "id", forcing the user to request '/index?func=Sell&id=12345'. If your website doesn't follow 'normal' looking URL structuring, manual work may be required.
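A scanner that only splits on '&' and '=' would see the whole QWE-delimited string as a single parameter and never attack 'name' or 'age' individually. A small sketch of a parser with a configurable separator, which is the kind of option a good tool should expose:

```python
def parse_params(query: str, delimiter: str = "&", assign: str = "=") -> dict:
    """Split a query string into parameters using a configurable separator.

    The defaults match standard URLs; pass delimiter="QWE" for the oddball
    site described above.
    """
    params = {}
    for pair in query.split(delimiter):
        name, found, value = pair.partition(assign)
        if found:  # skip fragments with no name/value assignment
            params[name] = value
    return params
```

With the delimiter set to "QWE", index?name=fooQWEage=12 yields the same two attackable parameters as the standard form.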
Privilege Escalation Testing
Wouldn't it be nice to provide a list of site accounts, along with their access levels, to someone (or something) and get back a list of what each account can actually do? It sure would! Unfortunately, 'at this time' the only way you're going to pull this off is by using a human. Applications are often too custom to be easily automated in this way, which is a substantial limitation facing automated scanning products. Automated escalation testing of certain popular commercial packages (for starters) is something I hope we'll see in the near future.
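To be fair, the comparison step itself is easy to automate; what can't be automated for a custom application is building the map of what each account *should* be able to reach. A sketch of just the diff step, with hypothetical account names and paths:

```python
# Sketch: compare what each account was allowed to see against what it actually
# reached during a scan. The account names, paths, and the allowed-access map
# are hypothetical -- producing that map for a custom application is the part
# that still needs a human.

def escalation_candidates(allowed: dict, reached: dict) -> dict:
    """Return, per account, any pages reached beyond its allowed set."""
    findings = {}
    for account, permitted in allowed.items():
        extra = reached.get(account, set()) - permitted
        if extra:
            findings[account] = sorted(extra)
    return findings
```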
False Positives
I admit it can be difficult to write a vulnerability signature that accurately flags an issue without actually exploiting it. Certain vulnerabilities are tricky to reproduce every time, and because of this their signatures may be based on behavior or on a version banner. Depending on the vulnerability, actively exploiting it could cause an application to break or work in undesirable ways. The general advice is not to scan production sites, but unfortunately not everyone has that luxury. Due to the nature of certain vulnerabilities, false positives are here to stay.
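Banner-based signatures are the least intrusive option, and also among the most false-positive-prone, since a patched build can keep advertising an old version string. A sketch of what such a check looks like, using an invented product name and version cutoff:

```python
import re

# Sketch: a passive, banner-based signature. "ExampleServer" and the fixed-in
# version are invented for illustration. Checks like this never touch the
# vulnerability itself, which is exactly why they can flag an already-patched
# server that still reports an old banner.

def banner_vulnerable(server_header: str, product: str, fixed_in: tuple) -> bool:
    """Flag the product if the advertised version predates the fix."""
    match = re.search(re.escape(product) + r"/(\d+)\.(\d+)", server_header)
    if not match:
        return False
    version = (int(match.group(1)), int(match.group(2)))
    return version < fixed_in
```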
While this document isn't an exhaustive review of all the weaknesses (or strengths, for that matter) of these products, I do hope it adjusts the reader's expectations of these tools and provides a list of concerns and issues that will 'probably' come up during an evaluation. Full disclosure: I have worked for such a vendor in the past, but this document does not target any particular product or company. Comments and questions may be directed to the author of this document using the comment form below.