Quoting "Ankam, Arunkumar" <aankam@...>:
> Hi Rogan,
> Thanks for the immediate reply. Your answers were very
> useful to me. Can you please tell me how all ten OWASP points (A1
> to A10) can be achieved for those things which don't have form-based
> input. Please reply as early as possible.
> Arun kumar.
A1. Send all sorts of funny input to the server using WebScarab form intercepts,
manual requests, or scripting.
A2. Use the spider, the manual request and the scripting plugin to make requests
to identified resources, removing critical authentication parameters.
A3. Use the sessionid plugin to identify non-random sessionids. Replay requests
after a suitable timeout period to verify that the session does in fact expire.
A4. Similar to A1.
A5. Similar to A1, with long inputs.
A6. Similar to A1, focusing on apps that seem to interact with underlying
systems such as a database or the operating system.
A7. Use the scripting plugin to "post-process" the observed responses to
identify any that appear to contain errors. Also look for 5xx responses in the
conversation log.
A8. Not addressed.
A9. Not addressed.
A10. Not addressed.
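To make A3 a little more concrete, here is a minimal offline sketch (in Python, not WebScarab's own scripting) of the kind of predictability check the sessionid plugin performs. The `is_predictable` helper and the sample IDs are illustrative assumptions, not part of WebScarab:

```python
# Hypothetical sketch of a sessionid predictability check: collect a
# series of session IDs and see whether successive values differ by a
# constant, i.e. whether the next ID can be guessed from the last.

def is_predictable(session_ids, base=16):
    """Return True if successive numeric differences are all identical."""
    values = [int(s, base) for s in session_ids]
    diffs = {b - a for a, b in zip(values, values[1:])}
    return len(diffs) == 1

sequential = ["1000", "1001", "1002", "1003"]  # trivially guessable
randomish = ["9f3a", "0b71", "e24c", "55d0"]   # no obvious pattern

print(is_predictable(sequential))  # True
print(is_predictable(randomish))   # False
```

A real analysis would look at far more samples and at the distribution of the IDs, but a constant step like the first list is exactly the kind of weakness the plugin is meant to surface.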
P.S. Please keep traffic ON the list. That way I do not duplicate my responses,
and the other people on the list, and those searching later may find the
answers as well.
> -----Original Message-----
> From: lists@... [mailto:lists@...]
> Sent: Tuesday, March 15, 2005 2:02 AM
> To: Jamie Finnigan
> Cc: Ankam, Arunkumar; owasp-webscarab@...
> Subject: Re: [OWASP-WEBSCARAB] whether webscarab supports Cross site
> scripting,Buffer Overflows,SQLInjection,Forced Browsing
> Quoting Jamie Finnigan <jfinnigan@...>:
> > Hi Arun,
> > What you have listed are a number of common web application
> > vulnerabilities. As a proxy tool, Webscarab supports pretty much
> > anything web-application-related, allowing you to test for any of
> > these vulnerabilities.
> > Given the nature of the beast (a 'proxy' rather than an 'automated
> > application test tool' or similar) the test process is initially
> > manual.
> > For example, to test for buffer overflows:
> > - install Webscarab, run it, and configure Internet Explorer (or your
> > browser of choice) to use it as the proxy
> > - browse to a page within the application that contains the form
> > input you are testing, and watch the conversation appear in the log
> > screen (under the 'Summary' tab)
> > - turn on intercepts (under the 'Proxy' and 'Manual Edit' tabs) and
> > submit the form
> > - review the request that pops up in the intercepts window, and
> > manipulate input as you wish (eg. you might choose to change the value
> > of a field named 'userid' from 'arunk' to
> > '9999999999999999999999999999999999999999999'
> > - submit the modified request and observe the results in your browser
> > and in the conversation log
> > - repeat as considered necessary .. it's normally an iterative
> > process, repeatedly testing different input values on the same input
> > field
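The intercept-and-edit step above can also be sketched outside the proxy. This Python fragment is illustrative only (the 'userid' field name and the run of 43 nines follow the example above); it just shows what the modified form body looks like on the wire:

```python
# Build the same form body twice: once with the normal value and once
# with the overlong numeric value from the example above.
from urllib.parse import urlencode

normal = urlencode({"userid": "arunk"})
overlong = urlencode({"userid": "9" * 43})  # the long value from the example

print(normal)  # userid=arunk
print(overlong)
```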
> > A similar process will be used when testing for cross site scripting
> > and SQL injection, except with different values. For example:
> > - to test for cross site scripting you might replace the value of the
> > 'userid' input field 'arunk' with '<i>arunk</i>' or '<iframe
> > src=http://www.google.co.nz></iframe>'
> > - to test for SQL injection you might replace the value of the
> > 'userid' input field 'arunk' with "arunk' or 1=1;--" or similar.
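As a rough illustration of those substitutions, this small Python sketch lists the example payloads from above and shows how each is URL-encoded when it travels in a request (the payload list is taken from the values in this thread; nothing here is WebScarab-specific):

```python
# Show the URL-encoded form each test payload takes in a request.
from urllib.parse import quote

payloads = [
    "<i>arunk</i>",                                    # harmless markup: does it render?
    "<iframe src=http://www.google.co.nz></iframe>",   # XSS probe
    "arunk' or 1=1;--",                                # classic SQL injection probe
]
for p in payloads:
    print(quote(p))
```

If the markup in the first payload comes back rendered as italics, the field is reflecting input unescaped and the more aggressive payloads are worth trying.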
> > Once you have defined your testing process for an application you
> > could look at automating some of the testing using the scripting
> > functionality built in to Webscarab.
> > Hope this helps...
> > Jamie
> Thanks for the response, Jamie.
> To add to it, maybe a bit of philosophy about WebScarab, and some idea
> of where
> it is going (i.e. what I am working on at the moment!)
> The idea of WebScarab is to be a user interface to allow users to
> visualise the
> web site under review, and to interact with the website in a number of
> different ways.
> One way is to let the browser generate the requests that we send to the
> site under review. As mentioned by Jamie, this is a nice way of exploring the
> site and identifying applications, because the browser does a lot of the hard
> work of letting you navigate around.
> Another way is to create manual requests, either by hand crafting them
> in the
> "Manual Request" plugin, or by editing and replaying a previous request
> selected from the dropdown list.
> Another way is to let the Spider create requests based on the links that
> it has
> identified in the HTML responses it has seen, and that have not yet been
> requested.
> The last of the current methods of interacting with the site is using the
> "Scripted" plugin, which allows you to programmatically create requests that
> are submitted to the server, and do some analysis of the responses that come
> back. This is really great for things like fuzzing, as you can create
> requests based on a list of fuzz strings designed to generate XSS or SQL
> Injection errors. It also allows you to multi-thread the requests, so it
> can be
> a LOT faster than, say, shell and netcat. In the latest version released on
> Sourceforge (but not actually announced here!), I have enhanced the
> capabilities of the Scripted plugin, so that you can explore the
> existing site
> model/tree hierarchy, and build your requests automatically.
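The multi-threaded fuzzing idea looks roughly like this sketch (written in Python rather than the BeanShell the Scripted plugin actually uses; `send_request` is a hypothetical stand-in for whatever submits the request and returns the body):

```python
# Dispatch a list of fuzz strings across a pool of worker threads and
# collect, for each payload, whether the response looks suspicious.
from concurrent.futures import ThreadPoolExecutor

FUZZ_STRINGS = ["<script>alert(1)</script>", "' or 1=1;--", "A" * 5000]

def send_request(payload):
    # Stub: a real version would submit the payload to the target URL
    # and return the response body.
    return "simulated response for %r" % payload

def probe(payload):
    body = send_request(payload)
    return payload, "error" in body.lower()

with ThreadPoolExecutor(max_workers=10) as pool:
    results = list(pool.map(probe, FUZZ_STRINGS))

for payload, suspicious in results:
    print(payload[:30], suspicious)
```

Because the workers run concurrently, wall-clock time is dominated by the slowest in-flight requests rather than the sum of all of them, which is where the speedup over a shell-and-netcat loop comes from.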
> More excitingly, I am working on a new Fuzzer plugin, which
> identifies URLs that could possibly be applications, lists the various
> permutations of parameters that the URL has received, and will provide some
> level of automation of submitting fuzz strings to the various
> applications that
> have been identified. It will also identify pages that contain error messages
> (based on regex matches, most likely), which will help to identify pages that
> need more manual attention.
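A guess at what that regex-based error spotting might look like, sketched in Python; the pattern list and the `looks_broken` helper are my assumptions, not the plugin's actual implementation:

```python
# Scan response bodies for strings that commonly leak from backend
# errors (database messages, stack traces, etc.).
import re

ERROR_PATTERNS = re.compile(
    r"(SQL syntax|ODBC|ORA-\d{5}|stack trace|NullPointerException)",
    re.IGNORECASE,
)

def looks_broken(body):
    """Return the first suspicious fragment found in body, or None."""
    match = ERROR_PATTERNS.search(body)
    return match.group(1) if match else None

print(looks_broken("You have an error in your SQL syntax near 'arunk'"))
print(looks_broken("<html>Welcome, arunk</html>"))
```

Pages flagged this way are exactly the ones worth revisiting by hand with the intercept workflow described earlier in the thread.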
> I hope to present this new plugin at the OWASP Conference in London, so if
> anyone is going to be there, it will be great to put faces to names, or
> names to aliases! ;-)