[W3af-users] Crawl import issue
From: Mohammad A. <al...@ya...> - 2015-06-18 14:10:38
Hello all,

I have been testing the console version and have discovered a few areas that are not working for me. It is possible that I am doing something wrong, so I wanted to reach out to you all.

The application I am testing is very dynamic; almost the entire UI is generated via JavaScript. Therefore, I had to use spider_man to generate a list of requests to feed to the other plugins (i.e., audit, grep). I used a cookie jar file for authenticated access, to bypass the login flow (too complex, with too many redirects and CSRF tokens). The commands are listed below.

Here is what is not working:

1. The crawl import does not seem to import the requests. I have not seen anything in the debug log to indicate otherwise.
2. The fuzz parameters do not seem to do anything. I confirmed this by running with and without them. My expectation was that the imported file (generated via spider_man) would be leveraged for the fuzz requests.

I am unable to share the actual log files due to a confidentiality requirement. Any help would be much appreciated!

*****
plugins
crawl config import_results
set input_csv /home/cay/in.csv
back
infrastructure afd,allowed_methods,fingerprint_os,server_header
audit all,!memcachei,!preg_replace
grep all
output console,text_file,html_file
output config text_file
set output_file /home/cay/output.txt
set http_output_file /home/cay/o
set verbose True
back
output config console
set verbose True
back
output config html_file
set verbose False
set template /home/cay/complete.html
set output_file /home/cay/o.html
back
output config export_requests
set output_file /home/cay/out.csv
back
back
http-settings
set cookie_jar_file /home/cay/cookie7
back
misc-settings
set max_discovery_time 20
set fuzz_cookies True
set fuzz_form_files True
set fuzz_url_parts True
set fuzz_url_filenames True
back
target
set target <url>
back
cleanup
start
exit
*****
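In case the file format is part of the problem, this is the layout I assumed import_results expects for /home/cay/in.csv. The column names are my guess based on what export_requests writes, and the URLs below are placeholders rather than my real data (confidentiality again):

*****
HTTP-METHOD,URI,POSTDATA
GET,http://example.com/app/index.html,
POST,http://example.com/app/search,query=test
*****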
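For completeness, /home/cay/cookie7 follows what I understand to be the Netscape cookies.txt format; as far as I know w3af loads cookie_jar_file with Python's cookielib.MozillaCookieJar, but that is my assumption. The fields are tab-separated (domain, include-subdomains flag, path, secure flag, expiry as a Unix timestamp, name, value), and the values below are placeholders:

*****
# Netscape HTTP Cookie File
.example.com	TRUE	/	FALSE	1893456000	SESSIONID	0123456789abcdef
*****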
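To clarify what I expected from the fuzz settings: my understanding is that, for each imported request, fuzz_url_parts should inject payloads into the individual URL path segments (and fuzz_url_filenames should do the same for the filename component), roughly like this, where PAYLOAD stands in for whatever the audit plugins send:

*****
imported request:     GET http://example.com/app/section/item.php?id=1
with fuzz_url_parts:  GET http://example.com/app/PAYLOAD/item.php?id=1
                      GET http://example.com/PAYLOAD/section/item.php?id=1
*****

If that expectation is wrong, please let me know how the fuzz_* settings are meant to interact with imported requests.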