Web Crawler Security Tool

beta

A web crawler oriented to information security.

Download crawler_v1.0.1.tar.gz
Linux

Description

Last update on Tue Mar 26 16:25 UTC 2012

The Web Crawler Security Tool is a Python-based tool that automatically crawls a web site. It is a web crawler oriented to help in penetration testing tasks. Its main task is to find and list all the links (pages and files) in a web site.
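The core idea, fetching a page and pulling out its link targets, can be sketched in a few lines of Python. This is an illustrative simplification, not the tool's actual code (the real crawler also handles 'content' links and relative URL resolution):

```python
import re

# Hypothetical simplification: match the target of any href or src
# attribute, which covers pages (<a href=...>) and files (<img src=...>).
LINK_RE = re.compile(r"""(?:href|src)\s*=\s*["']([^"']+)["']""", re.IGNORECASE)

def extract_links(html):
    """Return every href/src target found in an HTML document."""
    return LINK_RE.findall(html)

page = '<a href="/about.html">About</a> <img src="logo.png">'
print(extract_links(page))  # ['/about.html', 'logo.png']
```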

The crawler was completely rewritten in v1.0, bringing a lot of improvements: better data visualization, an interactive option to download files, faster crawling, export of the list of found files into a separate file (useful to crawl a site once, then download the files and analyse them with FOCA), an output log in Common Log Format (CLF), basic authentication support and more!
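A CLF entry records the host, timestamp, request line, status code and response size on one line. A minimal Python sketch of such a formatter (a hypothetical helper, not the tool's actual code) could look like this:

```python
from datetime import datetime, timezone

def clf_line(host, method, path, status, size, when=None):
    """Format one request as a Common Log Format entry.

    CLF shape: host ident authuser [date] "request" status bytes
    The ident and authuser fields are left as '-' here.
    """
    when = when or datetime.now(timezone.utc)
    stamp = when.strftime("%d/%b/%Y:%H:%M:%S %z")
    return f'{host} - - [{stamp}] "{method} {path} HTTP/1.1" {status} {size}'

print(clf_line("127.0.0.1", "GET", "/index.html", 200, 1532))
```

Because the log uses a standard format, the output can be fed straight into common log-analysis tools.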

Many of the old features have been reimplemented; the most interesting one is the crawler's ability to search for directory indexing.
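Directory indexing can often be spotted from the response body alone, since server-generated listings carry a telltale title. The check below is an illustrative heuristic, not the tool's actual detection logic:

```python
import re

# Apache- and nginx-style auto-generated listings typically start with
# a title like "Index of /some/path" (heuristic assumption).
INDEXING_RE = re.compile(r"<title>\s*Index of /", re.IGNORECASE)

def has_directory_indexing(html):
    """Guess whether an HTML response is a server-generated directory listing."""
    return bool(INDEXING_RE.search(html))

print(has_directory_indexing("<title>Index of /backup</title>"))  # True
print(has_directory_indexing("<title>Home</title>"))              # False
```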

Web Crawler Security Tool Web Site

Features

  • Crawls HTTP and HTTPS web sites (even web sites not using common ports).
  • (new!) Lets you set the depth of the crawl (-C <depth> option).
  • (new!) Generates a summary at the end of the crawl with statistics about the results.
  • (new!) Uses the HEAD method to analyse file types before crawling, which significantly speeds up the crawler.
  • Uses regular expressions to find 'href', 'src' and 'content' links.
  • Identifies relative links.
  • Identifies non-HTML files and lists them.
  • Does not crawl non-HTML files.
  • Identifies directory indexing.
  • Crawls directories with indexing enabled. (not yet implemented in v1.0)
  • Press CTRL-C to stop the current crawler stage and continue working. Very useful.
  • Identifies all kinds of files by reading the Content-Type header field of the response.
  • Exports (-e option) a list of all file URLs found during crawling into a separate file.
  • Selects which file types to download (-d option), e.g. png,pdf,jpeg,gif or png,jpeg.
  • Selects interactively which file types to download (-i option).
  • Saves the downloaded files into a directory; the output directory is created only if there is at least one file to download.
  • Generates an output log in CLF (Common Log Format) of all the requests made during crawling.
  • (beta) Login with basic authentication. Feedback is welcome!
  • Tries to detect whether the web site uses a CMS (WordPress, Joomla, etc.). (not yet implemented in v1.0)
  • Looks for '.bk' or '.bak' copies of php, asp, aspx and jsp pages. (not yet implemented in v1.0)
  • Identifies and counts the unique web pages crawled. (not yet implemented in v1.0)
  • Identifies and counts the unique web pages crawled whose URLs contain parameters. (not yet implemented in v1.0)
  • Works on Windows, but does not save results yet.
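The HEAD-before-crawl idea from the feature list can be sketched as follows: issue a HEAD request, then decide from the Content-Type header whether the URL is worth fetching and parsing for links. Function names and the crawlable-type list here are assumptions for illustration, not the tool's API:

```python
from urllib.request import Request, urlopen

# Assumed set of content types worth parsing for links.
CRAWLABLE = ("text/html", "application/xhtml+xml")

def is_crawlable(content_type):
    """True if a Content-Type header value looks like an HTML page."""
    return content_type.split(";")[0].strip().lower() in CRAWLABLE

def should_crawl(url, timeout=5):
    """HEAD the URL and report whether its body should be crawled.

    A HEAD request transfers only headers, so large binaries
    (PDFs, images, archives) are skipped without downloading them.
    """
    head = Request(url, method="HEAD")
    with urlopen(head, timeout=timeout) as resp:
        return is_crawlable(resp.headers.get("Content-Type", ""))

print(is_crawlable("text/html; charset=utf-8"))  # True
print(is_crawlable("application/pdf"))           # False
```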

User Ratings

5.0 out of 5 (2 ratings)

User Reviews

  • nakintotheworld

    It's a fast web crawler, detects directory indexing and allows users to download found files, which can be used to analyze metadata. Very useful in information gathering.

    Posted 05/27/2011
  • sebagarcia

    It's the fastest web crawler out there!

    Posted 05/22/2011

Additional Project Details

Intended Audience

Information Technology

User Interface

Console/Terminal

Programming Language

Python

Registered

2011-05-18