crawlergo is a browser-based web crawler that collects URLs and request data for use by web vulnerability scanning tools. It renders pages in a headless Chrome environment and observes behavior during the DOM rendering stage to capture as many reachable endpoints as possible. By monitoring the page lifecycle and interacting with page elements, it automatically triggers the JavaScript events and navigation actions that would occur during real user interaction. It also fills and submits forms automatically, surfacing hidden routes and parameters that traditional crawlers often miss.

crawlergo includes a built-in URL de-duplication system that removes repeated and pseudo-static links while keeping crawls of large websites fast. It also analyzes page content to extract links and resources from multiple sources, including JavaScript files, HTML comments, and configuration files.
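To illustrate the pseudo-static de-duplication idea, here is a minimal Python sketch (crawlergo itself is written in Go, and its real algorithm is more sophisticated; the function names `url_signature` and `dedupe` are invented for this example). Each URL is reduced to a structural signature in which numeric path segments and query values are replaced with placeholders, so links like `/news/1.html` and `/news/2.html` collapse into one entry:

```python
import re
from urllib.parse import urlsplit, parse_qsl

def url_signature(url: str) -> tuple:
    """Reduce a URL to a structural signature (illustrative only).

    Digit runs in the path become a "{num}" placeholder, and query
    parameter values are dropped (only names are kept), so URLs that
    differ only in IDs map to the same signature."""
    parts = urlsplit(url)
    path = re.sub(r"\d+", "{num}", parts.path)
    params = tuple(sorted(k for k, _ in parse_qsl(parts.query)))
    return (parts.netloc, path, params)

def dedupe(urls):
    """Keep the first URL seen for each structural signature."""
    seen, unique = set(), []
    for url in urls:
        sig = url_signature(url)
        if sig not in seen:
            seen.add(sig)
            unique.append(url)
    return unique
```

With this scheme, `/item?id=1` and `/item?id=9` share a signature, so only the first is crawled, while structurally distinct paths are all retained.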
## Features
- Chrome headless browser rendering for accurate page crawling
- Automatic form filling and submission to discover hidden endpoints
- Full DOM event collection with automatic event triggering
- Intelligent URL de-duplication to remove redundant requests
- Extraction of URLs from JavaScript files, comments, and robots.txt
- Ability to send collected requests to passive vulnerability scanners
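The static-extraction feature above can be sketched with a simple Python example. This is an illustrative assumption, not crawlergo's actual extraction logic: it pulls quoted path-like strings out of JavaScript source and HTML comments with regular expressions, which is one common way hidden endpoints get discovered.

```python
import re

# Hypothetical patterns for this sketch; real-world extraction rules
# need to handle many more URL shapes than this.
URL_RE = re.compile(r"""["'](/[A-Za-z0-9_\-./]+(?:\?[^"']*)?)["']""")
COMMENT_RE = re.compile(r"<!--(.*?)-->", re.S)

def extract_paths(html: str, js: str) -> set:
    """Collect quoted paths from JS code and from HTML comments."""
    found = set(URL_RE.findall(js))
    for comment in COMMENT_RE.findall(html):
        found.update(URL_RE.findall(comment))
    return found
```

For example, a script containing `fetch("/api/v1/users")` and a leftover comment such as `<!-- old endpoint: "/admin/login" -->` would both yield candidate paths that a scanner can then probe.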