Finds the available security information for a given domain name.
Domain analyzer is a security analysis tool which automatically discovers and reports information about the given domain. Its main purpose is to analyze domains in an unattended way.
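The core of such an unattended scan is mechanical: expand the domain into a list of likely hostnames, resolve each one, and report what answered. A minimal sketch, using only the Python standard library; the subdomain wordlist is illustrative, not Domain Analyzer's actual list:

```python
import socket

# Common subdomain labels to probe; an illustrative list, not any tool's real wordlist.
COMMON_LABELS = ["www", "mail", "ftp", "ns1", "vpn"]

def candidate_hosts(domain: str, labels=COMMON_LABELS) -> list[str]:
    """Build the list of hostnames an unattended scan would try to resolve."""
    domain = domain.strip().lower().rstrip(".")
    return [domain] + [f"{label}.{domain}" for label in labels]

def resolve(host: str) -> list[str]:
    """Resolve a hostname to its IPv4 addresses; empty list if it does not resolve."""
    try:
        infos = socket.getaddrinfo(host, None, family=socket.AF_INET)
        return sorted({info[4][0] for info in infos})
    except socket.gaierror:
        return []

def scan(domain: str) -> dict[str, list[str]]:
    """One unattended pass: resolve every candidate and collect the answers."""
    return {host: resolve(host) for host in candidate_hosts(domain)}
```

Calling `scan("example.com")` returns a report mapping each candidate hostname to the addresses it resolved to, which is the kind of output a batch run can log without supervision.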
A function-testing, performance-measuring, site-mirroring web spider that is widely portable and capable of using scenarios to process a wide range of web transactions, including SSL and forms.
Methanol is a scriptable multi-purpose web crawling system with an extensible configuration system and speed-optimized architectural design. Methabot is the web crawler of Methanol.
Looker is an enterprise platform for BI, data applications, and embedded analytics that helps you explore and share insights in real time.
Chat with your business data using Looker. More than a modern business intelligence platform, Looker offers self-service or governed BI, lets you build custom applications on trusted metrics, and can even bring Looker modeling to your existing BI environment.
Crawler.NET is a component-based distributed framework for web traversal intended for the .NET platform. It consists of loosely coupled units, each realizing a specific web-crawling task. The main design goals are efficiency and flexibility.
Larbin is a Web crawler intended to fetch a large number of Web pages; it should be able to fetch more than 100 million pages on a standard PC with ample upload/download bandwidth. This set of PHP and Perl scripts, called webtools4larbin, can handle the output of Larbin and p
Web Textual eXtraction Tools: C++ parallel web crawler, noun phrase identification, multi-lingual part-of-speech tagging, Tarjan's algorithm, co-relationship mappings...
Larbin is an HTTP Web crawler with an easy interface that runs under Linux. It can fetch more than 5 million pages a day on a standard PC (with a good network).
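Both Larbin entries come down to the same loop that every large-scale crawler runs: pull a URL from a frontier queue, fetch it, and enqueue any links not yet seen. A minimal sketch of that loop; the in-memory link graph stands in for real HTTP fetching:

```python
from collections import deque

def crawl(seed: str, fetch, max_pages: int = 100) -> list[str]:
    """Breadth-first crawl: `fetch(url)` returns the list of links on a page.
    Returns URLs in the order they were fetched, skipping duplicates."""
    frontier = deque([seed])
    seen = {seed}
    fetched = []
    while frontier and len(fetched) < max_pages:
        url = frontier.popleft()
        fetched.append(url)
        for link in fetch(url):
            if link not in seen:
                seen.add(link)
                frontier.append(link)
    return fetched

# Stand-in for real HTTP fetching: a tiny in-memory link graph.
PAGES = {
    "a": ["b", "c"],
    "b": ["c", "a"],
    "c": ["d"],
    "d": [],
}
```

The `seen` set is what keeps a crawler from refetching pages in a cyclic link graph; at Larbin's scale it would be a disk-backed structure rather than an in-memory set, but the control flow is the same.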
Venn isolates and protects work from any personal use on the same computer, whether BYO or company issued.
Venn is a secure workspace for remote work that isolates and protects work from any personal use on the same computer. Work lives in a secure local enclave that is company controlled, where all data is encrypted and access is managed. Within the enclave – visually indicated by the Blue Border around these applications – business activity is walled off from anything that happens on the personal side. As a result, work and personal uses can now safely coexist on the same computer.
Harvest is a web indexing package, originally designed for distributed indexing; it can form a powerful system for indexing both large and small web sites. It also now includes Harvest-NG, a highly efficient, modular, Perl-based web crawler.
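At its simplest, the indexing an engine like Harvest performs is an inverted index: a map from each term to the documents containing it, which conjunctive queries then intersect. A minimal sketch (not Harvest's actual data structures):

```python
from collections import defaultdict

def build_index(docs: dict[str, str]) -> dict[str, set[str]]:
    """Map each lowercased term to the set of document ids containing it."""
    index = defaultdict(set)
    for doc_id, text in docs.items():
        for term in text.lower().split():
            index[term].add(doc_id)
    return index

def search(index: dict[str, set[str]], query: str) -> set[str]:
    """Return the ids of documents containing every query term."""
    terms = query.lower().split()
    if not terms:
        return set()
    result = index.get(terms[0], set()).copy()
    for term in terms[1:]:
        result &= index.get(term, set())
    return result
```

Distributed indexing in the Harvest sense amounts to building such per-site indexes independently and merging or brokering them, which is why the same package works for both large and small sites.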