ht://Check is more than a link checker. It's particularly suitable for checking broken links, anchors and web accessibility barriers, but retrieved data can also be used for Web structure mining. Uses a MySQL backend. Derived from ht://Dig.
WebWatcher - a Web-page Update Monitor. This program helps you keep an eye on interesting Web pages: you register a list of URLs you want to monitor, and WebWatcher checks them for changes whenever you ask it to, or at given intervals.
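One common way such an update monitor detects changes (a sketch under assumptions; this is not WebWatcher's actual implementation, and the class name is invented) is to store a digest of each page's content and compare it on the next check:

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.util.HashMap;
import java.util.Map;

/** Illustrative change detector: remembers a SHA-256 digest per URL. */
public class UpdateMonitorSketch {
    private final Map<String, String> lastDigest = new HashMap<>();

    /** Hex-encoded SHA-256 of the page content. */
    static String digest(String content) throws Exception {
        byte[] hash = MessageDigest.getInstance("SHA-256")
                .digest(content.getBytes(StandardCharsets.UTF_8));
        StringBuilder sb = new StringBuilder();
        for (byte b : hash) sb.append(String.format("%02x", b));
        return sb.toString();
    }

    /** Returns true if the content changed since the last check of this URL.
        The first check of a URL only records a baseline and reports no change. */
    boolean hasChanged(String url, String content) throws Exception {
        String current = digest(content);
        String previous = lastDigest.put(url, current);
        return previous != null && !previous.equals(current);
    }
}
```

Fetching the page body itself is omitted here; a real monitor would also want to ignore volatile fragments (timestamps, ads) before hashing, or changes would be reported on every check.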
Environment Control and Life Support System: a TCP/SSL/HTTPS Web server connection monitor. It evaluates and monitors the connection status of TCP-family servers, with OpenSSL connection handling, XML configuration, and XSL rendering.
The Backlinkchecker analyzes how many inbound links point to your website. It's developed in PHP, JavaScript and MySQL, and uses the great JavaScript library ExtJS and the PHP library snoopy.php.
The purpose of this project is to gauge website page-loading speed from the client's perspective through a web management system, Leech. As Leech puts it, "Leech bot reachs hosts via URL Information and sucks http responses!"
FoxBAT was made in an attempt to see if Naïve Bayesian filtering commonly used for spam filtering could be employed in the World Wide Web context. The application consists of a Firefox extension (.XPI package) and a Perl server script.
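As a sketch of the underlying technique (not FoxBAT's actual code; the class name and the tiny training examples below are invented), naïve Bayesian filtering scores a document under each class by combining a log prior with Laplace-smoothed per-word likelihoods:

```java
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

/** Minimal naïve Bayes text classifier for illustration. */
public class NaiveBayesSketch {
    private final Map<String, Map<String, Integer>> wordCounts = new HashMap<>();
    private final Map<String, Integer> docCounts = new HashMap<>();
    private final Set<String> vocabulary = new HashSet<>();
    private int totalDocs = 0;

    public void train(String label, String text) {
        totalDocs++;
        docCounts.merge(label, 1, Integer::sum);
        Map<String, Integer> counts = wordCounts.computeIfAbsent(label, k -> new HashMap<>());
        for (String w : text.toLowerCase().split("\\W+")) {
            if (w.isEmpty()) continue;
            counts.merge(w, 1, Integer::sum);
            vocabulary.add(w);
        }
    }

    public String classify(String text) {
        String best = null;
        double bestScore = Double.NEGATIVE_INFINITY;
        for (String label : docCounts.keySet()) {
            // log prior: fraction of training documents with this label
            double score = Math.log(docCounts.get(label) / (double) totalDocs);
            Map<String, Integer> counts = wordCounts.get(label);
            int totalWords = counts.values().stream().mapToInt(Integer::intValue).sum();
            for (String w : text.toLowerCase().split("\\W+")) {
                if (w.isEmpty()) continue;
                // Laplace smoothing: add one so unseen words never zero out a class
                int c = counts.getOrDefault(w, 0);
                score += Math.log((c + 1.0) / (totalWords + vocabulary.size()));
            }
            if (score > bestScore) { bestScore = score; best = label; }
        }
        return best;
    }
}
```

Applied to Web pages rather than mail, the "documents" would be page texts fetched by the spider side, with labels such as interesting/uninteresting instead of spam/ham.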
SpongeStats is an analysis tool for your web site, developed in PHP/MySQL/AJAX, for visualizing traffic statistics and analyzing a site's search engine ranking.
MWQL is an extension to MediaWiki, providing (end) users with a language for structural queries, so that they can build dynamic pages as seen in the Special pages of Wikipedia.
This project is designed to optimize search engine results by managing your web server sitemaps. The software combines both command line processes and a web user interface with a highly configurable architecture.
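The sitemaps such a tool manages follow the standard sitemaps.org XML protocol; a minimal file looks like the following (the URL and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2024-01-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

Search engines read `lastmod`, `changefreq`, and `priority` as hints, which is why regenerating sitemaps automatically (from the command line or the web UI) helps keep results fresh.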
This is a simple link checker. It can crawl any site and help find broken links. It also offers a downloadable CSV report; the CSV file includes the URL, the parent page URL, and the status of the page (broken or ok). It can be very useful for search engine optimization.
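The core of such a report can be sketched as follows (an illustration under assumptions, not this project's code; it assumes a HEAD request is enough to judge a link, and the class name is invented):

```java
import java.io.IOException;
import java.net.HttpURLConnection;
import java.net.URL;

/** Sketch of a link check that emits CSV rows: url, parent page url, status. */
public class LinkReportSketch {

    /** Maps an HTTP response code to the report's status column:
        2xx/3xx count as "ok", everything else as "broken". */
    static String status(int httpCode) {
        return (httpCode >= 200 && httpCode < 400) ? "ok" : "broken";
    }

    /** One CSV row; values are quoted so embedded commas stay safe. */
    static String csvRow(String url, String parent, String status) {
        return String.format("\"%s\",\"%s\",\"%s\"", url, parent, status);
    }

    /** Issues a HEAD request so the body is never downloaded (needs network). */
    static int fetchStatus(String url) throws IOException {
        HttpURLConnection conn = (HttpURLConnection) new URL(url).openConnection();
        conn.setRequestMethod("HEAD");
        conn.setConnectTimeout(5000);
        conn.setReadTimeout(5000);
        return conn.getResponseCode();
    }

    public static void main(String[] args) {
        // Offline demonstration of the report format:
        System.out.println("url,parent page url,status");
        System.out.println(csvRow("http://example.com/missing", "http://example.com/", status(404)));
    }
}
```

A full checker would add the crawling loop on top: parse each fetched page for links, record the current page as their parent, and queue unseen same-site URLs.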
Aracnis is a Java based framework for building distributed web spiders. These spiders can be used to accomplish a variety of tasks, for example, screen-scraping and link integrity checking.
Java-based API for use with Internet Explorer. Based on the JACOB project and the IE COM object, it directly controls IE from Java. This API can be used for true end-user web browser testing with IE, rather than an HTTP-based test such as HttpUnit.
JLinkCheck is an Ant task written in Java for checking links in websites. It does not just check one single page, but crawls a whole site like a spider, generating a report in XML and (X)HTML. JReptator will be its successor, with many more features.
Sperowider Website Archiving Suite is a set of Java applications, the primary purpose of which is to spider dynamic websites, and to create static distributable archives with a full text search index usable by an associated Java applet.
InSite is a Web site management tool written in Perl. It checks link integrity and does some basic content monitoring of your site's files directly on the local disk, which gives it a huge speed advantage over similar tools.
ASpider is a robust, featureful, multi-threaded CLI web spider written in Java, using Apache Commons HttpClient v3.0. It downloads any files matching your given MIME types from a website, tries to match email addresses with regular expressions by default, and logs all results using log4j.
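The email-matching step can be sketched like this (an illustration, not ASpider's actual pattern; real address grammar per RFC 5322 is far more permissive than this simplified regex):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

/** Illustrative regex-based email extraction from fetched page text. */
public class EmailMatchSketch {
    // Simplified pattern: local part, '@', domain with at least one dot.
    static final Pattern EMAIL =
            Pattern.compile("[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\\.[A-Za-z]{2,}");

    /** Collects every substring of the page that matches the pattern. */
    static List<String> extractEmails(String pageText) {
        List<String> found = new ArrayList<>();
        Matcher m = EMAIL.matcher(pageText);
        while (m.find()) {
            found.add(m.group());
        }
        return found;
    }

    public static void main(String[] args) {
        String page = "Contact admin@example.com or "
                + "<a href=\"mailto:info@example.org\">mail us</a>";
        System.out.println(extractEmails(page));
        // [admin@example.com, info@example.org]
    }
}
```

In a spider each fetched page's text would be run through `extractEmails` before the results are handed to the logger.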