Easy Spider is a distributed Perl web crawler project from 2006. It includes code for crawling web pages, distributing the work to a server, and generating XML files from the results. The client side can run on any computer (Windows or Linux), while the server stores all collected data.
Websites that use EasySpider and Perl/PHP Backends:
Web crawlers are often among the first things people program at the start of their careers. It is fun to look back at code written a few years ago and see how much one has improved.
(c) Sebastian Enger 2005-2015
- Client/Server Distributed Crawling
- Perl Programming Language
- Config File Support
- PDF,DOC,XLS,PPT Extraction Support
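As a rough illustration of the client/server split described above, the sketch below shows one crawl step in Perl: fetch a page, extract its links, and print them so a central server could queue them for other clients. This is a minimal sketch using the standard LWP::Simple module, not EasySpider's actual code; the `extract_links` helper and the command-line interface are assumptions for illustration.

```perl
#!/usr/bin/perl
# Minimal sketch of one crawler step (not EasySpider's actual code).
use strict;
use warnings;
use LWP::Simple qw(get);

# Pull absolute http(s) links out of a page body with a simple regex.
# A production crawler would use a real HTML parser instead.
sub extract_links {
    my ($html) = @_;
    my @links = $html =~ m{href=["'](https?://[^"']+)["']}gi;
    return @links;
}

# When given a URL on the command line, fetch it and print the links
# found, one per line, for the server to distribute to other clients.
if (@ARGV) {
    my $html = get($ARGV[0]) // '';
    print "$_\n" for extract_links($html);
}
```

In a distributed setup like the one described here, each client node would run a loop of this kind and report its extracted links back to the server, which deduplicates them and hands out new URLs.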