Crawlab is a Golang-based distributed web crawler management platform. It supports multiple languages, including Python, NodeJS, Go, Java, and PHP, as well as various web crawler frameworks, including Scrapy, Puppeteer, and Selenium. You can start it up with a single docker-compose command, so you don't even have to configure the MongoDB database yourself.

The frontend app interacts with the master node, which communicates with other components such as MongoDB, SeaweedFS, and worker nodes. The master node and worker nodes communicate with each other via gRPC (an RPC framework). Tasks are scheduled by the task scheduler module in the master node and received by the task handler module in worker nodes, which executes them in task runners. Task runners are processes running spider or crawler programs; through gRPC (integrated in the SDK) they can also send data to other data sources, e.g. MongoDB.
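As a sketch, a one-click docker-compose setup might look like the following. The image name matches the official Crawlab images, but treat the port mapping and environment variable names as assumptions to verify against the project documentation:

```yaml
# docker-compose.yml — minimal sketch of a single-master Crawlab deployment.
# Env var names (CRAWLAB_NODE_MASTER, CRAWLAB_MONGO_HOST) are assumptions
# based on common Crawlab setups; check the official docs before use.
services:
  master:
    image: crawlabteam/crawlab
    environment:
      CRAWLAB_NODE_MASTER: "Y"   # run this container as the master node
      CRAWLAB_MONGO_HOST: mongo  # point at the bundled MongoDB service
    ports:
      - "8080:8080"              # frontend / API
    depends_on:
      - mongo
  mongo:
    image: mongo:4.2             # no manual configuration needed
```

Running `docker compose up -d` against a file like this brings up the master node and its MongoDB dependency together, which is what makes the "one-click" start possible.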

Features

  • Task Scheduling
  • Worker Node Management and Communication
  • Spider Deployment
  • Frontend and API Services
  • Task Execution (the Master Node can also act as a Worker Node)
  • Integration with Other Frameworks
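Task execution boils down to a worker spawning an OS process for the spider program and capturing its output, as described above. The following is a minimal, framework-agnostic sketch in Go of that process-per-task model; it is not Crawlab's actual SDK, and the command and error handling are illustrative only:

```go
package main

import (
	"bytes"
	"fmt"
	"os/exec"
)

// runTask launches a spider program as a child process and returns its
// combined stdout/stderr — the same process-per-task model used by task
// runners. The command passed in is illustrative, not a Crawlab API.
func runTask(name string, args ...string) (string, error) {
	cmd := exec.Command(name, args...)
	var out bytes.Buffer
	cmd.Stdout = &out
	cmd.Stderr = &out
	if err := cmd.Run(); err != nil {
		return out.String(), fmt.Errorf("task failed: %w", err)
	}
	return out.String(), nil
}

func main() {
	// Stand-in for a real spider command such as "scrapy crawl myspider".
	out, err := runTask("echo", "scraped 42 items")
	if err != nil {
		fmt.Println("error:", err)
		return
	}
	fmt.Print(out)
}
```

In the real platform, the task handler module would also stream this output back to the master node over gRPC; here it is simply printed.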


Categories

Web Scrapers

License

BSD License


Additional Project Details

Operating Systems

Linux, Mac, Windows

Programming Language

Go, Java, PHP, Python

Related Categories

Python Web Scrapers, PHP Web Scrapers, Java Web Scrapers, Go Web Scrapers

Registered

2023-01-05