Crawlab is a Golang-based distributed web crawler management platform. It supports multiple languages, including Python, Node.js, Go, Java, and PHP, and various web crawler frameworks, including Scrapy, Puppeteer, and Selenium. You can start it up with a single docker-compose command, so you don't even have to configure the MongoDB database yourself.

The frontend app interacts with the master node, which communicates with other components such as MongoDB, SeaweedFS, and worker nodes. The master node and worker nodes communicate with each other via gRPC (an RPC framework). Tasks are scheduled by the task scheduler module in the master node and received by the task handler module in worker nodes, which execute them in task runners. Task runners are processes running spider or crawler programs; they can also send data through gRPC (integrated into the SDK) to data sources such as MongoDB.
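A minimal docker-compose sketch of such a deployment might look like the following. The image name, environment variables, and service names here are illustrative and may differ between Crawlab versions, so check the official documentation before using them:

```yaml
# Sketch of a one-master, one-worker Crawlab deployment with MongoDB.
version: '3.3'
services:
  master:
    image: crawlabteam/crawlab        # assumed image name
    environment:
      CRAWLAB_NODE_MASTER: "Y"        # run this node as the master
      CRAWLAB_MONGO_HOST: mongo       # MongoDB service hostname
    ports:
      - "8080:8080"                   # frontend/API port
    depends_on:
      - mongo
  worker:
    image: crawlabteam/crawlab
    environment:
      CRAWLAB_NODE_MASTER: "N"        # run this node as a worker
      CRAWLAB_GRPC_ADDRESS: master    # gRPC endpoint of the master node
    depends_on:
      - master
  mongo:
    image: mongo:4.2
```

With this file in place, `docker-compose up -d` brings up the master, a worker, and MongoDB in one step.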

Features

  • Task Scheduling
  • Worker Node Management and Communication
  • Spider Deployment
  • Frontend and API Services
  • Task Execution (the Master Node can also act as a Worker Node)
  • Integration with Other Frameworks
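The Task Execution feature boils down to a task runner spawning the spider program as a child process and capturing its output. The following Go sketch shows that idea in isolation; the `RunTask` helper and the `echo` stand-in command are hypothetical and not part of the actual Crawlab SDK:

```go
package main

import (
	"fmt"
	"os/exec"
)

// RunTask illustrates, in simplified form, what a task runner does:
// spawn the spider as a child process and capture its combined output.
// In Crawlab, the command would arrive from the master node over gRPC.
func RunTask(command string, args ...string) (string, error) {
	out, err := exec.Command(command, args...).CombinedOutput()
	if err != nil {
		return "", fmt.Errorf("task failed: %w", err)
	}
	return string(out), nil
}

func main() {
	// Hypothetical spider invocation, stubbed with echo for demonstration.
	out, err := RunTask("echo", "scraped 10 items")
	if err != nil {
		fmt.Println(err)
		return
	}
	fmt.Print(out)
}
```

A real worker would additionally stream the process output back to the master and forward scraped results through the SDK's gRPC channel.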

Categories

Web Scrapers

License

BSD License

Additional Project Details

Operating Systems

Linux, Mac, Windows

Programming Language

Go, Java, PHP, Python

Related Categories

Python Web Scrapers, PHP Web Scrapers, Java Web Scrapers, Go Web Scrapers

Registered

2023-01-05