Golang-based distributed web crawler management platform, supporting various languages including Python, NodeJS, Go, Java, and PHP, as well as various web crawler frameworks including Scrapy, Puppeteer, and Selenium. Use docker-compose for a one-click startup; you don't even have to configure a MongoDB database yourself.

The frontend app interacts with the master node, which communicates with other components such as MongoDB, SeaweedFS, and the worker nodes. The master node and worker nodes communicate with each other via gRPC (an RPC framework). Tasks are scheduled by the task scheduler module in the master node and received by the task handler module in the worker nodes, which executes them in task runners. Task runners are processes running spider or crawler programs; they can also send data through gRPC (integrated in the SDK) to other data stores, e.g. MongoDB.
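The docker-compose startup described above can be sketched roughly as follows. This is a minimal sketch, not the official compose file: the image name, environment variable names, and ports are assumptions based on a typical Crawlab deployment, so check the official installation docs for the exact values.

```yaml
# Hypothetical docker-compose.yml for a master + worker + MongoDB setup.
version: '3.3'
services:
  master:
    image: crawlabteam/crawlab        # assumed image name
    environment:
      CRAWLAB_NODE_MASTER: "Y"        # run this node as the master
      CRAWLAB_MONGO_HOST: mongo       # MongoDB is provisioned below
    ports:
      - "8080:8080"                   # frontend / API
    depends_on:
      - mongo
  worker:
    image: crawlabteam/crawlab
    environment:
      CRAWLAB_NODE_MASTER: "N"                # worker node
      CRAWLAB_GRPC_ADDRESS: "master:9666"     # gRPC channel to the master
      CRAWLAB_MONGO_HOST: mongo
    depends_on:
      - master
  mongo:
    image: mongo:4.2
```

With a file like this in place, `docker-compose up -d` would bring up all three containers, after which the frontend should be reachable on the mapped port (here 8080).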

Features

  • Task Scheduling
  • Worker Node Management and Communication
  • Spider Deployment
  • Frontend and API Services
  • Task Execution (the Master Node can also act as a Worker Node)
  • Integration with Other Frameworks

Categories

Web Scrapers

License

BSD License

Additional Project Details

Operating Systems

Linux, Mac, Windows

Programming Language

Python, PHP, Java, Go

Related Categories

Python Web Scrapers, PHP Web Scrapers, Java Web Scrapers, Go Web Scrapers

Registered

2023-01-05