A toolchain for bringing web2 to web3. The crawler tool provides two main functions. The first parses wiki titles and submits links between keywords and wiki pages; it expects the enwiki-latest-all-titles dump to be downloaded into the crawler root directory. The second is a separate upload-duras-to-ipfs command that uploads files to the local IPFS node, collecting all DURAs under a single root unixfs directory. A running IPFS daemon is required for both.
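The enwiki-latest-all-titles dump is a tab-separated file with a `page_namespace` column and a `page_title` column (titles use underscores instead of spaces). A minimal sketch of parsing one line into a keyword/page link, assuming that format; the function names here are illustrative, not the crawler's actual API:

```go
package main

import (
	"fmt"
	"strings"
)

// parseTitleLine splits one line of enwiki-latest-all-titles into its
// namespace and title columns. The dump is tab-separated.
func parseTitleLine(line string) (ns, title string, ok bool) {
	parts := strings.SplitN(strings.TrimRight(line, "\n"), "\t", 2)
	if len(parts) != 2 {
		return "", "", false
	}
	return parts[0], parts[1], true
}

// wikiURL builds the English Wikipedia URL for a page title; titles in
// the dump are already underscore-separated, so no escaping is needed
// for this sketch.
func wikiURL(title string) string {
	return "https://en.wikipedia.org/wiki/" + title
}

func main() {
	ns, title, ok := parseTitleLine("0\tAlbert_Einstein")
	if ok && ns == "0" { // namespace 0 is the main article namespace
		fmt.Println(title, "->", wikiURL(title))
	}
}
```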
Features
- Requires Go 1.12+
- Requires a running IPFS daemon
- Expects enwiki-latest-all-titles in the crawler root directory
- Two main functions:
- Parse wiki titles and submit links between keywords and wiki pages
- Upload DURAs to IPFS via the upload-duras-to-ipfs command
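The upload step talks to the local IPFS node's HTTP API. A hedged sketch of how files could be added under a single root unixfs directory, assuming the standard go-ipfs `/api/v0/add` endpoint with `wrap-with-directory=true` (which wraps all uploaded files in one root directory); the helper below is illustrative, not the crawler's own code:

```go
package main

import (
	"bytes"
	"fmt"
	"mime/multipart"
	"net/http"
)

// buildAddRequest prepares a multipart POST to the IPFS daemon's
// /api/v0/add endpoint. wrap-with-directory=true tells IPFS to wrap
// all uploaded files in a single root unixfs directory, which is how
// the DURAs end up under one root.
func buildAddRequest(apiAddr string, files map[string][]byte) (*http.Request, error) {
	body := &bytes.Buffer{}
	w := multipart.NewWriter(body)
	for name, data := range files {
		part, err := w.CreateFormFile("file", name)
		if err != nil {
			return nil, err
		}
		if _, err := part.Write(data); err != nil {
			return nil, err
		}
	}
	if err := w.Close(); err != nil {
		return nil, err
	}
	req, err := http.NewRequest("POST",
		apiAddr+"/api/v0/add?wrap-with-directory=true&pin=true", body)
	if err != nil {
		return nil, err
	}
	req.Header.Set("Content-Type", w.FormDataContentType())
	return req, nil
}

func main() {
	// dura1.json is a hypothetical DURA file used only for illustration.
	files := map[string][]byte{"dura1.json": []byte(`{"example":true}`)}
	req, err := buildAddRequest("http://127.0.0.1:5001", files)
	if err != nil {
		panic(err)
	}
	fmt.Println(req.Method, req.URL.String())
	// To actually upload against a running daemon:
	//   resp, err := http.DefaultClient.Do(req)
}
```

The request is only constructed here, not sent, so the sketch works without a daemon; sending it requires the IPFS daemon running on its default API port 5001.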
Categories
- File Sharing
License
- MIT License