Multi-connection command-line tool to download Internet sites. Similar to wget and cURL, but it manages up to 50 parallel connections. Its main features are recursive fetching, Metalink retrieval, segmented downloads, and image filtering by width and height.
- up to 50 parallel downloads
- recursive fetching
- downloaded images filtered by width and height
- Metalink retrieval
- easy and fast command-line tool
- supported protocols: HTTP, HTTPS, FTP, FTPS, TFTP, TELNET, DICT and FILE
- single-threaded (see the sketch after this list for how one thread can drive many parallel transfers)
- more than 40 command-line options
- available as both static and dynamic libraries
- Large File Support (LFS)
- reporting capabilities
- makes links relative when downloading HTML pages so they can be browsed locally
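
Running up to 50 parallel downloads from a single thread is typically done by multiplexing non-blocking transfers in one event loop, and mulk is built on top of libcurl. The following is only a minimal sketch of that general technique using libcurl's multi interface, not mulk's actual source; the URLs are placeholders.

    /* Sketch: several downloads driven in parallel by one thread,
     * using libcurl's multi interface. Link with -lcurl. */
    #include <curl/curl.h>

    int main(void)
    {
        const char *urls[] = {              /* placeholder URLs */
            "http://example.com/a",
            "http://example.com/b",
            "http://example.com/c",
        };
        int n = sizeof(urls) / sizeof(urls[0]);

        curl_global_init(CURL_GLOBAL_DEFAULT);
        CURLM *multi = curl_multi_init();

        /* One easy handle per URL, all attached to the same multi handle.
         * Response bodies go to stdout by default. */
        for (int i = 0; i < n; i++) {
            CURL *easy = curl_easy_init();
            curl_easy_setopt(easy, CURLOPT_URL, urls[i]);
            curl_multi_add_handle(multi, easy);
        }

        /* Single loop, single thread: libcurl interleaves all transfers
         * over non-blocking sockets. */
        int still_running = 0;
        do {
            curl_multi_perform(multi, &still_running);
            if (still_running)
                curl_multi_wait(multi, NULL, 0, 1000, NULL); /* wait for socket activity */
        } while (still_running);

        /* Collect finished transfers and clean up. */
        CURLMsg *msg;
        int msgs_left;
        while ((msg = curl_multi_info_read(multi, &msgs_left))) {
            if (msg->msg == CURLMSG_DONE) {
                curl_multi_remove_handle(multi, msg->easy_handle);
                curl_easy_cleanup(msg->easy_handle);
            }
        }
        curl_multi_cleanup(multi);
        curl_global_cleanup();
        return 0;
    }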