This is sort of half feature request, half support request. I'd like to distribute software to lab machines. This involves getting a multiple-gigabyte set of files (say, Adobe Creative Suite installer) from a network share onto a local cache folder on an individual machine.
Currently we use robocopy, but that tends to time out after a while and puts a lot of load on the server. I've had some success using aria2c with torrents: as long as there are enough machines in the swarm it works well, but it degrades badly as the swarm shrinks. My best results have come from having each machine mount the target directory over SMB, self-seed that mounted directory on localhost, and run a second aria2c to download from localhost (incidentally also seeding to any other machine in the vicinity, via a web seed). This works pretty well for the most part, though I have no good way, for instance, of limiting the number of machines that seed directly off the mounted share, to avoid overloading the central file server.
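For concreteness, the per-machine setup I'm describing looks roughly like this. This is a sketch, not my exact invocation: the paths, ports, and torrent name are placeholders, and how the two instances discover each other (tracker, DHT, or local peer discovery) depends on how the torrent is built.

```shell
# Assume the share is already mounted at /mnt/share (placeholder path).

# 1. Self-seed the mounted copy on localhost; --bt-seed-unverified skips
#    re-hashing the pieces since the data on the share is trusted.
aria2c --bt-seed-unverified=true \
       --dir=/mnt/share \
       --listen-port=6881 \
       payload.torrent &

# 2. Second instance downloads the same torrent into the local cache,
#    peering with the localhost seed (and any nearby lab machines).
aria2c --dir=/var/cache/payload \
       --listen-port=6882 \
       payload.torrent
```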
However, when only one or a few machines need to download the files, this approach falls apart. It times out easily and doesn't come close to the available bandwidth, because with only the localhost self-seed as a peer, the transfer is effectively a single TCP stream.
So I've been wondering whether it would be possible to tell aria2c to do a file copy and a torrent download at the same time: can it simultaneously seed a file and copy it off a network share? In my ideal world, I'd tell aria2c to get the file from both sources at once, save a local copy, and share the completed pieces with any peer in the swarm.
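For what it's worth, aria2 does document combining a torrent with HTTP/FTP sources for the same file in one download. It won't read straight off an SMB mount, but if the share can also be exposed over HTTP (the URL below is a placeholder for such an export, e.g. a simple web server in front of the file store), something like this may approximate the "both sources at once" behavior:

```shell
# Hedged sketch: download from the swarm and an HTTP source
# simultaneously. -T supplies the local .torrent file; the trailing
# URI is treated as an additional source for the same payload.
aria2c -T payload.torrent \
       --dir=/var/cache/payload \
       http://fileserver.example/payload.iso
```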
So in a computer lab, if only one computer (call it A) downloads the files, it gets them entirely via file copy. If a second computer (B) joins the swarm, any pieces A has already downloaded are shared with B while B also downloads from the share. And if there are, say, 20 seeds already connected to a downloader, it could rely entirely on the torrent and leave the file share alone.
Alternatively, it would work if aria2c could seed to and download from localhost over multiple TCP streams. Right now the self-seeding setup transfers only one chunk at a time. Anything else that saturates the available bandwidth in the single-downloader case would also help.
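On the multiple-streams point: if the file server could speak HTTP rather than SMB, aria2c's segmented-download flags already open several TCP connections to a single server. Again a sketch with a placeholder URL:

```shell
# Up to 8 connections to the one server (-x8), file split into
# 8 concurrently downloaded ranges (-s8), each at least 1 MiB.
aria2c -x8 -s8 --min-split-size=1M \
       --dir=/var/cache/payload \
       http://fileserver.example/payload.iso
```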
If I'm missing something fundamental, feedback would be appreciated.