This is separate from the in-place repair, so I wanted to open another thread.

If you're like me, when you're downloading a large set of files off of Usenet, the first thing you do is grab the .PAR2 file and open up QuickPAR in monitor mode.  That way, as the files come down the pipe, they get analyzed as soon as they are available, and once everything is down I can see right away how much recovery data is needed.

I'm wondering how much processing could be done on data blocks as they are downloaded, before the full recovery set is available, so that the actual repair once all the parity is down would take less time.

I know a small amount of caching is already done to mark which files are damaged, so that if QP is closed and re-opened it remembers the results without having to re-scan everything (see recent post).

If there are additional calculations that could be done as the files are downloaded to speed up the repair once the full recovery set is available, my guess is this would be a valuable option even if it required some temporary storage (a few KB per block).
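To make it concrete, here's a rough Python sketch of the kind of thing I have in mind, assuming I have the PAR2 math right (GF(2^16) with generator 0x1100B): as each good data slice is verified, its contribution to every recovery slice gets folded into a running accumulator, so that when the download finishes only the missing slices still need the expensive solve. The class and method names are made up, the slice constants and exponents would really come from the PAR2 packets, and an actual implementation would obviously be C++ with log/exp tables rather than a bit-by-bit multiply.

```python
# Rough sketch only: accumulate each verified data slice's contribution to
# every recovery slice as it arrives.  Names and constants are placeholders,
# not QuickPAR's actual internals.

PAR2_POLY = 0x1100B  # x^16 + x^12 + x^3 + x + 1, PAR2's GF(2^16) generator


def gf_mul(a: int, b: int) -> int:
    """Multiply two GF(2^16) elements, reducing by the PAR2 polynomial."""
    result = 0
    while b:
        if b & 1:
            result ^= a
        b >>= 1
        a <<= 1
        if a & 0x10000:
            a ^= PAR2_POLY
    return result


def gf_pow(a: int, n: int) -> int:
    """a**n in GF(2^16) by square-and-multiply."""
    result = 1
    while n:
        if n & 1:
            result = gf_mul(result, a)
        a = gf_mul(a, a)
        n >>= 1
    return result


class IncrementalAccumulator:
    """One running buffer per recovery exponent -- the 'KBs per block' of
    temporary storage mentioned above.  add_slice() could be called the
    moment a slice passes its checksum, long before the set is complete."""

    def __init__(self, exponents, slice_size):
        # Assumes an even slice size and little-endian 16-bit words.
        self.slice_size = slice_size
        self.acc = {e: bytearray(slice_size) for e in exponents}

    def add_slice(self, base: int, data: bytes):
        """Fold one verified input slice (with PAR2 constant `base`) into
        every accumulator: acc_e ^= base**e * data, word by word."""
        for e, buf in self.acc.items():
            coeff = gf_pow(base, e)
            for i in range(0, self.slice_size, 2):
                word = data[i] | (data[i + 1] << 8)
                prod = gf_mul(coeff, word)
                buf[i] ^= prod & 0xFF
                buf[i + 1] ^= prod >> 8
```

Once the last file lands, you'd only have to combine these accumulators with the recovery slices and solve for the (hopefully few) missing slices, which is where the time savings would come from. Whether that's actually how the repair is structured internally, I'll leave to people who know the codebase.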

This may be a complete non-starter due to technical limitations; I just wanted to bring it up.

Thanks!