Can anyone suggest a solution to the following problem?
I download files with nget5 using a lengthy list of regexes (-r options), including some patterns designed to match many articles. After several GB have accumulated I move the files to offline media. The problem is that if the same files are reposted to the newsgroup weeks or months later, nget naturally downloads them again.
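For context, a typical run looks roughly like the following; the group name and regexes here are purely illustrative, not my actual patterns:

```shell
# Hypothetical nget5 invocation with several -r retrieve patterns.
# Group and regexes are made-up examples.
nget -g alt.binaries.example \
     -r 'interesting_series.*\.pdf' \
     -r 'some_other_topic.*\.(zip|rar)'
```
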
I don't want to have to edit my list of regexes to specifically exclude those files which have been moved offline.
A hack I tried was to replace each file in ~/.nget5, once moved offline, with a symlink to a single dummy empty file, which uses essentially no disc space. My idea was that nget5 would determine from the symlink that it already had the file and would not attempt to re-download it. However, for a file such as myfile.pdf that was previously downloaded, moved offline, and replaced via 'ln -s dummy_empty_file myfile.pdf', a repost instead results in incomplete files of the form myfile.pdf.1210008768.20025974 being downloaded and produced. This wastes disc space and nntp server download quota.
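For reference, the hack amounts to something like this; the paths and the archive_offline helper are illustrative, not part of nget itself:

```shell
#!/bin/sh
# Sketch of the symlink hack described above.
# DOWNLOAD_DIR and DUMMY are illustrative paths.
DOWNLOAD_DIR="$HOME/.nget5"
DUMMY="$DOWNLOAD_DIR/dummy_empty_file"

mkdir -p "$DOWNLOAD_DIR"
: > "$DUMMY"                 # zero-length placeholder file

# After copying a file to offline media, replace it with a
# symlink so it occupies essentially no disc space.
archive_offline() {
    f="$1"
    # (the copy to offline media happens elsewhere)
    rm -f "$f"
    ln -s "$DUMMY" "$f"
}

archive_offline "$DOWNLOAD_DIR/myfile.pdf"
```
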
Is there a solution to this problem?
Details: nget5 version 0.27.1 via CVS, configured with pcre, Linux 2.6.20.
I should have added that dupeidcheck & dupefilecheck are both on (which are the default settings).