This is a nice little utility. Perfect for those
occasions where my attempts to semi-manually sync files
across multiple machines create some dupes.
One thing I've found, though, is that if I search a
directory, delete one of each duped pair that it finds,
and run the process again, more dupes are found. On
the latest directory, which had ~130 files, I went
through 4-5 iterations before all dupes were removed.
Log is attached.
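For what it's worth, grouping files by content hash (rather than reporting one
matching pair at a time) would surface every copy of a file in a single run.
Here is a minimal sketch in plain Python of that idea - just an illustration of
the grouping approach, not dupliFinder's actual code:

    import hashlib
    import os
    from collections import defaultdict

    def find_duplicate_groups(root):
        # Map content hash -> every path whose contents produce that hash.
        groups = defaultdict(list)
        for dirpath, _dirnames, filenames in os.walk(root):
            for name in filenames:
                path = os.path.join(dirpath, name)
                with open(path, 'rb') as f:
                    digest = hashlib.md5(f.read()).hexdigest()
                groups[digest].append(path)
        # A hash shared by two or more paths is one complete duplicate set,
        # so one pass over the directory reports all copies at once.
        return [paths for paths in groups.values() if len(paths) > 1]

With that, deleting all but one file from each reported group should leave
nothing for a second run to find.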
Also, I find that dupliFinder bogs down on my music
folder, which is only ~1600 files; I end up killing it
after an hour or so.
Log file for the folder that required multiple runs
Thanks heaps for the bug report :-)
I have noticed this problem too - I don't think it happens
in version 0.09, so I may have to go back and work out
what's happening.
To make things quicker, I might use a faster hashing
algorithm, maybe MD4.
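That said, the time usually goes into reading every file off disk rather than
into the hash itself, so a cheaper win might be to hash only files that share a
size with at least one other file. A rough sketch of that pre-filter in plain
Python (my assumption about where the time goes, not dupliFinder's code):

    import hashlib
    import os
    from collections import defaultdict

    def find_duplicates_quick(root):
        # Pass 1: group by file size - cheap, no file contents are read.
        by_size = defaultdict(list)
        for dirpath, _dirnames, filenames in os.walk(root):
            for name in filenames:
                path = os.path.join(dirpath, name)
                by_size[os.path.getsize(path)].append(path)

        # Pass 2: hash only files whose size is shared with another file.
        by_hash = defaultdict(list)
        for paths in by_size.values():
            if len(paths) < 2:
                continue  # unique size, so it cannot have a duplicate
            for path in paths:
                h = hashlib.md5()
                with open(path, 'rb') as f:
                    # Read in 1 MB chunks so large music files do not
                    # have to be loaded into memory all at once.
                    for chunk in iter(lambda: f.read(1 << 20), b''):
                        h.update(chunk)
                by_hash[h.hexdigest()].append(path)
        return [paths for paths in by_hash.values() if len(paths) > 1]

In a typical folder most files have a unique size, so most of them never get
read at all, regardless of which hash algorithm ends up being used.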
I'll get to fixing these problems soon :-)
Thanks for using dupliFinder!
- Marcus