
#2 Multiple Iterations

Status: open
Labels: None
Priority: 7
Updated: 2005-07-22
Created: 2005-06-01
Private: No

This is a nice little utility. Perfect for those
occasions where my attempts to semi-manually sync files
across multiple machines create some dupes.

One thing I've found, though, is that if I search a
directory, delete one of each duped pair that it finds,
and run the process again, more dupes are found. On
the latest directory, which had ~130 files, I went
through 4-5 iterations before all dupes were removed.
Log is attached.

Also, I find that dupliFinder is too slow and bogs down on
my music folder, which is only ~1600 files. I end up
killing it after an hour or so.

Discussion

  • Brian Crounse

    Brian Crounse - 2005-06-01

    Log file for the folder that required multiple runs

     
  • Marcus Wynwood

    Marcus Wynwood - 2005-07-22


    Thanks heaps for the bug report :-)
    I have noticed this problem too - I don't think it happens
    in version 0.09, so I may have to go back and work out
    what's happening.

    To make things quicker, I might use a faster hashing
    algorithm, maybe MD4.

    I'll get to fixing these problems soon :-)
    Thanks for using dupliFinder!
    - Marcus

     
  • Marcus Wynwood

    Marcus Wynwood - 2005-07-22
    • priority: 5 --> 7
     
  • Marcus Wynwood

    Marcus Wynwood - 2005-07-22
    • assigned_to: nobody --> mwynwood
     

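One common way to make a duplicate finder quicker (beyond the faster hash suggested above) is to group files by size first, which is cheap, and only hash the groups that contain more than one file, so most files never have to be read at all. The sketch below is a generic illustration of that idea, not dupliFinder's actual code: the function name find_duplicates is made up for this example, and MD5 stands in for the MD4 suggestion since Python's hashlib does not guarantee MD4 support.

    # Hypothetical sketch, not dupliFinder's implementation: bucket files by
    # size, then hash only the same-size candidates to find true duplicates.
    import hashlib
    import os
    from collections import defaultdict

    def find_duplicates(root):
        """Return lists of paths whose contents are identical."""
        # First pass: group by file size (no file contents are read here).
        by_size = defaultdict(list)
        for dirpath, _dirnames, filenames in os.walk(root):
            for name in filenames:
                path = os.path.join(dirpath, name)
                try:
                    by_size[os.path.getsize(path)].append(path)
                except OSError:
                    continue  # unreadable entry; skip it

        # Second pass: hash only groups with more than one file.
        duplicates = []
        for paths in by_size.values():
            if len(paths) < 2:
                continue  # a unique size cannot have duplicates
            by_hash = defaultdict(list)
            for path in paths:
                digest = hashlib.md5()
                with open(path, "rb") as handle:
                    for chunk in iter(lambda: handle.read(1 << 20), b""):
                        digest.update(chunk)
                by_hash[digest.hexdigest()].append(path)
            duplicates.extend(g for g in by_hash.values() if len(g) > 1)
        return duplicates

    if __name__ == "__main__":
        for group in find_duplicates("."):
            print(group)

Reporting whole groups of identical files, rather than one pair at a time, would also let a single pass handle files that exist in three or more copies, which may be related to why several runs were needed in the report above.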
