Extremely fast. Extremely helpful. I am trying to recover several disks that held many files. I did a raw copy of the disk contents to a new disk, and some file names and directory locations were lost during the recovery. Files ended up with names like 003492232, and there were many duplicates. "Duplicate Files Finder" helped me easily delete a large number of duplicates (thousands of files). The leftover (non-duplicate) files I can sort and restore manually. Thanks "Duplicate Files Finder"! -J_Tom_Moon_79
Does a great job finding the duplicate files; however, deleting the duplicates is a manual process. For every file that has a duplicate, you have to manually select which one you want to keep. In my case, I don't care which duplicates are deleted, provided that one copy is kept - but it doesn't give me that option. I also ran into some strange errors where it would refuse to delete a file for no apparent reason.
F4cio - I was also looking for a quick tool that supports long paths, and duplicate-files-finder-deleter helped me get rid of duplicates.
Try Duplicate Files Deleter; it is very effective.
You can also use Duplicate File Deleter. Duplicate files piled up to the point where my hard drives used to end up with bad sectors, and there was always a shortage of free space. I regularly run "Duplicate Files Deleter" when I feel that my computer is lagging. You should try it too, and I'm sure you won't regret it. It can delete all kinds of duplicate files - music, EXE files, same-name files, etc.
Many users have complained about the manual deletion feature. That’s not quite fair, although it’s true that the deletion feature offered by the software is extremely tedious. However, the software offers the option of saving the results of a duplicate search in txt and/or html format. In the exported file the files are grouped into units showing duplicates along with their respective sizes. With a minimum of competence in programmatically processing such a file, you can implement virtually any method of deleting duplicates. I, for one, created a short VBA script and process all these reports in MS Access. To manage a huge number of duplicates, I do not need any other duplicate-finding software. Noteworthy is that the programme is capable of processing huge numbers of files (in my case more than a million) in one search, and the files may be of any size (in my case over 1 GB). Processing speed depends on the capability of a given system. However, I find it a drawback of the software that I cannot find any information about command-line options. Entering folders manually, one after another, is a tedious job, and I am really disappointed that a command-line option seems not to have been implemented in the software.
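For readers who don't use VBA/Access, the same idea can be sketched in Python. This is a minimal sketch under stated assumptions: the reviewer does not describe the exact export layout, so the report format below (blank-line-separated groups of duplicate file paths) is hypothetical; check your actual txt export before running anything destructive.

```python
# Minimal sketch: parse a duplicate-report text file and decide which
# copies to delete. The format assumed here -- groups of file paths
# separated by blank lines -- is hypothetical; adapt parse_report()
# to the layout your export actually produces.
import os


def parse_report(text):
    """Split report text into groups of duplicate file paths.

    Each blank-line-separated block is treated as one group of
    duplicates; blocks with a single path are ignored.
    """
    groups = []
    for block in text.strip().split("\n\n"):
        paths = [line.strip() for line in block.splitlines() if line.strip()]
        if len(paths) > 1:
            groups.append(paths)
    return groups


def plan_deletions(groups):
    """Keep the first file of each group; mark the rest for deletion."""
    return [path for group in groups for path in group[1:]]


if __name__ == "__main__":
    sample = """C:/recovered/003492232
C:/recovered/004811207

C:/recovered/007120993
C:/backup/photo.jpg
C:/backup/photo_copy.jpg
"""
    to_delete = plan_deletions(parse_report(sample))
    print(to_delete)
    # To actually delete, you would loop:
    #     for p in to_delete: os.remove(p)
    # Review the plan first -- deletion is irreversible.
```

Separating the "plan" step from the actual `os.remove()` calls lets you inspect the list of doomed files before committing, which matters when thousands of recovered files are involved.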
Very impressed. I was a little sceptical about how it worked, so I tested it on similar files where just a single byte was changed, and it no longer reported them as duplicates... so it doesn't just check the file name, which is very important! After finding the duplicates, you're presented with a list and you have to tell it which files to keep and which to delete. If you have a lot of duplicates, it's a painful task, but it's a good thing: you don't want files removed from where they're supposed to be while keeping copies that are well hidden somewhere on your system. It's almost impossible for software to make that choice automatically without disappointing results... so ignore the bad reviews where people complain about this ;) It's losing 1 star for the documentation (maybe it's there, but it's hard to find). With proper docs, I wouldn't have needed to test it.