Is there any possibility of implementing larger dictionary sizes, for example 2 GB or 4 GB? It would be a tremendous help in improving the compression ratio for large archives, for example those containing virtual machines.
I frequently find that programs like exdupe or pcompress massively outperform 7-Zip in compression ratio, and they do so in much less time, primarily because their larger block sizes let them find matches that are more than 1 GB apart.
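To illustrate why distance-independent matching matters, here is a crude sketch that counts repeated chunks of a hypothetical disk.img no matter how far apart they sit (assumptions: fixed 1 MiB chunks and the placeholder file name; real deduplicators like exdupe use content-defined chunking instead):

# Split the image into fixed 1 MiB chunks (writes many chunk_* files).
split -b 1M -a 5 disk.img chunk_
# Hash every chunk and list the most frequently repeated hashes;
# each repeat is a match a small LZMA dictionary could never see.
sha256sum chunk_* | awk '{print $1}' | sort | uniq -c | sort -rn | head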
I have 32 GB of RAM in my laptop and could handle a dictionary size above 2 GB. How can I enable that? What do I have to change in the source code to manage it? (Mkay, I admit I don't know how to compile it for Ubuntu.) I imagine it could be useful for compressing DVD ISOs with similar files smaller than the dictionary size, or virtual machines with the same operating system (mostly freshly installed).
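As a rough budget check: 7-Zip's documented rule of thumb is that compression needs about 10.5 times the dictionary size in RAM with the default match finder. Below is what that implies, plus a minimal build sketch for Ubuntu (the package name, tarball name, and make target are assumptions based on the stock p7zip source tree; <version> is a placeholder):

# RAM needed to compress, at ~10.5 x dictionary size:
#   1 GB dictionary -> ~10.5 GB RAM
#   2 GB dictionary -> ~21 GB RAM (fits in 32 GB)
#   4 GB dictionary -> ~42 GB RAM (does not fit in 32 GB)

# Building p7zip from source on Ubuntu:
sudo apt-get install build-essential
tar xjf p7zip_<version>_src_all.tar.bz2
cd p7zip_<version>
make all3    # builds the 7z, 7za, and 7zr binaries into bin/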
Mkay, my results on some freshly installed Windows virtual machines totaling 35 GB:
5.6 GB: 7za a -t7z -m0=lzma2 -y -mhe=on -mx=9 -mfb=1024 -ms=on -mmt=3 -md=1024m 1G.7z
4.3 GB: exdupe -g8 volume/ g8.xdp
2.8 GB: 7za a -t7z -m0=lzma2 -y -mhe=on -mx=9 -mfb=1024 -ms=on -mmt=3 -md=1024m g8_1024.7z g8.xdp
Compressing KDE Partition Manager partition images, I got these results:
4 archives created with 7-Zip using a 1 GB dictionary: 18 GB
Whole content of those archives (72 GB) compressed using exdupe: 14.7 GB
The exdupe archive compressed with 7-Zip using a 1 GB dictionary: 11.2 GB
Conclusion: when handling huge files, I'll always use exdupe first and then compress the result with 7-Zip.
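Based on the commands earlier in this thread, that two-step recipe would look roughly like this (the archive names vms.xdp and vms.7z are placeholders; the flags are the ones used in the posts above):

# Step 1: deduplicate the VM directory with exdupe.
exdupe -g8 volume/ vms.xdp
# Step 2: compress the deduplicated archive with 7-Zip and a 1 GB dictionary.
7za a -t7z -m0=lzma2 -mx=9 -mfb=1024 -ms=on -mmt=3 -md=1024m vms.7z vms.xdp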
Andreas, why exdupe? Did you try zpaq or pcompress in place of exdupe? John
I plan to work on increasing the dictionary size in the next 2-4 weeks.
That's great to hear! I would be very happy to help test and debug any builds supporting larger dictionaries.
Has there been any progress on this?
There were other tasks, so that feature will not be implemented in the nearest version.