Currently I have a task: compress a file that has been duplicated many times (e.g. 10000 times), then duplicate the result again, compress again, and so on, until it can't be compressed any further. The bigger the final file, the better. The original file is smaller than 1MB. We use p7zip on Ubuntu 10.10, on a 64-bit machine with 2GB of memory, a 1TB hard disk, and 8 CPUs. I made a 20GB swap file.
The question is: how can I make the final file reach 2GB or bigger? I know that to duplicate and compress the file up to 2GB, about 25GB of memory is needed, right? But I find I can't even set the dictionary size to 200M now. What should I do? The p7zip command used: 7z a -t7z a.7z folder_name -md=200m -ms -m0=lzma
I am using the latest p7zip, version 9.13.
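For anyone following along, here is a minimal sketch of one duplicate-then-compress round. All file and folder names are illustrative, and the copy count is reduced for the example. Note that LZMA compression with the bt4 match finder needs roughly 10.5× the dictionary size in RAM, so `-md=200m` already asks for around 2GB, which would explain the failure on a 2GB machine; a smaller dictionary is used below.

```shell
#!/bin/sh
# One round of duplicate-then-compress (illustrative names, reduced scale).
set -e

IN=input.bin          # stands in for the <1MB original file
WORK=dup_folder       # folder of duplicates to be archived
COPIES=100            # the post uses ~10000; smaller here for illustration

# Create a small dummy input so the sketch is self-contained.
dd if=/dev/zero of="$IN" bs=1024 count=1 2>/dev/null

# Duplicate the file COPIES times into the work folder.
mkdir -p "$WORK"
i=0
while [ "$i" -lt "$COPIES" ]; do
    cp "$IN" "$WORK/copy_$i"
    i=$((i + 1))
done

# Compress the folder as one solid LZMA archive. -md sets the dictionary
# size; compression memory is roughly 10.5x that value, so keep it well
# under available RAM. Guarded in case 7z is not installed.
if command -v 7z >/dev/null 2>&1; then
    7z a -t7z a.7z "$WORK" -md=64m -ms=on -m0=lzma
fi
```

For the next round you would duplicate `a.7z` itself into a fresh folder and compress again; the loop stops paying off once the archive contents are effectively incompressible.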