
Optimizing a large archive of many small files for decompression

2015-08-28
  • Michael Fraser

    Michael Fraser - 2015-08-28

    Hi Igor et al.

    Just looking for some input on the following scenario:

    I am using 7-Zip to archive a directory structure of about 20,000 files, each ranging from about 2 to 15 MB. The total size of the archive is about 42 GB compressed (split over 11 volumes).

    Unpacked, the data is about 45 GB, so it would seem the files are already well compressed. I'm just wondering what the best way would be to archive the directory structure to minimize the time taken to extract it?

    I saw solid blocks mentioned before as a way to tune memory usage; would that also affect extraction time?
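
    For reference, the command I am running is roughly the following; the archive name, source path, and volume size here are placeholders rather than my exact values:

        rem split the archive into ~4 GB volumes; no method or solid switches set yet
        7z a -t7z -v4g archive.7z C:\data\

    If solid blocks are the answer, I gather the relevant switch is -ms (e.g. -ms=on, -ms=off, or a block size such as -ms=64m).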

    Thanks in advance for any assistance you can provide.

    Michael

  • Igor Pavlov

    Igor Pavlov - 2015-08-28

    Just use 7z/LZMA2/solid in the latest 7-Zip 15.06 beta.
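
    In command-line terms that comes out as something like the following; the archive and directory names are placeholders, and the switches are the standard 7z ones:

        rem create a solid LZMA2 archive, split into 4 GB volumes
        7z a -t7z -m0=lzma2 -ms=on -v4g archive.7z C:\data\

        rem extract: point 7z at the first volume and it will find the rest
        7z x archive.7z.001 -oC:\restored

    Since the data barely compresses (45 GB down to 42 GB), a low level such as -mx=1, or even -mx=0 (store), should also cut both packing and extraction time considerably.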

