Dr. Martin Senftleben wrote:
> Hi all,
> I love dar and use it regularly for my backups. These are huge, as I
> also back up my photos - the total is 1 TB right now and will grow.
> Incremental backups are a must, otherwise the PC would run continuously
> just to do the backup. Backing up the 1 TB takes about 3 days.
> I need to compress the data, otherwise storage space will not suffice.
> This obviously slows things down. I use bzip2 (option -y), and I have
> now tried to lower the compression level to 7, but that doesn't seem
> to make much of a difference. How far down should I go, and what
> impact would that have on the backup size? Is there a visualisation
> of that, some kind of graph, so that I can get an idea?
I suspect that the compression ratio depends strongly on the data being
compressed: trying to compress an already-compressed file will not save
you much extra space, whatever compression level is used.
> Or do I have to try it out each time I do a backup?
I guess yes. If processing time is your main problem, you could start
with -y1 (fastest compression for bzip2), then, if that is fast enough,
increase to -y2 to see whether the gain in space is worth the extra
time spent ... and so on.
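If you want numbers before committing to a full backup run, a short loop over bzip2 levels on a representative sample of your data shows where the size gain flattens out (a sketch; the sample below is synthetic, so substitute one of your own large files):

```shell
#!/bin/sh
# Compare bzip2 output size at several compression levels.
# The sample file here is synthetic; replace it with a representative
# chunk of your real data to get meaningful numbers.
sample=$(mktemp)
yes "stand-in for real backup data 0123456789" | head -n 100000 > "$sample"
orig=$(wc -c < "$sample")
for lvl in 1 5 9; do
    size=$(bzip2 -c -"$lvl" "$sample" | wc -c)
    echo "level $lvl: $size bytes (original $orig bytes)"
done
rm -f "$sample"
```

Wrapping each iteration in `time` additionally shows the speed side of the trade-off.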
> I back up in slices of 4.3 GB, even though I'm not going to put them
> on DVD. It's all stored on an external hard drive (eSATA).
> I also wonder: is there a list of file types which cannot be compressed?
You can avoid compressing any file that is already compressed (*.bz2,
*.gz, *.zip, *.Z, *.dar, ...), as well as *.mp3 and *.avi files
(well, AVI is a container format, so the compression ratio can be good
for some codecs and very poor for others, like DivX).
> Because I noticed that the files I exclude with the -Z parameter are
> stored much faster. The 4.3 GB slices containing such files are done
> within 2 to 10 minutes, while the others take about 30 minutes each,
> sometimes even longer. I would like to make sure that all files that
> cannot be compressed are also not handled by bzip2.
> Are there any other ways to speed up dar, except for decreasing the
> compression ratio?
You can also avoid compressing files whose size is below a given
threshold: the -m option sets this threshold, which is 100 bytes by
default. You may want to increase it to a few KB, as the space gained
by compressing files of that size is at most a few KB multiplied by the
number of such files. In the end it all depends on how many such small
files you have and their average size.
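To see why a minimum-size threshold pays off at all, note that bzip2 carries a fixed per-file overhead, so a very small file can actually grow when compressed (a standalone illustration, not dar itself):

```shell
#!/bin/sh
# bzip2 output carries a fixed header/trailer overhead, so compressing
# a very small file can produce output larger than the input -- which
# is what a minimum-size threshold like dar's -m avoids.
small=$(mktemp)
printf 'just a short note\n' > "$small"
orig=$(wc -c < "$small")
comp=$(bzip2 -9 -c "$small" | wc -c)
echo "original: $orig bytes, bzip2 output: $comp bytes"
rm -f "$small"
```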
> My system is quite fast: AMD Athlon 64 X2 5200+ with 8 GB of RAM at
> 800 MHz, 64-bit system; the OS is Linux Mint, fully up to date.
As you noticed, compression is one of the most CPU-consuming operations
in dar. So far it relies on several external libraries (libz, libbz2,
etc.) that do not (yet) take advantage of multi-core CPUs. However,
there is work in progress in this area ...
> Martin Senftleben