My backup is 7 GB every day, so each run of this script moves about 15 GB of data:
7 GB to write the SQL file,
7 GB to read the SQL file back, and
1 GB to write the compressed gzip file.
It would be better if someone could change the script so that the SQL is simply piped into gzip. Then only 1 GB of the 15 GB would have to be written.
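A minimal sketch of the suggested change: stream the dump straight into gzip so the uncompressed SQL never touches disk. The `dump_sql` function here is a stand-in, since the actual dump command used by the script is not shown; in practice it would be something like `mysqldump "$DB"`.

```shell
#!/bin/sh
# Stand-in for the real dump command (e.g. mysqldump "$DB");
# emits SQL on stdout.
dump_sql() {
    printf 'SELECT 1;\n'
}

# Old approach (15 GB of I/O in the example above):
#   dump_sql > backup.sql
#   gzip backup.sql
# Piped approach: only the compressed file is ever written.
dump_sql | gzip > backup.sql.gz

# Decompress to stdout to confirm the round trip.
gzip -dc backup.sql.gz
```

Because gzip compresses from a pipe, the intermediate `.sql` file and the extra read/write passes disappear entirely.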
You make a good point. I'll look and see how easy that would be. You could always make the change and submit a diff? :)
I just checked; this was already done in the latest version (3.0-rc6).