I've been using the software for a few months, and it's been working really well.... Unfortunately, I've got to the point where the amount of data that I'm backing up (~550GB) takes more than 24 hours to compress and write to the backup medium.
Is there any way that the program could first check whether it is already running, and error out to prevent the next day's session from starting.....?
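For what it's worth, a guard like this can be sketched with flock(1) and a lock file. The lock file path and messages here are assumptions for illustration, not anything SLB actually does:

```shell
#!/bin/sh
# Sketch of a lock-file guard: exit early if a previous run still holds
# the lock. The lock file location is a hypothetical example.
LOCKFILE=/tmp/slb-backup.lock

# Open the lock file on fd 9 and try to take an exclusive, non-blocking
# lock; if another instance already holds it, report and exit instead of
# starting a second overlapping backup.
exec 9>"$LOCKFILE"
if ! flock -n 9; then
    echo "Backup already running; skipping this session." >&2
    exit 1
fi

echo "Lock acquired; running backup..."
# ... actual backup commands would go here ...
```

The lock is tied to the open file descriptor, so it is released automatically when the script exits, even if it crashes, which avoids the stale-pidfile problem of hand-rolled checks.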
The workaround I have until then is to make the program back up less frequently, by calling it once a week from cron manually rather than using the default schedule. Ideally, though, I'd want it to skip, say, the Tuesday run because Monday's was still in progress, then just do --newer runs for the rest of the week, until the next Monday when it runs in full again.
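The weekly-full-plus-daily-incremental pattern described above can be sketched as a single cron-driven script, assuming GNU tar. The paths, the timestamp file, and the script itself are hypothetical, not part of SLB:

```shell
#!/bin/sh
# Sketch of the schedule described above: a full backup on Monday, and
# on other days an incremental backup of files newer than a timestamp
# recorded at the last full run. All paths are assumptions.
SRC=/home
DEST=/mnt/backup
STAMP=/var/lib/backup/last-full

if [ "$(date +%u)" -eq 1 ]; then
    # Monday (day 1): full backup, then record the time so later
    # incremental runs know the cutoff.
    tar -czf "$DEST/full-$(date +%F).tar.gz" "$SRC"
    touch "$STAMP"
else
    # Other days: archive only files modified since the last full backup.
    tar -czf "$DEST/incr-$(date +%F).tar.gz" --newer "$STAMP" "$SRC"
fi
```

A crontab entry such as `0 2 * * * /usr/local/bin/weekly-backup.sh` would then run it nightly at 02:00, with the script itself deciding between full and incremental.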
Apart from that, thanks for a wonderful programme - it really does do what it says on the tin!
Regards
Chris
Yes, I have a similar problem. The amount of data I want to back up is very large (it's on a 3TB RAID array, currently about 1.2TB uncompressed). Luckily, the amount of change is usually rather small.
Long story, short question: I would be interested in changing the script so that it ONLY produces incremental backups, or alternatively changing the complete backup schedule to something like a run every three days, with a full backup only monthly...
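If incremental-only operation is the goal, GNU tar (which the tar.gz format suggests SLB uses) has a --listed-incremental mode worth knowing about. This is a sketch with hypothetical paths, not SLB's actual behaviour:

```shell
#!/bin/sh
# Sketch of incremental-only backups with GNU tar's snapshot file.
# The snapshot records metadata from previous runs, so each new archive
# contains only what changed since the last run. Paths are assumptions.
SRC=/data
DEST=/mnt/backup
SNAR=/var/lib/backup/slb.snar

# The first run, when the snapshot file does not exist yet, is a full
# (level-0) backup; every later run is automatically incremental.
tar -czf "$DEST/backup-$(date +%F-%H%M).tar.gz" \
    --listed-incremental="$SNAR" "$SRC"
```

Deleting the snapshot file forces the next run to be a full backup again, which gives a crude way to implement the "full backup only monthly" part of the request.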
P.S.
I haven't actually been using the backup script so far; instead I've done a plain, simple manual backup with split tar.gz files, which I then just write to an external hard drive.
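That manual approach can be written down in a couple of lines, assuming GNU tar and coreutils split(1); the paths and the 1 GB chunk size are arbitrary examples:

```shell
#!/bin/sh
# Sketch of the manual split-archive backup described above: a compressed
# tar stream piped through split(1) into fixed-size chunks that fit on
# the external drive. Paths and chunk size are assumptions.
SRC=/data
DEST=/mnt/external

tar -cz "$SRC" | split -b 1G - "$DEST/backup.tar.gz.part-"

# To restore, concatenate the parts back into one stream:
#   cat "$DEST"/backup.tar.gz.part-* | tar -xz
```

Because split names the chunks in lexical order (part-aa, part-ab, ...), a plain shell glob reassembles them in the right sequence.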
I will consider this for the future. However, SLB was not meant to back up that much data (see the web site, http://simplelinuxbkup.sourceforge.net/, where it says it has not been tested with very large backups). SLB was meant as a simple backup tool for non-technical users to back up their important data (e.g., documents and the like), not to back up huge amounts of data or entire systems. In fact, the tar.gz format used by SLB is terrible for extremely large backups.
Thanks for the feedback, though. I will consider some simple changes in a future release which may not completely fix the problem but may help.