From: Alan D. <We...@Om...> - 2005-09-30 12:55:25
On 9/29/2005 9:07 PM, Jamie Cameron wrote:
> On Fri, 2005-09-30 at 03:52, Alan Dobkin wrote:
>
>> As a possible resolution, perhaps each backup could be assigned a job
>> number to identify it, and/or the selection list could show other
>> attributes, such as the dump level and the parent job's schedule. Or,
>> better yet, it could use a move up/down interface similar to the rule
>> configuration tables in the Linux Firewall module.
>
> How about if the list of dumps includes the destination and level as
> well? That way you could more easily know which one is being selected.

That would be helpful, but there could still be multiple jobs with the
same destination and level, scheduled at different times. For the entry
to be truly unique, I think it would have to display all three fields
and/or a job number. The move up/down interface seems like the more
elegant direction, though, because it would make it easier to display
and modify the connections based on a parent/child relationship, and the
jobs would all appear in the correct order in the summary table. But
I'm sure the former is much easier to code than the latter, and it would
still be functional.

>> The other improvement that I would like to suggest is a way to retain
>> multiple backups and auto-expire them after a given amount of time.
>> For example, it is bad practice to overwrite a good backup with a
>> second (possibly bad) backup. Instead, the second backup should be
>> written to a second destination (file, tape, remote host, etc.), and
>> then the first backup can be deleted after a set amount of time. This
>> could be extended to retain any number of old backups before
>> expiring/deleting them, similar to how the logrotate program keeps a
>> certain number of old logs. In order to accomplish this now, I have
>> to create multiple backup jobs for each repetition, which is very
>> cumbersome. But again, I would welcome any suggestions if someone
>> knows of a better way to do this given the current interface.
> I can't really make any suggestions here, apart from perhaps doing
> backups with date-based filenames, and setting up a cron job to delete
> backups older than a certain number of days.
>
> - Jamie

I already set the filenames based on the directory that is being backed
up, but how would that allow retention of multiple backups of the same
directory? I guess I could create separate jobs for each day of the
week, for example, and then each would automatically overwrite the file
from the previous week. But setting that up for multiple directories
would be cumbersome. For example, all of these jobs would have to be
created separately:

  home-mon, home-tues, home-wed, ...
  etc-mon, etc-tues, etc-wed, ...

and the same again for var, usr, and so on. Is there an easier way to
accomplish something like this, or am I asking too much of this module?

Thanks,
Alan
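[For the archive: Jamie's date-based-filename idea can be sketched in a
few lines of POSIX shell. This is an illustration only, not anything the
module does itself; the directory names and the 14-day window are
assumptions.]

```shell
#!/bin/sh
# Hypothetical sketch: write each dump to a date-stamped file, then let
# a daily find(1) sweep expire old copies. One job per source directory
# then covers every day of the week without home-mon/home-tues/... jobs.

BACKUP_DIR=$(mktemp -d)            # stand-in for the real destination
STAMP=$(date +%Y-%m-%d)

# One file per source directory per day, e.g. home-2005-09-30.dump.
for src in home etc var; do
    : > "$BACKUP_DIR/$src-$STAMP.dump"   # placeholder for the real dump command
done

# Simulate a stale dump, then expire anything older than 14 days.
# The find line below is what the cron job would run, e.g.:
#   0 3 * * * find /var/backups -name '*.dump' -mtime +14 -delete
touch -t 200501010000 "$BACKUP_DIR/home-old.dump"
find "$BACKUP_DIR" -name '*.dump' -mtime +14 -delete
```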
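[The logrotate-style retention Alan asks about could also be
approximated without per-day jobs: shift each existing dump up one
numeric suffix before writing the new one, dropping the oldest. A
minimal shell sketch; `KEEP`, the `rotate` helper, and the file names
are illustrative assumptions, not part of the module.]

```shell
#!/bin/sh
# Sketch of logrotate-style retention for a single backup job:
# home.dump -> home.dump.1 -> home.dump.2, with the oldest removed.
KEEP=3
DIR=$(mktemp -d)

rotate() {
    f="$1"
    rm -f "$f.$KEEP"               # drop the oldest copy
    i=$KEEP
    while [ "$i" -gt 1 ]; do       # shift remaining copies up by one
        prev=$((i - 1))
        [ -f "$f.$prev" ] && mv "$f.$prev" "$f.$i"
        i=$prev
    done
    [ -f "$f" ] && mv "$f" "$f.1"  # most recent dump becomes .1
}

# Demonstrate three backup runs of the same job.
for run in 1 2 3; do
    rotate "$DIR/home.dump"
    echo "run $run" > "$DIR/home.dump"   # placeholder for the real dump
done
```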