From: Joe Z. <jz...@co...> - 2004-04-25 16:20:45
Rene Rask wrote:
>On Sun, 2004-04-25 at 07:18, Joe Zacky wrote:
>
>
>>Rene,
>>
>>I was going to add a routine to clean up incremental backups but now I'm
>>not sure. I was planning to remove incremental files that were backed up
>>over X days ago, but now I see the field text is "How many incremental
>>backups to keep (0 = infinite)", which seems to mean how many copies of
>>an incremental file to keep. So if incremental is set to 9, then there
>>could be up to 9 copies of any incremental file. This means the
>>incremental backup directory could be up to 9 times as large as the
>>current backup.
>>
>>Here are some points to ponder:
>>
>>If we keep X copies of incrementals:
>>o The amount of storage required "could" be much more than if we kept
>>X days.
>>o It's going to be programmatically difficult and time-consuming to
>>read through all the file names in the incremental store and count how
>>many there are of each.
>>
>>If we keep X days worth of incrementals:
>>o The only backup copy of a file would be deleted after X days if the
>>file hasn't been changed. That's probably not a good thing.
>>o Removing the incremental files is easily done with a command like this:
>> find /path/to/bobsdata/incremental/<server>/<share>/ -type f \
>>   -ctime +<days> -ctime -999 -exec rm -v {} ';'
>>
>>So I'm wondering 1) what your intention is for this field, 2) what you
>>are doing on your bobs system to clean up incremental files, and 3) how
>>you suggest I proceed?
>>
>>The question is what do we want our selection criteria to be: "how many"
>>or "how long?"
>>
>>
>>
>
>Hi Joe
>Here is my view on it.
>
>My idea was to have X days' worth of files in the incremental store. That
>would allow me to say that I want 90 days of incremental backups. In my
>situation, if a file hasn't changed for 90 days it's probably done.
>
>
Here you're saying that X is "how long".
>I have no problem with not having a backup of a file in incrementals. If
>that is a problem I need to increase the timespan I keep the files, and
>possibly increase my disk capacity if needed. That is not a BOBS
>problem, but a backup administrator's problem.
>
>The way I delete files is to set some limits, say files older than 6
>months AND larger than 500 MB (and sometimes by type, like .avi, .mov,
>.mpg and so on, depending on the work being done).
>Generally my tolerance towards large files is biased so they are deleted
>first.
>A 500 MB file is seldom "from scratch" work but something made from
>another file, like a movie rendering. Exceptions are files like
>Photoshop (.psd) files, which can be large and original works.
>
>That is just my situation others probably have different scenarios.
>
>Another feature I like is to have BOBS decide when incrementals are
>deleted. Say, when only 50 GB of disk space is left, delete until there
>is 100 GB free, oldest files first. (I guess this will be a problem when
>the various backups have different time spans.)
>
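That disk space idea could probably be prototyped with a loop along these
lines. This is only a sketch on my part - the store path and the
thresholds are placeholders, and it picks the oldest file by ctime (which
on my systems is the time the file landed on the bobs box):

  # Sketch only: delete oldest incrementals until 100 GB is free.
  STORE=/path/to/bobsdata/incremental
  while [ "$(df -P --block-size=1G "$STORE" | awk 'NR==2 {print $4}')" -lt 100 ]; do
      # oldest file by ctime, i.e. the time it was written to bobs
      oldest=$(find "$STORE" -type f -printf '%C@ %p\n' | sort -n \
               | head -n 1 | cut -d' ' -f2-)
      [ -n "$oldest" ] || break    # store is empty, nothing to delete
      rm -v "$oldest"
  done

And I agree the different time spans per backup would complicate this.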
>Anyway, a simple "delete after X incremental backups" would be a good
>start, with X not being days, since we can't be sure the backup is run
>every day. Days would require a little date manipulation or some other
>system of telling when and what to delete.
>
>
Here you're saying X is "how many." I'm confused: should X be the number
of days or the number of files?
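For what it's worth, if X ends up meaning "how many", the counting I was
worried about might look something like this, run per share. Fair warning:
I'm guessing that each backup run leaves a date-tagged copy next to the
others (e.g. report.doc.20040425); I haven't checked how bobs actually
names them:

  # Sketch only: keep the newest KEEP dated copies of each file.
  # The date-suffix naming (<file>.<YYYYMMDD>) is a guess on my part.
  KEEP=9
  cd /path/to/bobsdata/incremental/SERVER/SHARE || exit 1
  ls | sed 's/\.[0-9]\{8\}$//' | sort -u | while read -r base; do
      # newest copies first; anything past the first KEEP gets removed
      ls -t "$base".[0-9]* 2>/dev/null | tail -n +$((KEEP + 1)) \
          | xargs -r rm -v
  done

Running that over every file in every share is the part I expect to be
slow.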
>Please use the database(s) to search for files to delete. The time on
>disk is not accurate; I think it reflects the time the file was last
>changed (on the originating server).
>The database has the correct information via the date tagging
>system.
>I used the find command when deleting, but I know it isn't the correct
>way to do it.
>
>
The tests I ran on 2 of my Red Hat systems showed that using find with
-ctime picked up the date the file was backed up. That is, not the date it
was last modified (-mtime), but the date it was rsync'd to bobs.
I didn't realize that information was in the database. I never did
anything with the database so I wasn't thinking about it. I'll have to
play with that - I agree that sounds like the right way to find the files.
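Once I've looked at the schema, I imagine the cleanup boils down to
something like this. To be clear: I'm assuming the database is MySQL, and
the table and column names below are pure guesses until I've actually
poked around in it:

  # Sketch only: table/column names are guesses, not the real bobs schema.
  DAYS=90
  mysql -N -e "SELECT path FROM incremental
      WHERE backup_date < DATE_SUB(NOW(), INTERVAL $DAYS DAY)" bobs |
  while read -r f; do
      rm -v "$f"
  done

We'd presumably also want to delete the matching rows afterwards.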
>
>Cheers
>Rene
>
>
>