From: Matt <lan...@gm...> - 2006-07-31 23:41:02
Hi,

I am using the rsyncd transfer method and would like to have each new backup be based on the most recent incremental, thus completely avoiding full backups after the first initial backup.

Is this now possible with version 3.0.0 beta? Maybe $Conf{IncrLevels} can be used to this end, but I don't see how.

Regards,

... Matt
From: Craig B. <cba...@us...> - 2006-08-01 05:37:31
Matt writes:
> I am using the rsyncd transfer method and would like to have each new
> backup be based on the most recent incremental, thus completely
> avoiding full backups after the first initial backup.
>
> Is this now possible with version 3.0.0 beta? Maybe $Conf{IncrLevels}
> can be used to this end, but I don't see how.

Unfortunately that won't work yet, even in 3.0.0. In theory you could set $Conf{IncrLevels} to a long, incrementing sequence, and set $Conf{FullPeriod} to be large too. However, for each new backup, the full and all the incrementals have to be merged together to get the most recent backup filled; that merged view is used as the reference for the next incremental. The problem is that this takes more and more time for each backup (i.e., after a week every directory on the client requires 7 directory reads on the server; after a month it is 30). Also, no backups can be expired.

What's required is one more feature: filling in incrementals as older backups are expired. That way old backups can be deleted, but a "filled" backup is kept so that more recent backups can still be reconstructed.

Note that with rsync a full backup (after the first) doesn't involve a lot more network traffic than an incremental. In 3.0.0 the most recent (merged) backup is used as the reference (rather than the last full, as in 2.x), so files changed after the last full but before the most recent incremental won't be transferred again. The other reason to do a full is that the actual file contents are checked; incrementals just check metadata.

Bottom line: you should still do periodic fulls.

Craig
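For concreteness, the scheme Craig describes as theoretically possible (but impractical) would look roughly like this in config.pl. This is a sketch only; the numbers are purely illustrative, not a recommendation:

```perl
# Illustrative sketch only -- the "endless incrementals" setup the
# thread warns against.  All values here are hypothetical.
$Conf{FullPeriod}  = 365.0;        # push the next full far into the future
$Conf{IncrPeriod}  = 0.97;         # roughly one incremental per day
$Conf{IncrLevels}  = [1 .. 300];   # each backup one level deeper than the last
$Conf{IncrKeepCnt} = 300;          # the whole chain must be kept; none can expire
```

As explained above, every backup in the chain has to be merged to build the reference for the next one, so this gets slower every day and nothing can be expired.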
From: Rob M. <ro...@di...> - 2006-08-01 12:33:28
Hello all, I am new to this list....

I was looking around the archives for a method of backing up the localhost. I could not find anything for a newbie.... I did come across a small email about using a conf file named localhost.pl that was supposed to be in the package, however I did not see it in my tarball...

Can someone suggest what I should do, or point me to some docs?

Thanks, have a great day!

Rob Morin
Dido InterNet Inc.
Montreal, Canada
http://www.dido.ca
514-990-4444

Matt wrote:
> I am using the rsyncd transfer method and would like to have each new
> backup be based on the most recent incremental, thus completely
> avoiding full backups after the first initial backup.

-------------------------------------------------------------------------
Take Surveys. Earn Cash. Influence the Future of IT
Join SourceForge.net's Techsay panel and you'll get the chance to share your opinions on IT & business topics through brief surveys -- and earn cash
http://www.techsay.com/default.php?page=join.php&p=sourceforge&CID=DEVDEV
_______________________________________________
BackupPC-users mailing list
Bac...@li...
https://lists.sourceforge.net/lists/listinfo/backuppc-users
http://backuppc.sourceforge.net/
From: Les M. <le...@fu...> - 2006-08-01 13:45:14
On Tue, 2006-08-01 at 07:33, Rob Morin wrote:
> I was looking around the archives for a method of backing up the
> localhost. I could not find anything for a newbie.... I did come
> across a small email about using a conf file named localhost.pl that
> was supposed to be in the package, however I did not see it in my
> tarball...
>
> Can someone suggest what I should do, or point me to some docs?

There is some extra overhead in running ssh back to the local machine like you would for any other host, but it usually doesn't matter. As long as the backups are done by the next morning I wouldn't bother making it a special case.

--
Les Mikesell
le...@fu...
From: Randy B. <ra...@el...> - 2006-08-01 14:35:06
Les Mikesell wrote:
> There is some extra overhead in running ssh back to the local
> machine like you would for any other host, but it usually doesn't
> matter. As long as the backups are done by the next morning I
> wouldn't bother making it a special case.

This actually depends on how old your machine is. I run an older PIII, and the ssh overhead was enough to cause each backup to take a really long time. I would recommend setting up sudo permissions for the backuppc user and changing the local backup command to use sudo instead of ssh...

R
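A rough per-PC sketch of that suggestion for the rsync method, assuming rsync lives at /usr/bin/rsync and a matching sudoers entry exists (the paths and the sudoers line are examples, not taken from any particular install):

```perl
# Per-PC config for localhost: invoke rsync via sudo instead of ssh.
# Assumes a sudoers entry like:
#   backuppc ALL=(root) NOPASSWD: /usr/bin/rsync
$Conf{XferMethod}            = 'rsync';
$Conf{RsyncClientCmd}        = '/usr/bin/sudo $rsyncPath $argList+';
$Conf{RsyncClientRestoreCmd} = '/usr/bin/sudo $rsyncPath $argList+';
```

This avoids the ssh encryption overhead entirely on the local machine while still letting rsync read files as root.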
From: Rob M. <ro...@di...> - 2006-08-01 20:21:11
Thanks! I figured once the first pass is done the rest will not be as bad... I will wait and see what happens; however, I will create this handy conf file anyway....

Thanks for all the replies!

Rob Morin
Dido InterNet Inc.
Montreal, Canada
http://www.dido.ca
514-990-4444

daniel berteaud wrote:
> Hello, I worked on the integration of BackupPC into the SME server
> distribution. I use it to back up several hosts, including the
> localhost. For the localhost, I use this per-PC config:
>
> ##### start of the per pc config #####
>
> $Conf{TarShareName} = ['/'];
>
> $Conf{BackupFilesExclude} = ['/proc','/sys','/dev','/tmp','/home/e-smith/files/ibays/backup'];
>
> $Conf{XferMethod} = 'tar';
>
> $Conf{TarClientCmd} = '/usr/bin/sudo'
>        . ' $tarPath -c -v -f - -C $shareName'
>        . ' --totals';
>
> $Conf{TarFullArgs} = '$fileList';
>
> $Conf{CompressLevel} = 3;
>
> $Conf{TarIncrArgs} = '--newer=$incrDate $fileList';
>
> $Conf{TarClientRestoreCmd} = '/usr/bin/sudo'
>        . ' $tarPath -x -p --numeric-owner --same-owner'
>        . ' -v -f - -C $shareName+';
>
> ##### end of the per pc config #####
>
> For this to work, you need to allow the backuppc user to run tar with
> sudo. I use this line in /etc/sudoers:
>
> backuppc ALL=(root) NOPASSWD:/bin/tar
>
> I hope this can help.
From: Rob M. <ro...@di...> - 2006-08-02 14:44:23
I had to use the user www-data as the backup user, because there is a problem for me using sperl on all my Debian systems for some reason.... Now, if I add www-data (which my Apache runs as) to sudoers, does this introduce a security issue?

Thanks...

Rob Morin
Dido InterNet Inc.
Montreal, Canada
http://www.dido.ca
514-990-4444

daniel berteaud wrote:
> For this to work, you need to allow the backuppc user to run tar with
> sudo. I use this line in /etc/sudoers:
>
> backuppc ALL=(root) NOPASSWD:/bin/tar
From: daniel b. <da...@fi...> - 2006-08-02 15:21:12
On Wed, 02 Aug 2006 10:44:18 -0400 Rob Morin <ro...@di...> wrote:
> I had to use the user www-data as the backup user, because there is
> a problem for me using sperl on all my Debian systems for some
> reason.... Now, if I add www-data (which my Apache runs as) to
> sudoers, does this introduce a security issue?

Well, I'm quite new to Linux, and maybe other users will tell you that you shouldn't give www-data sudo permission. Anyway, if you want BackupPC to back up your localhost, the user who runs BackupPC needs root permissions to reach some system files.

I don't think adding these lines to sudoers introduces much security risk, because you give root permission only for the /bin/tar program (or the /bin/rsync).

Of course it would be better to run BackupPC as a special user. If you have problems using sperl, I saw that you can run a special instance of Apache under another user, specially for BackupPC. Maybe that would be more secure.

--
Daniel Berteaud
FIREWALL-SERVICES SARL.
Société de Services en Logiciels Libres
Technopôle Montesquieu
33650 MARTILLAC
Tel : 05 56 64 15 32
Fax : 05 56 64 82 05
Mail: daniel@Firewall-Services.com
Web : http://www.firewall-services.com
From: Rob M. <ro...@di...> - 2006-08-03 15:30:06
Another question: why does the backup seem to start at 11am? Is there a place to alter this? My server (localhost) is now backing itself up and the load is at 6.00, which makes other services slow on the machine....

But it should do just an incremental backup now, right, since it did its first backup yesterday?

Any help appreciated....

Thanks..

Rob Morin
Dido InterNet Inc.
Montreal, Canada
http://www.dido.ca
514-990-4444

daniel berteaud wrote:
> I don't think adding these lines to sudoers introduces much security
> risk, because you give root permission only for the /bin/tar program
> (or the /bin/rsync).
>
> Of course it would be better to run BackupPC as a special user.
From: daniel b. <da...@fi...> - 2006-08-03 16:03:12
Well, it should do incremental backups, depending on the configuration. You can verify the type of each backup in the CGI interface.

You can specify blackout periods in the per-PC configuration file, so that backups only occur when you want them to. For example (this is the default configuration):

$Conf{BlackoutPeriods} = [
    {
        hourBegin => 7.0,
        hourEnd   => 19.5,
        weekDays  => [1, 2, 3, 4, 5],
    },
];

With this, backups won't occur (if the host is always connected; as we are talking about the localhost, it will always be connected) from 7:00 to 19:30 on weekdays (Monday to Friday). You can define several blackout periods.

On Thu, 03 Aug 2006 11:29:57 -0400 Rob Morin <ro...@di...> wrote:
> Another question: why does the backup seem to start at 11am? Is there
> a place to alter this? My server (localhost) is now backing itself up
> and the load is at 6.00, which makes other services slow on the
> machine....
>
> But it should do just an incremental backup now, right, since it did
> its first backup yesterday?
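To illustrate defining several blackout periods, here is a sketch with two of them; the second (Saturday-morning) window is made up for the example:

```perl
# Two blackout periods: weekday business hours plus Saturday mornings.
# weekDays uses 0 = Sunday .. 6 = Saturday; hours are decimal (19.5 = 19:30).
$Conf{BlackoutPeriods} = [
    {
        hourBegin => 7.0,
        hourEnd   => 19.5,
        weekDays  => [1, 2, 3, 4, 5],
    },
    {
        hourBegin => 8.0,
        hourEnd   => 12.0,
        weekDays  => [6],
    },
];
```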
From: Les M. <le...@fu...> - 2006-08-03 17:36:56
On Thu, 2006-08-03 at 18:01 +0200, daniel berteaud wrote:
> You can specify blackout periods in the per-PC configuration file, so
> that backups only occur when you want them to. For example (this is
> the default configuration):
>
> $Conf{BlackoutPeriods} = [
>     {
>         hourBegin => 7.0,
>         hourEnd   => 19.5,
>         weekDays  => [1, 2, 3, 4, 5],
>     },
> ];
>
> With this, backups won't occur from 7:00 to 19:30 on weekdays (Monday
> to Friday). You can define several blackout periods.

To be considered 'always connected', some number of pings have to succeed outside of the blackout period. I usually start the initial backup of a new machine late in the day. Subsequent automatic runs won't start until 24 hours have passed; then, once the target is determined to be always available, the runs won't start until after the blackout period. If you force an incremental just before you leave work you'll shift into that cycle.

--
Les Mikesell
les...@gm...
From: Rob M. <ro...@di...> - 2006-08-03 18:01:30
Ahh, I see. OK, so if I force an incremental for all machines at, say, midnight or 11pm, they will always back up at that time then? OK, cool...

Another quick question: rather than do the rsync thing over ssh, I want to use a conf file someone on the list provided me, to be placed in the PC's name dir, i.e. pc/localhost. Now, do I name the file config.pl, or after the directory/PC, i.e. localhost.pl? I first named it localhost.pl and it still used ssh rather than tar.

Thanks dude!

Rob Morin
Dido InterNet Inc.
Montreal, Canada
http://www.dido.ca
514-990-4444

Les Mikesell wrote:
> To be considered 'always connected', some number of pings have to
> succeed outside of the blackout period. I usually start the initial
> backup of a new machine late in the day. Subsequent automatic runs
> won't start until 24 hours have passed; then, once the target is
> determined to be always available, the runs won't start until after
> the blackout period. If you force an incremental just before you
> leave work you'll shift into that cycle.
From: Rob M. <ro...@di...> - 2006-08-03 18:06:53
By the way, the backup took 12 hours to do, and meanwhile the load was over 7 the whole time.... Whoa! That was for 65 gigs.

Rob Morin
Dido InterNet Inc.
Montreal, Canada
http://www.dido.ca
514-990-4444

Rob Morin wrote:
> Ahh, I see. OK, so if I force an incremental for all machines at, say,
> midnight or 11pm, they will always back up at that time then? OK,
> cool...
From: Les M. <le...@fu...> - 2006-08-03 20:14:24
On Thu, 2006-08-03 at 14:00 -0400, Rob Morin wrote:
> Ahh, I see. OK, so if I force an incremental for all machines at, say,
> midnight or 11pm, they will always back up at that time then? OK,
> cool...

The next one won't start until about 24 hours have elapsed since the last run; then some other things could defer it.

> Another quick question: rather than do the rsync thing over ssh, I
> want to use a conf file someone on the list provided me, to be placed
> in the PC's name dir, i.e. pc/localhost. Now, do I name the file
> config.pl, or after the directory/PC, i.e. localhost.pl? I first named
> it localhost.pl and it still used ssh rather than tar.

Naming it config.pl in the pc/hostname directory will work. This file only needs to contain the settings that differ from the global config.pl file.

--
Les Mikesell
le...@fu...
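For instance, a minimal pc/localhost/config.pl could contain nothing but the overrides; the two settings shown here are just examples:

```perl
# pc/localhost/config.pl -- only the settings that differ from the
# global config.pl; everything else is inherited from it.
$Conf{XferMethod}   = 'tar';
$Conf{TarShareName} = ['/'];
```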
From: Rob M. <ro...@di...> - 2006-08-04 13:09:59
How can I tell if it used the config.pl in the pc folder or the main one?

It seemed to have worked OK last night, but I'm not sure if it used the new config or the main one. The new config uses this for backing up the localhost:

##### start of the per pc config #####

$Conf{TarShareName} = ['/'];

$Conf{BackupFilesExclude} = ['/proc','/sys','/dev','/tmp','/var/log', '/mnt/usb', '/data/var/log', '/data/var/spool', '/data/BACKUP'];

$Conf{XferMethod} = 'tar';

$Conf{TarClientCmd} = '/usr/bin/sudo'
       . ' $tarPath -c -v -f - -C $shareName'
       . ' --totals';

$Conf{TarFullArgs} = '$fileList';

$Conf{CompressLevel} = 3;

$Conf{TarIncrArgs} = '--newer=$incrDate $fileList';

$Conf{TarClientRestoreCmd} = '/usr/bin/sudo'
       . ' $tarPath -x -p --numeric-owner --same-owner'
       . ' -v -f - -C $shareName+';

##### end of the per pc config #####

Thanks..

Rob Morin
Dido InterNet Inc.
Montreal, Canada
http://www.dido.ca
514-990-4444

Les Mikesell wrote:
> The next one won't start until about 24 hours have elapsed since the
> last run; then some other things could defer it.
>
> Naming it config.pl in the pc/hostname directory will work. This file
> only needs to contain the settings that differ from the global
> config.pl file.
From: daniel b. <da...@fi...> - 2006-08-04 13:41:10
The main config is always used, but in the per-PC config file you can define some parameters that override (I don't know if you say it this way in English, I'm French :/) the same parameters in the main config.pl file.

For example, with your per-PC config, all the main variables will be used except these ones, which will be read from the per-PC configuration file:

$Conf{TarShareName}
$Conf{BackupFilesExclude}
$Conf{XferMethod}
$Conf{TarClientCmd}
$Conf{TarFullArgs}
$Conf{CompressLevel}
$Conf{TarIncrArgs}
$Conf{TarClientRestoreCmd}

On Fri, 04 Aug 2006 09:09:55 -0400 Rob Morin <ro...@di...> wrote:
> How can I tell if it used the config.pl in the pc folder or the main
> one?
>
> It seemed to have worked OK last night, but I'm not sure if it used
> the new config or the main one.

--
Daniel Berteaud
FIREWALL-SERVICES SARL.
Société de Services en Logiciels Libres
Technopôle Montesquieu
33650 MARTILLAC
Tel : 05 56 64 15 32
Fax : 05 56 64 82 05
Mail: daniel@Firewall-Services.com
Web : http://www.firewall-services.com
From: Les M. <le...@fu...> - 2006-08-04 13:39:46
On Fri, 2006-08-04 at 08:09, Rob Morin wrote:
> How can I tell if it used the config.pl in the pc folder or the main
> one?
>
> It seemed to have worked OK last night, but I'm not sure if it used
> the new config or the main one.

If you view the XferLOG in the web interface (under the backup summary list) it will show the command used. More generically, you can see the last time a file was accessed with ls -lu filename, so if nothing else read it after the run you might be able to tell from the timestamp (unless the backup itself reads it).

--
Les Mikesell
le...@fu...
From: Rob M. <ro...@di...> - 2006-08-04 13:43:37
OK, cool, thanks. I forgot that that log was there.... I see this, so I guess it's using tar!

File /mnt/usb/BACKUP/pc/localhost/XferLOG.1.z

Contents of file /mnt/usb/BACKUP/pc/localhost/XferLOG.1.z, modified 2006-08-04 02:09:50

Running: /usr/bin/sudo /bin/tar -c -v -f - -C / --totals --newer=2006-08-02 10:00:01 --exclude=./proc --exclude=./sys --exclude=./dev --exclude=./tmp --exclude=./var/log --exclude=./mnt/usb --exclude=./data/var/log --exclude=./data/var/spool --exclude=./data/BACKUP .
Xfer PIDs are now 32057,32056
  create 755 0/0       0 .
  create 755 0/0       0 lost+found
  create 755 0/0       0 etc
  create 644 0/0     624 etc/fstab
  create 755 0/0       0 etc/mkinitrd
  create 755 0/0       0 etc/mkinitrd/scripts
  create 755 0/0       0 etc/network

That's great, thanks!

Rob Morin
Dido InterNet Inc.
Montreal, Canada
http://www.dido.ca
514-990-4444

Les Mikesell wrote:
> If you view the XferLOG in the web interface (under the backup summary
> list) it will show the command used.