From: Chris K. <kac...@co...> - 2004-10-09 00:23:04
|
I went ahead and tried xtar; it does seem to support the options, but it is still failing, I think because the tar command on the Linux machine is confused about what is being sent over the ssh pipe. I get errors like:

tarExtract: Unable to open /data3/bbackups/pc/ski/new/f%2f/fUsers/fski/f.rsrc/fDocuments for writing
tarExtract: Botch, no matches on /data3/bbackups/pc/ski/new/f%2f/fUsers/fski/f.rsrc/fDocuments (93011f76845edf2f4a83dedf0b12fa3b) pool d 700 501/501 64 Users/ski/.rsrc/Documents
tarExtract: Unable to open /data3/bbackups/pc/ski/new/f%2f/fUsers/fski/f.rsrc/attrib for writing
tarExtract: Botch, no matches on /data3/bbackups/pc/ski/new/f%2f/fUsers/fski/f.rsrc/attrib (de1d0b927858d15d7c2e3f4f989c180d)

Any help is appreciated.

cheers, ski
--
"When we try to pick out anything by itself, we find it connected to the entire universe" John Muir
Chris "Ski" Kacoroski, kac...@co..., 435-681-0092 |
From: Chris K. <kac...@co...> - 2004-10-09 00:05:53
|
Hi, I found the thread on Mac workstation backup, but did not see that it ever came to any resolution. I looked at using xtar, but it doesn't seem to support all the options that backuppc requires (e.g. the --exclude and -C options).

I am using RsyncX now (and it does seem to back up resource forks just fine to Linux servers), but am not very happy with the load on the server. I understand that this load should be reduced some with BackupPC, since BackupPC caches the checksums. Is this correct? Has anyone used BackupPC with RsyncX (it seems to just be a patched version of rsync)?

Thanks in advance.

ski
--
"When we try to pick out anything by itself, we find it connected to the entire universe" John Muir
Chris "Ski" Kacoroski, kac...@co..., 435-681-0092 |
From: Lorrin N. <lhn...@ne...> - 2004-10-08 16:31:29
|
On 10/8/2004 4:26 AM, Lutz Atzert wrote:

> After an incremental backup I got the output: Got fatal error during
> xfer (aborted by signal=ALRM)
> Backup aborted by user signal
> What can this mean?
> Any hints?
> Lutz

Have you tried searching the message archives? A potential answer to your question was posted less than a month ago:

http://sourceforge.net/mailarchive/message.php?msg_id=9492438

Also see http://sourceforge.net/mailarchive/message.php?msg_id=9343330

-Lorrin |
From: <jse...@co...> - 2004-10-08 13:33:36
|
>Message: 1
>To: Oliver Freyd <Oli...@io...>
>cc: bac...@li...
>From: Craig Barratt <cba...@us...>
>Subject: Re: [BackupPC-users] Newbie -- tarextract errors (WinXX)
>Date: Tue, 05 Oct 2004 21:35:09 -0700
>
>Oliver Freyd writes:
>
>> I've happily found out that multiple excludes seem to work
>> in samba-3.0.6. I've run into the same problem as I wanted to
>> back up a Windows machine used by a programmer, with lots of
>> compiler-generated files...
>>
>> Multiple excludes did not work with samba 3.0.2, but on another Linux
>> machine I had 3.0.6, and there it worked! Indeed after installing that
>> it worked on the backuppc machine too.
>
>You're right! The patch I submitted 1 year ago was applied in
>March, so that implies 3.0.3 and beyond have the fix.
>
> https://bugzilla.samba.org/show_bug.cgi?id=389
>
>Craig

Well, I upgraded my smb to 3.0.7 (I was running 2.2.something.a with some security patch roll-up, the default RH9 version) and even though the exclude files didn't work, everything else did. No errors, and all files were backed up!

Now that it's working I don't think I'm going to exclude anything anymore, but I am curious to know why the exclude list didn't work. Is it because I didn't exclude the directories from the get-go? In other words, I was doing all of "My Documents", and after 1 full backup and several incremental backups I then added the following:

$Conf{BackupFilesExclude} = ['/My Music/*', '/My Videos/*'];

Last night when I ran a full backup those directories were backed up (mind you, correctly for the first time since I upgraded samba, which is why I might as well remove them from the exclude list now :-) |
From: Lutz A. <slo...@ca...> - 2004-10-08 11:27:01
|
After an incremental backup I got the output:

Got fatal error during xfer (aborted by signal=ALRM)
Backup aborted by user signal

What can this mean? Any hints?

Lutz |
From: Craig B. <cba...@us...> - 2004-10-08 06:42:45
|
"Nathan Affleck" writes:

> It's got to be possible, but it's not clear what needs to be done. I simply
> want to backup a specific directory under the Documents and Settings
> directory... For eg:
>
> $Conf{BackupFilesOnly} = '/Documents and Settings/*/My Documents';
>
> Now, I know this is a problem with smbclient. But it sounded like there was
> some ray of hope if one patched their smbclient with a specific regex hack?
> I'm running the smbclient from Fedora Core 2 (specifically
> samba-3.0.7-2.FC2). Was the tweak that has been mentioned incorporated into
> the Fedora Core rpm? How can you tell?
>
> OR, is the tweak some regex option that needs to be implemented at
> compile time (-Tr???) How? I didn't notice any --with-regex option with
> the latest samba release.

Unfortunately, I don't believe you can use a wildcard or regexp in $Conf{BackupFilesOnly}. Yes, smbclient can be compiled with HAVE_REGEX_H defined, and then regexps can be used for exclude arguments. However, I agree that HAVE_REGEX_H doesn't appear in samba's configure, so I would guess you have to set it manually.

I'm attaching some partially done FAQ on this issue.

Craig

###########################################################################
# Excluding and including files

=head1 How do I exclude and include particular files?

This is done with the two settings $Conf{BackupFilesOnly} and $Conf{BackupFilesExclude}.

$Conf{BackupFilesOnly} is the list of directories or files to backup. Only these directories or files will be backed up. $Conf{BackupFilesOnly} does not accept wildcards.

$Conf{BackupFilesExclude} can be used to exclude particular directories or types of files, eg:

    $Conf{BackupFilesExclude} = '/proc';
    $Conf{BackupFilesExclude} = ['/proc', '/tmp'];
    $Conf{BackupFilesExclude} = ['*.mp3', '*.tmp'];

The exact behavior depends upon the XferMethod used. For example, smb uses SmbClient, and it has limited capability for including and excluding particular files. 
In particular, with smb:

=over 4

=item * Only a single option (either $Conf{BackupFilesOnly} or $Conf{BackupFilesExclude}) can be used.

=item * Very limited (dos-like) wildcards are accepted. For example, this:

    $Conf{BackupFilesOnly} = '/Documents and Settings/craig/My Documents';

will work as expected: only the given folder will be backed up. However, this:

    $Conf{BackupFilesOnly} = '/Documents and Settings/*/My Documents';

will not work.

=item * smbclient only accepts a single $Conf{BackupFilesExclude} argument. This is fixed starting in samba-3.0.3.

=back

=head1 How do I use rsync to include and exclude files?

Rsync and rsyncd provide the most flexible methods for including and excluding files. The basic features are supported by $Conf{BackupFilesOnly} and $Conf{BackupFilesExclude}, as explained above.

$Conf{BackupFilesOnly} is relative to $Conf{RsyncShareName}. For example, if $Conf{RsyncShareName} is /home, then:

    $Conf{BackupFilesOnly} = ['/craig', '/bill'];

will backup only /home/craig and /home/bill. The same applies to rsyncd: $Conf{BackupFilesOnly} is relative to the module directory. For example, if a module called 'home' points to /home, then the above example will backup only /home/craig and /home/bill.

$Conf{BackupFilesExclude} can use quite flexible wildcards. If an entry starts with a '/' then it refers to an absolute path, rooted at the $Conf{RsyncShareName}. Significantly more general exclude and include options can be specified by directly setting rsync arguments. See the rsync manual page for more information.

For rsync, $Conf{BackupFilesOnly} and $Conf{BackupFilesExclude} are implemented by converting them into the --include and --exclude arguments to rsync. $Conf{BackupFilesExclude} converts directly to --exclude arguments, and $Conf{BackupFilesOnly} might translate to several --include and --exclude arguments so that just the desired directories are included. 
See the rsync arguments in the XferLOG file if you want to see the result of this translation. It is possible to mix $Conf{BackupFilesOnly}, $Conf{BackupFilesExclude} and additional arguments to rsync.

=head1 How do I backup just My Documents?

For a particular user, eg craig, you can set:

    $Conf{BackupFilesOnly} = '/Documents and Settings/craig/My Documents';

However, smbclient doesn't expand wildcards in the middle of paths, so this will not work with smbclient to backup all the users' My Documents folders:

    $Conf{BackupFilesOnly} = '/Documents and Settings/*/My Documents';

If you want to use smbclient, one choice is to place all the users' documents in a single folder, as suggested by Erich Vinson:

=over 4

=item * Make one folder, C:\docs.

=item * Create sub-folders under that, C:\docs\user1 etc.

=item * Set permissions so that user1 can't access user2's folder, etc.

=item * Set each user's 'My Documents' location to point to the new folder, answering 'yes' when asked if you want to move the files.

=item * Create a user called 'backup'.

=item * Share C:\docs as 'docs', giving only the backup user permission on the share (remember share permissions are different than NTFS permissions).

=item * If Outlook is used, move the outlook.pst (and possibly archive.pst and extend.dat) to the new location. Outlook will whine about not being able to find its PST file; just point it in the right direction and it will continue chugging right along.

=item * Then just backup the 'docs' share on each machine. It has worked extremely well for me.

=back |
From: Nathan A. <nea...@uc...> - 2004-10-08 00:10:19
|
Hi, I'm new to the BackupPC scene. Loving it. Might just be the ticket for us. I've searched through the mailing lists and the web, and seem to be slightly confused on the use of the BackupFilesOnly option. I'm using the smb Xfer option, and I think it's pretty much the only choice for now (without distributing rsyncd and cygwin on a ton of machines)...

It's got to be possible, but it's not clear what needs to be done. I simply want to backup a specific directory under the Documents and Settings directory... For eg:

$Conf{BackupFilesOnly} = '/Documents and Settings/*/My Documents';

Now, I know this is a problem with smbclient. But it sounded like there was some ray of hope if one patched their smbclient with a specific regex hack? I'm running the smbclient from Fedora Core 2 (specifically samba-3.0.7-2.FC2). Was the tweak that has been mentioned incorporated into the Fedora Core rpm? How can you tell?

OR, is the tweak some regex option that needs to be implemented at compile time (-Tr???) How? I didn't notice any --with-regex option with the latest samba release.

Thanks all, ..Nate |
From: Ryan L. <rya...@gm...> - 2004-10-07 21:00:47
|
If you really wanted to do this... and for god's sake I don't know why... you could just set up a simple cron job that would rotate the output: either umount and mount each drive at certain times, or hell, just rotate a symlink location every 24 hours... OR JUST USE RAID

Ryan

On Thu, 07 Oct 2004 09:49:41 -0700, Craig Barratt <cba...@us...> wrote:
> DoM writes:
>
> > So it will be more complicated to do this job... and i am trying to find
> > simplest solution :P
>
> In addition to numerous complications mentioned by Daniel and Les,
> another problem with rotating the BackupPC storage nightly through
> three disks is that it will have to do three times the number of
> full backups, which increases the client impact.
>
> A weekly rotation (equal to the full period) with removable disks makes
> sense for off-site storage.
>
> Craig
>
> -------------------------------------------------------
> This SF.net email is sponsored by: IT Product Guide on ITManagersJournal
> Use IT products in your business? Tell us what you think of them. Give us
> Your Opinions, Get Free ThinkGeek Gift Certificates! Click to find out more
> http://productguide.itmanagersjournal.com/guidepromo.tmpl
> _______________________________________________
> BackupPC-users mailing list
> Bac...@li...
> https://lists.sourceforge.net/lists/listinfo/backuppc-users
> http://backuppc.sourceforge.net/ |
From: Ryan L. <rya...@gm...> - 2004-10-07 20:26:50
|
Wow... thanks to both of you. This is what happens when you do crap at 5 in the morning.

Thanks, Ryan

On Thu, 07 Oct 2004 09:29:08 -0700, Craig Barratt <cba...@us...> wrote:
> Ryan Leonard writes:
>
> > Hey guys, after searching around for this problem, I am out of ideas...
> >
> > I have a backuppc server running on a Debian Linux machine. Most of my
> > backups are samba, but I am now trying to add a tar over ssh server.
> > My config file looks like:
> >
> > $Conf{XferMethod} = 'tar';
> > $Conf{TarClientPath} = '/bin/tar';
> > $Conf{TarShareName} = '/root';
> > $Conf{TarClientCmd} = '$sshPath -q -x -n -l root $host';
> > $Conf{TarFullArgs} = '$fileList+';
> > $Conf{TarIncrArgs} = '--newer=$incrDate+ $fileList+';
> > $Conf{TarClientRestoreCmd} = '$sshPath -q -x -l root $host';
> >
> > And here is what I am getting:
> >
> > 2004-10-07 05:32:20 full backup started for directory /root
> > 2004-10-07 05:32:22 Got fatal error during xfer (Tar exited with error
> > 512 () status)
> > 2004-10-07 05:32:27 Backup aborted (Tar exited with error 512 () status)
> > 2004-10-07 05:32:27 Saved partial dump 0
>
> Look at the XferLOG.bad.z file for the exact command and error.
>
> Your $Conf{TarClientCmd} looks wrong. It just runs ssh, not ssh+tar.
> The default value is:
>
>     $Conf{TarClientCmd} = '$sshPath -q -x -n -l root $host'
>                         . ' $tarPath -c -v -f - -C $shareName+'
>                         . ' --totals';
>
> In Perl "." means concatenation; it's really one big string:
>
>     $Conf{TarClientCmd} = '$sshPath -q -x -n -l root $host $tarPath -c -v -f - -C $shareName+ --totals';
>
> I suspect you just copied the first line, and not the rest.
>
> Craig |
From: Craig B. <cba...@us...> - 2004-10-07 16:50:43
|
DoM writes:

> So it will be more complicated to do this job... and i am trying to find
> simplest solution :P

In addition to numerous complications mentioned by Daniel and Les, another problem with rotating the BackupPC storage nightly through three disks is that it will have to do three times the number of full backups, which increases the client impact.

A weekly rotation (equal to the full period) with removable disks makes sense for off-site storage.

Craig |
From: Lorrin N. <lhn...@ne...> - 2004-10-07 16:47:40
|
On 10/7/2004 9:38 AM, Arnaud THEBAULT wrote:

> Hi,
>
> I am new to BackupPC and I have a question about the backup destination.
> I use the rsyncd way to do backups, so I think that data is "pushed" by the
> client's rsync; is that right?
> Because I would like to store backups on a NAS on my network: is it
> possible to configure BackupPC on the client side to directly send files
> to the NAS without going through the server? (i.e. using a local path on
> the client's side)

I'm pretty sure the answer is no. With rsyncd the backups are still pulled by the server; the client is just running a daemon that facilitates the pull. You'd need to mount the NAS volume on your server and back up to it. Given the way BackupPC stores files, you can't get at them without going through the BackupPC web interface, so this still might not offer the benefit you're looking for.

-Lorrin |
From: Arnaud T. <ath...@jo...> - 2004-10-07 16:38:38
|
Hi,

I am new to BackupPC and I have a question about the backup destination. I use the rsyncd way to do backups, so I think that data is "pushed" by the client's rsync; is that right? Because I would like to store backups on a NAS on my network: is it possible to configure BackupPC on the client side to directly send files to the NAS without going through the server? (i.e. using a local path on the client's side)

Sincerely,

Arnaud THEBAULT |
From: Lorrin N. <lhn...@ne...> - 2004-10-07 16:36:44
|
On 10/7/2004 9:07 AM, Les Mikesell wrote:
> On Thu, 2004-10-07 at 10:52, DoM wrote:
>
>> It looks like a nice idea, but for example what if a backup doesn't finish on time?
>>
>> I mean, for example, if I unmount/mount the 2nd partition in crontab at 4
>> p.m. and a backup is still running?
>>
>> It would be dangerous.
>
> If you can't finish a day's scheduled backups in 24 hours you have
> a problem whether you unmount the partition or not.
>
>> Because I will not deactivate any partition.
>> All 3 HDs stay mounted and active.
>
> OK, how do you expect the web server to be aware of more than
> one partition, or the backup server to be able to restore
> from partitions that aren't in its configuration to back up?
> You should also be aware that backuppc can't do any of its
> magic to link duplicate data across different partitions,
> so unless you RAID them, you won't gain any efficiency by
> putting the 3 drives in the same PC.

I think he was envisioning not that one BackupPC process would manage all that, but rather that multiple copies of BackupPC would run at the same time, presumably served by Apache with separate virtual hosts (perhaps on different ports).

-Lorrin |
From: Craig B. <cba...@us...> - 2004-10-07 16:30:10
|
Ryan Leonard writes:

> Hey guys, after searching around for this problem, I am out of ideas...
>
> I have a backuppc server running on a Debian Linux machine. Most of my
> backups are samba, but I am now trying to add a tar over ssh server.
> My config file looks like:
>
> $Conf{XferMethod} = 'tar';
> $Conf{TarClientPath} = '/bin/tar';
> $Conf{TarShareName} = '/root';
> $Conf{TarClientCmd} = '$sshPath -q -x -n -l root $host';
> $Conf{TarFullArgs} = '$fileList+';
> $Conf{TarIncrArgs} = '--newer=$incrDate+ $fileList+';
> $Conf{TarClientRestoreCmd} = '$sshPath -q -x -l root $host';
>
> And here is what I am getting:
>
> 2004-10-07 05:32:20 full backup started for directory /root
> 2004-10-07 05:32:22 Got fatal error during xfer (Tar exited with error
> 512 () status)
> 2004-10-07 05:32:27 Backup aborted (Tar exited with error 512 () status)
> 2004-10-07 05:32:27 Saved partial dump 0

Look at the XferLOG.bad.z file for the exact command and error.

Your $Conf{TarClientCmd} looks wrong. It just runs ssh, not ssh+tar. The default value is:

    $Conf{TarClientCmd} = '$sshPath -q -x -n -l root $host'
                        . ' $tarPath -c -v -f - -C $shareName+'
                        . ' --totals';

In Perl "." means concatenation; it's really one big string:

    $Conf{TarClientCmd} = '$sshPath -q -x -n -l root $host $tarPath -c -v -f - -C $shareName+ --totals';

I suspect you just copied the first line, and not the rest.

Craig |
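A quick way to see what Craig means is to expand the substitutions by hand. The sketch below (plain shell; the ssh path and hostname are made-up placeholders, not values from this thread) builds both strings, showing that the truncated config only ever runs ssh with no remote command, so no tar stream ever comes back over the pipe:

```shell
# Hypothetical stand-ins for BackupPC's $sshPath, $host, $tarPath,
# and $shareName substitutions -- adjust for a real machine.
sshPath=/usr/bin/ssh
host=client.example.com
tarPath=/bin/tar
shareName=/root

# What the truncated config expands to: ssh alone, no remote command.
short_cmd="$sshPath -q -x -n -l root $host"

# What the full default expands to: ssh runs tar on the client and
# streams the archive back to the server over the pipe.
full_cmd="$sshPath -q -x -n -l root $host $tarPath -c -v -f - -C $shareName+ --totals"

echo "$short_cmd"
echo "$full_cmd"
```

Running the short form by hand just opens a session on the client, which is why the server-side tar reader sees nothing useful and the dump aborts.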
From: Les M. <le...@fu...> - 2004-10-07 16:08:13
|
On Thu, 2004-10-07 at 10:52, DoM wrote:

> It looks like a nice idea, but for example what if a backup doesn't finish on time?
>
> I mean, for example, if I unmount/mount the 2nd partition in crontab at 4
> p.m. and a backup is still running?
>
> It would be dangerous.

If you can't finish a day's scheduled backups in 24 hours you have a problem whether you unmount the partition or not.

> Because I will not deactivate any partition.
> All 3 HDs stay mounted and active.

OK, how do you expect the web server to be aware of more than one partition, or the backup server to be able to restore from partitions that aren't in its configuration to back up? You should also be aware that backuppc can't do any of its magic to link duplicate data across different partitions, so unless you RAID them, you won't gain any efficiency by putting the 3 drives in the same PC.

---
Les Mikesell
le...@fu... |
From: DoM <do...@mi...> - 2004-10-07 15:52:32
|
It looks like a nice idea, but for example what if a backup doesn't finish on time?

I mean, for example, if I unmount/mount the 2nd partition in crontab at 4 p.m. and a backup is still running?

It would be dangerous.

I don't know. I don't have much time in the blackout period, maybe 10 hours. And currently backups run one day for 5 servers and the day after for the other 5, because if I do all 10 backups in 1 day they don't finish on time.

So it will be more complicated to do this job... and I am trying to find the simplest solution :P

> How do you expect to restore files from a partition which is not
> currently active?

Because I will not deactivate any partition. All 3 HDs stay mounted and active.

Les Mikesell wrote:

> On Thu, 2004-10-07 at 09:23, DoM wrote:
>
> > Answer: Step 5 -> Client Setup
> > (Or you could run two completely separate instances of BackupPC, with
> > different data directories, one for WinXX and the other for
> > linux/unix, ...)
> >
> > I don't want to use RAID, neither hardware 1 nor software 1.
> >
> > I tell you: I want a different partition for each day to back up data to.
>
> I'd probably try to find additional PCs to run the alternate copies,
> since each could either be someone's desktop that is idle at night or
> something used and cheap. For the expense/effort you would have
> real redundancy and all copies online at the same time. But, as
> a stab at running multiple different copies at different times on
> the same PC, why not put the entire installation under /opt/backuppc
> and have a cron job shut down backuppc, unmount the current partition
> and mount the new correct partition at a certain time of day, then
> restart backuppc? Or you might keep them all mounted under different
> mount points and adjust a symlink named /opt/backuppc to point to
> the active one.
>
> > It will not be 3 identical backups, and why would it be much harder to
> > recover files?
>
> How do you expect to restore files from a partition which is not
> currently active?
>
> ---
> Les Mikesell
> le...@fu...

--
Thx & Byez
DoM |
From: Les M. <le...@fu...> - 2004-10-07 15:30:28
|
On Thu, 2004-10-07 at 09:23, DoM wrote:

> Answer: Step 5 -> Client Setup
> (Or you could run two completely separate instances of BackupPC, with
> different data directories, one for WinXX and the other for
> linux/unix, ...)
>
> I don't want to use RAID, neither hardware 1 nor software 1.
>
> I tell you: I want a different partition for each day to back up data to.

I'd probably try to find additional PCs to run the alternate copies, since each could either be someone's desktop that is idle at night or something used and cheap. For the expense/effort you would have real redundancy and all copies online at the same time. But, as a stab at running multiple different copies at different times on the same PC, why not put the entire installation under /opt/backuppc and have a cron job shut down backuppc, unmount the current partition and mount the new correct partition at a certain time of day, then restart backuppc? Or you might keep them all mounted under different mount points and adjust a symlink named /opt/backuppc to point to the active one.

> It will not be 3 identical backups, and why would it be much harder to
> recover files?

How do you expect to restore files from a partition which is not currently active?

---
Les Mikesell
le...@fu... |
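For what it's worth, the symlink variant Les describes can be sketched in a few lines of shell. Everything here is an assumption for illustration (the /mnt/backup* mount points, the demo root, the commented init-script path); it demonstrates only the rotation logic, not a production setup:

```shell
#!/bin/sh
set -eu

# Demo root so the sketch can run anywhere; a real setup would operate
# on actual mount points and swing a symlink like /opt/backuppc.
base=${BASE:-/tmp/bpc-rotate-demo}
mkdir -p "$base/mnt/backup1" "$base/mnt/backup2" "$base/mnt/backup3"

# Three-way rotation keyed on the day of the year.  date +%j prints
# leading zeros, so strip them to avoid octal interpretation.
day=$(date +%j)
day=${day#0}; day=${day#0}
idx=$(( day % 3 + 1 ))

# A cron job would stop BackupPC here, e.g.:  /etc/init.d/backuppc stop
ln -sfn "$base/mnt/backup$idx" "$base/backuppc-data"
# ...and restart it afterwards:               /etc/init.d/backuppc start

readlink "$base/backuppc-data"
```

Note Craig's caveat elsewhere in this thread still applies: BackupPC's pooling can't link duplicates across partitions, and a nightly rotation triples the number of full backups, so a weekly rotation of removable disks for off-site storage is the saner use of this trick.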
From: DoM <do...@mi...> - 2004-10-07 14:25:01
|
Answer:

Daniel Pittman wrote:

> On 7 Oct 2004, do...@mi... wrote:
>
> > I would like to know how to set up BackupPC to work with more sessions.
> >
> > My situation is:
> >
> > 10 servers to back up.
> > 1 BackupPC server with 3 200GB HDs.
> >
> > I want the 1st session of BackupPC to back up the 10 servers to the 1st
> > hard disk on Monday, the 2nd session of BackupPC on Tuesday to the 2nd HD,
> > and so on for the third session and third HD.
>
> Question: why?
>
> I can't see any reason why you would want this setup since, at the end
> of the day, it would give you three almost identical BackupPC data
> stores, and would make it much harder to recover files from any of them.
>
> Perhaps you are not sure how to take advantage of the extra storage
> space for a single installation, and should take a look at something
> like Linux software RAID?
>
> Otherwise, maybe you can tell us why you want to achieve this, since it
> is a very unusual situation and there may be a better way to get your
> desired results.
>
> Regards,
>     Daniel

Answer: Step 5 -> Client Setup (Or you could run two completely separate instances of BackupPC, with different data directories, one for WinXX and the other for linux/unix, ...)

I don't want to use RAID, neither hardware 1 nor software 1.

I tell you: I want a different partition for each day to back up data to.

It will not be 3 identical backups, and why would it be much harder to recover files?

--
Thx & Byez
DoM |
From: Torsten S. <Tor...@tu...> - 2004-10-07 14:16:27
|
The xfer log is very talkative about all errors and gives you the command line for your backup as well. What does it say there, and what do you get if you run the command line in a terminal?

Torsten

On 2004-10-07 11:47:34 +0200 Ryan Leonard <rya...@gm...> wrote:
> Hey guys, after searching around for this problem, I am out of ideas...
>
> I have a backuppc server running on a Debian Linux machine. Most of my
> backups are samba, but I am now trying to add a tar over ssh server.
> My config file looks like:
>
> $Conf{XferMethod} = 'tar';
> $Conf{TarClientPath} = '/bin/tar';
> $Conf{TarShareName} = '/root';
> $Conf{TarClientCmd} = '$sshPath -q -x -n -l root $host';
> $Conf{TarFullArgs} = '$fileList+';
> $Conf{TarIncrArgs} = '--newer=$incrDate+ $fileList+';
> $Conf{TarClientRestoreCmd} = '$sshPath -q -x -l root $host';
>
> And here is what I am getting:
>
> 2004-10-07 05:32:20 full backup started for directory /root
> 2004-10-07 05:32:22 Got fatal error during xfer (Tar exited with error
> 512 () status)
> 2004-10-07 05:32:27 Backup aborted (Tar exited with error 512 () status)
> 2004-10-07 05:32:27 Saved partial dump 0
>
> Any ideas? Thanks,
> Ryan |
From: Daniel P. <da...@ri...> - 2004-10-07 13:08:26
|
On 7 Oct 2004, do...@mi... wrote:

> I would like to know how to set up BackupPC to work with more sessions.
>
> My situation is:
>
> 10 servers to back up.
> 1 BackupPC server with 3 200GB HDs.
>
> I want the 1st session of BackupPC to back up the 10 servers to the 1st
> hard disk on Monday, the 2nd session of BackupPC on Tuesday to the 2nd HD,
> and so on for the third session and third HD.

Question: why?

I can't see any reason why you would want this setup since, at the end of the day, it would give you three almost identical BackupPC data stores, and would make it much harder to recover files from any of them.

Perhaps you are not sure how to take advantage of the extra storage space for a single installation, and should take a look at something like Linux software RAID?

Otherwise, maybe you can tell us why you want to achieve this, since it is a very unusual situation and there may be a better way to get your desired results.

Regards,
    Daniel
--
Thus, if the First Amendment means anything in the field, it must allow protests even against the moral code that the standard of the day sets for the community. In other words, literature should not be suppressed merely because it offends the moral code of the censor. -- William O. Douglas, opinion, _Roth v. U.S._, 354 U.S. 476 (1957) |
From: Ryan L. <rya...@gm...> - 2004-10-07 09:47:39
|
Hey guys, after searching around for this problem, I am out of ideas...

I have a backuppc server running on a Debian Linux machine. Most of my backups are samba, but I am now trying to add a tar over ssh server. My config file looks like:

$Conf{XferMethod} = 'tar';
$Conf{TarClientPath} = '/bin/tar';
$Conf{TarShareName} = '/root';
$Conf{TarClientCmd} = '$sshPath -q -x -n -l root $host';
$Conf{TarFullArgs} = '$fileList+';
$Conf{TarIncrArgs} = '--newer=$incrDate+ $fileList+';
$Conf{TarClientRestoreCmd} = '$sshPath -q -x -l root $host';

And here is what I am getting:

2004-10-07 05:32:20 full backup started for directory /root
2004-10-07 05:32:22 Got fatal error during xfer (Tar exited with error 512 () status)
2004-10-07 05:32:27 Backup aborted (Tar exited with error 512 () status)
2004-10-07 05:32:27 Saved partial dump 0

Any ideas? Thanks,
Ryan |
From: DoM <do...@mi...> - 2004-10-07 07:55:03
|
Hi all,

I would like to know how to set up BackupPC to work with more sessions. My situation is:

10 servers to back up.
1 BackupPC server with 3 200GB HDs.

I want the 1st session of BackupPC to back up the 10 servers to the 1st hard disk on Monday, the 2nd session of BackupPC on Tuesday to the 2nd HD, and so on for the third session and third HD.

I don't understand how to tell the BackupPC binary to read an alternate main config.pl, because the only option I saw was -daemon and nothing more.

For the alternate-day policy, I already discussed it with Craig some weeks ago on the mailing list and everything actually works perfectly (Thx Craig :) ).

--
Thx & Byez
DoM |
From: Josh M. <jo...@wo...> - 2004-10-07 01:35:58
|
> Color me silly, but I can't work out how to get BackupPC to generate an
> archive from a script.

Try running BackupPC_tarCreate directly, e.g.:

BackupPC_tarCreate -h <hostname> -n -1 -s <sharename> / > dumpfilename.tar

The "-n -1" means create a tar archive of the latest backup.

Regards, Josh. |
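Josh's command drops straight into a cron script for the weekly offline archive Daniel asks about. The sketch below is only a guess at how one might wrap it: the BackupPC_tarCreate path, the archive directory, and the host names are all placeholders, and it prints the commands instead of executing them so the plumbing can be checked first:

```shell
#!/bin/sh
set -eu

# Assumed locations -- adjust for a real install.  BackupPC_tarCreate
# normally lives under the install's bin/ and must run as the
# backuppc user.
TARCREATE=${TARCREATE:-/usr/share/backuppc/bin/BackupPC_tarCreate}
DEST=${DEST:-/var/archives}
SHARE=${SHARE:-/}
HOSTS="alpha beta"

for h in $HOSTS; do
    out="$DEST/$h-$(date +%Y%m%d).tar"
    # -n -1 selects the most recent backup; the trailing / is the
    # path within the share to archive, as in Josh's example.
    echo "$TARCREATE -h $h -n -1 -s $SHARE / > $out"   # dry run: print only
done
```

Once the printed commands look right, swap the echo for something like `su backuppc -c "..."` and schedule the script from cron (e.g. `0 3 * * 0` for weekly, early Sunday).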
From: Daniel P. <da...@ri...> - 2004-10-07 01:01:54
|
Color me silly, but I can't work out how to get BackupPC to generate an archive from a script.

What I need to achieve, specifically, is to have a scheduled offline backup made on a weekly basis for some of the hosts we use. I am sure that it is possible, just like you can trigger a backup manually, but I can't for the life of me work out quite how.

Can anyone supply the exact commands required to make this work, please?

Regards,
    Daniel
--
I believe that the depths of metaphor and symbol can convey what words cannot, perhaps allowing for resolution of what appears impossible to resolve. -- Suzanne McLeod |
From: Rich D. <rdu...@th...> - 2004-10-06 19:04:42
|
For some years, I've periodically run lifesave.bat to make a copy of the registry and some other critical files. I suspect you could do something similar on a daily basis, which will give backuppc something to grab. See here: http://www.geocities.com/~budallen/batfiles.html

I don't know how well this strategy holds up on newer versions of Windows, however, so it would be wise to do a test.

On Wed, 2004-10-06 at 12:36, Les Mikesell wrote:
> On Wed, 2004-10-06 at 10:40, Lorrin Nelson wrote:
>
> > See if you've even got NTUSER.DAT and USRCLASS.DAT in your backup. I'm
> > pretty sure those files contain the registry and can't be backed up
> > while Windows is running. If you figure out a simple way to back those
> > up let me know. Without restoring the registry you can't restore the
> > system and you have to reinstall all the apps and then use BackupPC just
> > for your data.
>
> I haven't tried this, but it might work to use the Windows NTBACKUP
> program to dump the 'system state' to a file before backuppc runs.
> Then you would install Windows on the system or start from an image
> backup that would not have to be recent, restore from backuppc, then
> do an ntbackup restore from the .bkf file that would have been carried
> along by the backuppc restore. There may be other files that need
> special handling by ntbackup, and there are some contortions you have to
> go through if the target hardware isn't identical. If anyone tries
> this approach, please post the results here.
>
> ---
> Les Mikesell
> le...@fu...

--
Regards,
Rich |