From: <bac...@su...> - 2026-05-05 16:32:54
|
Thanks for the detailed reply. I will follow these steps this weekend and see how far I get.

On Tue, May 5, 2026, at 12:09 PM, G.W. Haywood wrote:

> [full reply trimmed; it appears in full below, dated 2026-05-05 16:10:06] |
|
From: G.W. H. <ba...@ju...> - 2026-05-05 16:19:36
|
Hi there, On Mon, 4 May 2026, Kenneth Porter wrote: > ... > I'd also recommend choosing a filesystem that allows the mount command > to specify an owner and group for the whole filesystem, so you can move > the media to another host without having to chown everything. My recommendation is to choose a filesystem for reliability and stability, especially for a system which runs BackupPC. -- 73, Ged. |
|
From: G.W. H. <ba...@ju...> - 2026-05-05 16:12:35
|
Hi there, On Mon, 4 May 2026, Adam Pribyl wrote: > After installing the backuppc software (or at least copy of > BackupPC_tarExtract) you should be able to do > > BackupPC_tarExtract -host <hostname> -num <backup_number> -path > /path/to/file > extracted_file You can't just install a copy of BackupPC_tarExtract. It needs to use code from the rest of the BackupPC installation. > Extracting files from cpool is extremely painful - Hopefully the description in my previous post on this subject may make it less painful for you in future. -- 73, Ged. |
|
From: G.W. H. <ba...@ju...> - 2026-05-05 16:10:06
|
Hi there,

On Mon, 4 May 2026, bac...@su... wrote:

> The host running backuppc on my network had its root drive fail.

:(

> The actual backup storage was on an external drive which I had
> mounted on /var/lib/backuppc. That external drive is still healthy.

:)

> I remember spending a lot of time getting backuppc to work just
> right and dread starting over.

My early recollections are similar, but I wrote it all in my notebook. It's a habit I developed in my first job - at an explosives research laboratory in 1971. It's a habit I've kept up ever since, and over the years it's sometimes been incredibly useful.

> Since the external drive includes the backup for localhost (the
> former box running backuppc), it should include a backup for
> /etc/backuppc and the config.pl file (I forget which directories I
> set for backup for localhost; hopefully I included /etc).

Hopefully. Hmmm.

> It seems there should be a way I can extract my now-lost
> /etc/backuppc/config.pl and other files in that directory. I am
> hoping I can install backuppc fresh, then grab the /etc/backuppc
> files from the external drive, and swap them in. Is this possible?

Hopefully. If you didn't back up the files, they're probably gone.

You didn't say what version of BackupPC you're using, so I assume V4.

First I suggest that you make a copy of your backup drive on another medium of some sort. Put that copy somewhere safe. Next install a new version of BackupPC the same way that you installed it originally; for example, if you used a distro package, use the same distro package. At this stage don't try to connect the old backup drive (nor the copy) but simply get the BackupPC system working. You can use any old host on the network as a dummy to start making backups. To get you started, set up the configuration to back up one file on it, say every hour or something like that.

Once you have the new BackupPC working, shut it down - so that now the BackupPC script isn't running. Now edit the BackupPC 'hosts' file so that it contains the hosts that you had on the original system (if you can't remember them, the hostnames are the names of directories in the directory .../pc/ of the BackupPC backup data store).

At this point, if you started BackupPC, it could try to start backing up your hosts. Presumably you won't want it to do that, because you'll have individual configurations (for example in /etc/backuppc/pc/ [*]) for each of your hosts, and before starting the backups you'll want to get those configurations from a backup. You can do that as follows:

Replace the 'new' backup store with your 'old' backup store. I'll let you work out how to do that; from what you've written you're obviously capable of doing it. When you've done it, you *could* start BackupPC, but you still don't want to start backups - you still don't have your old configurations. However, you *can* now use the tools provided by BackupPC to get files and directories from the backup data store...

'BackupPC_ls' lets you list the files in any directory from any backup of any share of any host. It gives you an md5sum for each file listed. Here's an example, where I'm logged in as root on the backup server:

8<----------------------------------------------------------------------

# su -c "/usr/local/BackupPC/bin/BackupPC_ls -h alpha -n 2235 -s Config /" backuppc
/:
-rw-r--r-- 0/0 290 2019-09-26 01:24:33 /.fstab (0416cfcbe01474d6f0526ffa5d890813)
drwxr-xr-x 0/0 0 2020-05-07 15:45:30 /.java/
-rw------- 0/0 0 2019-09-26 01:05:24 /.pwd.lock (d41d8cd98f00b204e9800998ecf8427e)
drwxr-x--- 0/1002 0 2026-03-26 13:55:12 /BackupPC/
drwxr-xr-x 0/0 0 2025-02-01 16:36:24 /ImageMagick-6/
...
...

8<----------------------------------------------------------------------

As you can see, BackupPC_ls (normally) has to be run under the ID of the backup user, usually 'backuppc', but of course that's configurable. You give it the host name, in this case 'alpha', the backup number, in this case 2235, the share name, in this case 'Config', and the name of the directory that you want to list, in this case '/'. The output is a bit like that from 'ls -l', but you get md5sums for the files (not for directories). In my example above I've given the full path to the BackupPC_ls utility; it will likely be different in your installation. If you can't remember the share names, they're in the numbered backup directories under the host name directory with 'f' prefixed to them. The file 'backups' in the top level of the host's directory shows you which backup numbers are full and which are incremental.

'BackupPC_zcat' then lets you write the content of any file to stdout. You just give it the md5sum which BackupPC_ls provided for your file. Below I've recovered the file '.fstab' which has the md5sum '0416cf...':

8<----------------------------------------------------------------------

# su -c '/usr/local/BackupPC/bin/BackupPC_zcat 0416cfcbe01474d6f0526ffa5d890813' backuppc
proc /proc proc defaults 0 0
PARTUUID=6c586e13-01 /boot vfat defaults 0 2
PARTUUID=6c586e13-02 / ext4 defaults,noatime 0 1
PARTUUID=6c586e13-03 /var ext4 defaults,noatime 0 1
#

8<----------------------------------------------------------------------

You can redirect stdout to a file of course. That's all there is to it.

> The names on the files on the external drive are all user-unfriendly.

A consequence of BackupPC's way of de-duplicating things is that the files in the backup data store don't have the names that they have on the backed-up storage devices. The name of a file is the md5sum of the file's content, with the added wrinkle that the file in the data store will probably be compressed. The pseudo-filesystems under the BackupPC host name directories index into the data store using md5sums as pointers. The pool files are split into 128 subdirectories, each of which is further split into 128 subdirectories, so that for example my file '.fstab' 0416cfcbe01474d6f0526ffa5d890813 is stored in

.../cpool/04/16/0416cfcbe01474d6f0526ffa5d890813

as you can see:

8<----------------------------------------------------------------------

# ls -l /var/lib/BackupPC/cpool/04/16/0416cfcbe01474d6f0526ffa5d890813
-r--r--r--. 1 backuppc backuppc 117 Apr 21 14:09 /var/lib/BackupPC/cpool/04/16/0416cfcbe01474d6f0526ffa5d890813
#

8<----------------------------------------------------------------------

The file size in the store is only 117 bytes because it's compressed; if I extract it using BackupPC_zcat I can see the actual file size on the real storage medium (it was given by the BackupPC_ls output above):

8<----------------------------------------------------------------------

# su -c '/usr/local/BackupPC/bin/BackupPC_zcat 0416cfcbe01474d6f0526ffa5d890813' backuppc | wc -c
290

8<----------------------------------------------------------------------

HTH - and I hope the longer lines didn't wrap on your mail client.

[*] This would be /etc/BackupPC/pc if you installed from source, so I'm guessing that you used a distro package e.g. Debian to install. In my view, their almost universal insistence on e.g. changing the names and locations of files and directories tends to make things a lot more difficult for everybody.

-- 

73,
Ged. |
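Putting BackupPC_ls and BackupPC_zcat together for the case in hand, a minimal sketch. The host name, backup number, share name and md5sum below are placeholders rather than values from the thread; take the real ones from your own 'hosts' file and from the BackupPC_ls listing.

8<----------------------------------------------------------------------

# List the /etc share of the old localhost backup to find config.pl and its md5sum
# (adjust the tool path, host, backup number and share to match your installation):
su -c "/usr/local/BackupPC/bin/BackupPC_ls -h localhost -n 100 -s /etc /backuppc" backuppc

# Then write the file back out to disk by redirecting BackupPC_zcat's stdout
# (the md5sum below is a placeholder; use the one printed by BackupPC_ls):
su -c "/usr/local/BackupPC/bin/BackupPC_zcat 0123456789abcdef0123456789abcdef" backuppc > /etc/backuppc/config.pl

8<----------------------------------------------------------------------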
|
From: Kenneth P. <sh...@se...> - 2026-05-04 19:26:26
|
This is why you should regularly save /etc/backuppc to a tarball on the backup media, or mount that directory from the backup media. I'd also recommend choosing a filesystem that allows the mount command to specify an owner and group for the whole filesystem, so you can move the media to another host without having to chown everything. |
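One simple way to follow that advice is a root cron entry that drops a dated tarball of the configuration onto the backup media. This is only a sketch with illustrative paths; it assumes the backup media is mounted at /var/lib/backuppc and the configuration lives in /etc/backuppc.

# /etc/cron.d/backuppc-config - daily at 05:00, keep a dated copy of the config
# on the backup media itself (the % must be escaped inside a crontab entry).
0 5 * * * root tar -czf /var/lib/backuppc/etc-backuppc-$(date +\%F).tar.gz /etc/backuppc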
|
From: Adam P. <pr...@lo...> - 2026-05-04 13:47:27
|
After installing the backuppc software (or at least copy of BackupPC_tarExtract) you should be able to do BackupPC_tarExtract -host <hostname> -num <backup_number> -path /path/to/file > extracted_file Extracting files from cpool is extremely painful - you have to find a full backup from "backups" file for your host, then hash in pc/ directory, then find the hash in cpool, then extracting the zlib compressed file (with like pigz)... and maybe you can get something useful. For backuppc 4+ this is even worse... On Mon, 4 May 2026, bac...@su... wrote: > > The host running backuppc on my network had its root drive fail. > > The actual backup storage was on an external drive which I had mounted on /var/lib/backuppc > > That external drive is still healthy. I remember spending a lot of time getting backuppc to work just right and dread starting over. Since the external drive includes the backup for localhost (the former box running backuppc), it should include a backup for /etc/backuppc and the config.pl file (I forget which directories I set for backup for localhost; hopefully I included /etc). > > It seems there should be a way I can extract my now-lost /etc/backuppc/config.pl and other files in that directory. I am hoping I can install backuppc fresh, then grab the /etc/backuppc files from the external drive, and swap them in. Is this possible? > > The names on the files on the external drive are all user-unfriendly. > |
|
From: <bac...@su...> - 2026-05-04 12:34:09
|
The host running backuppc on my network had its root drive fail. The actual backup storage was on an external drive which I had mounted on /var/lib/backuppc. That external drive is still healthy. I remember spending a lot of time getting backuppc to work just right and dread starting over.

Since the external drive includes the backup for localhost (the former box running backuppc), it should include a backup for /etc/backuppc and the config.pl file (I forget which directories I set for backup for localhost; hopefully I included /etc).

It seems there should be a way I can extract my now-lost /etc/backuppc/config.pl and other files in that directory. I am hoping I can install backuppc fresh, then grab the /etc/backuppc files from the external drive, and swap them in. Is this possible?

The names on the files on the external drive are all user-unfriendly. |
|
From: G.W. H. <ba...@ju...> - 2026-04-24 15:55:34
|
Hi there,

Quoting an article here:

https://www.openwall.com/lists/oss-security/2026/04/16/2

SentinelOne claims in an AI-generated advisory here:

https://www.sentinelone.com/vulnerability-database/cve-2026-41035/

that there is a vulnerability in rsync from versions 3.0.1 to 3.4.1 which is (again, claimed to be) exploitable under certain circumstances. The currently released version of rsync-bpc is based on rsync 3.1.3 and could therefore be expected to suffer from this vulnerability.

There is indeed an issue, and the rsync authors have committed a fix for it here:

https://github.com/RsyncProject/rsync/commit/bb0a8118c2d2ab01140bac5e4e327e5e1ef90c9c

The authors state that this fault is not exploitable. By default the configuration of rsync-bpc does not use the '--xattrs' option which the AI thinks allows an exploit. In any case, I've made the same fix in the rsync-bpc code currently in development and it will be released when rsync-bpc 3.4.1.0rc1 comes out (which I hope will be soon :).

For those of you who were wondering, at the moment I'm working on an occasional excessive memory usage issue. When that's fixed, rsync-bpc 3.4.1.0rc1 will be ready for release. At the moment I see no need to rush it.

-- 

73,
Ged. |
|
From: G.W. H. <ba...@ju...> - 2026-04-07 15:17:09
|
Hello again,

On Tue, 7 Apr 2026, Emmett Culley wrote:

> On 4/6/26 7:26 AM, G.W. Haywood wrote:
> > On Mon, 6 Apr 2026, Emmett Culley wrote:
> > > ...
> > > ... can only restore one file into an existing directory... both
> > > rsyncd and rsync methods and neither one can restore multiple files.
> > >
> > > ... google ... nothing ...
> > >
> > > Any suggestion would be appreciated.
> >
> > The place to start is probably the BackupPC documentation:
> >
> > https://backuppc.github.io/backuppc/BackupPC.html
> >
> > especially this part:
> >
> > 8<----------------------------------------------------------------------
> >
> >   * Flexible restore options. Single files can be downloaded from any
> >     backup directly from the CGI interface. Zip or Tar archives for
> >     selected files or directories from any backup can also be downloaded
> >     from the CGI interface. Finally, direct restore to the client
> >     machine (using smb or tar) for selected files or directories is also
> >     supported from the CGI interface.
> >
> > 8<----------------------------------------------------------------------
> >
> > It looks to me like things are behaving as documented. ...
> ...
> This used to work flawlessly. Select a directory to restore, click
> on the restore buttons, files are restored on the host as expected.

Seems like the goal posts just moved - quite a distance. Your original question was "Why doesn't this work?" Your question now seems to have changed to "Why did something which used to work suddenly stop working?" I'm far from convinced that it did, but let's look into it.

When I first responded I tried *really* hard not to do the ESR thing and sound condescending or combative. I went against my instincts, which were to ask you exactly what software you're using and how you configured it. I wanted to ask you for operating system details and software version numbers, full copies of your configuration files and relevant extracts from your logs. All that does tend to sound, well, not very welcoming, but I'm asking now because there's no better way.

> What changed?

If something changed, it was you who changed it. You might need some help in figuring out how the change came about, and that's what we're all here for. First you need to tell us exactly what you've done, as until we have full information we can only guess. Right now my best guesses, if something really did change, are that either you've upgraded some software or you changed some configuration. We need you to tell us what you did.

> Why would they support restore via SMB but not rsync? Makes no sense
> to me.

When a network share is mounted over SMB on a BackupPC host, the files can be copied from the remote share to the local (BackupPC) host using utilities like cp and (plain) tar. Files can be copied from the local host to a remote share by the same means. But without help, utilities like cp and tar can't copy remote files directly; they don't know how. That's why 'rcp' and 'rsh' came about, and later 'rsync'. The waters unfortunately are already getting a bit muddy, because 'tar' knows how to cheat a little and it can use rsh to read/write remote tar archives but not to read/write remote directories - that's where rsync comes into its own.

If you use rsync over ssh, the commands you give effectively create a transparent network connection between the remote host and the local host *purely for that one connection*, so that effectively you're doing a *local* copy. The ssh connection just hides the fact that the data, instead of coming from a local file store, is coming from a remote one.

SMB does such remote access stuff for an entire remote directory, and once mounted it keeps it available for as long as you like, so you can use ordinary operating system tools like 'cp' and 'cat' on the files in that directory. That's a completely new level of (persistent) remote access, not like the z-over-ssh thing (where z can be almost anything; for example you can run 'cat' over ssh, or run an 'X' server over ssh, although there it just got a lot more complicated). NFS has a lot of the characteristics of SMB. These aren't always characteristics that you'd want, but that's another story. Suffice it to say that you can copy to/from NFS mounts in much the same way that you can with SMB (if, as always, you have things like permissions set up properly).

But if you copy using what I'll call 'plain' rsync - that is, rsync which is *not* running over ssh - then you're not doing the same thing at all. You're asking a bit of code which is network aware to do the data transfer to and from the remote host for you. The rsync utility connects to a remote server and asks for permission for a connection. Simple operating system utilities aren't involved; they don't know how to do it. You can tell rsync to go fetch a whole bunch of files from a remote host, or write them and the entire directory hierarchy to the remote host. *Two* rsync programs run. One of them is on the remote, and of course is controlled by the remote's administrator. These two programs work together, transferring data from one to the other, to do the copying that needs to be done. When you use 'cp', for example, only one program is running. It only knows how to talk to the local OS.

I glossed over a couple of things, but does that make more sense?

In your posts of last July (Github issue #545) you mentioned that depending on the host being backed up you may use rsync over ssh or you may use rsyncd. Does that have a bearing on this?

-- 

73,
Ged. |
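To make the distinction concrete, here is a small illustration of the same single-file copy done three ways. The host, share and file names are made up for the example; the first two approaches let ordinary local tools or a transient encrypted connection do the work, while the third needs an rsync daemon listening on the client.

# 1. Over an SMB mount - once mounted, plain local tools such as cp work:
mkdir -p /mnt/client
mount -t cifs //client/share /mnt/client -o ro,guest
cp -a /mnt/client/etc/fstab /tmp/fstab.client

# 2. rsync over ssh - one encrypted connection, created just for this transfer:
rsync -a client:/etc/fstab /tmp/fstab.client

# 3. 'Plain' rsync talking to an rsyncd daemon (port 873, unencrypted):
rsync -a rsync://client/share/etc/fstab /tmp/fstab.client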
|
From: Emmett C. <lst...@we...> - 2026-04-07 13:26:06
|
On 4/7/26 5:16 AM, jbk wrote: > On 4/5/26 12:02 PM, Emmett Culley wrote: >> Recently I attempted to restore a directory and all it's files to a host. All that was restored was the top directory. None of the files or sub-directories were restored. >> >> The log indicated that zero files were restored. >> >> If I attempt to restore a single file, the files gets restored and the log tells me one file was restored. >> >> I am able to restore to a tar file then restore the files in a directory, so I know the files are in the archive. >> >> It doesn't matter what host I try to restore, I can only restore one file into an existing directory. I use both rsyncd and rsync methods and neither one can restore multiple files. >> >> I search via google and can find nothing that clues me into where to look. >> >> Any suggestion would be appreciated. >> >> Emmett >> >> >> >> >> > Sorry not to think of this earlier but it is important to understand. If your definition of a share is just "/", the root file system then you will have no backups of files and folders in shares that are on a separate partition such as /home if it is a mountpoint. It will backkup the folder name that defines the mount but will not descend. You have to define /home as a separate share to backup. This is done to avoid the case where the same network share is mounted on multiple machines or in multiple user directories thus creating unnecessary churn backing up the same directory multiple times. > In all cases the files I need are backed up, and so are in the archive. I only back up machine specific files using defined shares to get only what I need to restore. Emmett |
|
From: Emmett C. <lst...@we...> - 2026-04-07 13:20:57
|
On 4/6/26 6:26 PM, jbk wrote: > On 4/5/26 12:02 PM, Emmett Culley wrote: >> Recently I attempted to restore a directory and all it's files to a host. All that was restored was the top directory. None of the files or sub-directories were restored. >> >> The log indicated that zero files were restored. >> >> If I attempt to restore a single file, the files gets restored and the log tells me one file was restored. >> >> I am able to restore to a tar file then restore the files in a directory, so I know the files are in the archive. >> >> It doesn't matter what host I try to restore, I can only restore one file into an existing directory. I use both rsyncd and rsync methods and neither one can restore multiple files. >> >> I search via google and can find nothing that clues me into where to look. >> >> Any suggestion would be appreciated. >> >> Emmett >> >> >> > I only get digests of replies to the list so you may have gotten some help already. It would be good to know what version of BackupPC you are using. I'm using 4.4.0 on a rocky9.7 server. I just did a test restore of a month old user directory of my laptop to an alternate path on this same machine so as not to mix existing data. In the CGI I browsed to an existing backup # for the machine and found the top level share /home which is a mountpoint and clicked on the square to reveal the user directories below that with the option to select indiividual directories or all. In this case there is only one directory because i'm the only user, but I selected all then clicked Restore. This brought up another dialog menu that allowed me to either restore directly back to the source destination or pick an alternate path which I chose to do. Following the prompts I successfully restored the user directory to the sub-directory as per the log output here: > ....................................................................... > > Trimming / from remoteDir -> /data/jbkdat/tmp > Wrote source file list to /var/lib/BackupPC//pc/lt14/.rsyncFilesFrom69427: / > Running: /usr/bin/rsync_bpc --bpc-top-dir /var/lib/BackupPC/ --bpc-host-name > lt14 --bpc-share-name /home --bpc-bkup-num 48 --bpc-bkup-comp 3 --bpc-bkup-merge > 48/3/4 --bpc-log-level 1 --bpc-attrib-new -e /usr/bin/ssh\ -l\ backuppc > --rsync-path=/usr/bin/sudo\ /usr/bin/rsync --recursive --super > --protect-args --numeric-ids --perms --owner --group -D --times --links > --hard-links --delete --partial --log-format=log:\ %o\ %i\ %B\ %8U,%8G\ %9l\ %f%L > --stats --files-from=/var/lib/BackupPC//pc/lt14/.rsyncFilesFrom69427 / lt14:/data/jbkdat/tmp > This is the rsync child about to exec /usr/bin/rsync_bpc > [ skipped 5526 lines ] > > Number of files: 5,526 (reg: 4,515, dir: 996, link: 13, special: 2) > Number of created files: 5,525 (reg: 4,515, dir: 995, link: 13, special: 2) > Number of deleted files: 0 > Number of regular files transferred: 4,515 > Total file size: 1,532,213,410 bytes > > ............................................................................................. > > To me it seems to work as intended. I would never to a full restore to the source directory because there is too great a chance of applications getting out of sync. Where I have done full restores is when I move users to another machine and it has worked well as long as UID are correctly created. > You can't restore a top level share in one step but as demonstrated you can restore whole subdirectoris below them at least in this small test. > > > -- > Jim KR Thanks Jim That has always been my experience as well. 
I am on version 4.4.0. I upgraded to 4.x about three years ago and all was working until recently. I am a developer and sometimes need to restore a client's site (usually WordPress) when they mess it up. This has always worked until now. One response I got suggested that restore shouldn't work with rsync. Your response is a relief. I'll keep looking. Emmett |
|
From: jbk <jb...@kj...> - 2026-04-07 12:17:11
|
On 4/5/26 12:02 PM, Emmett Culley wrote:

> Recently I attempted to restore a directory and all its files to a host.
> All that was restored was the top directory. None of the files or
> sub-directories were restored.
>
> The log indicated that zero files were restored.
>
> If I attempt to restore a single file, the file gets restored and the
> log tells me one file was restored.
>
> I am able to restore to a tar file then restore the files in a
> directory, so I know the files are in the archive.
>
> It doesn't matter what host I try to restore, I can only restore one
> file into an existing directory. I use both rsyncd and rsync methods
> and neither one can restore multiple files.
>
> I searched via Google and can find nothing that clues me into where to
> look.
>
> Any suggestion would be appreciated.
>
> Emmett

Sorry not to think of this earlier, but it is important to understand. If your definition of a share is just "/", the root file system, then you will have no backups of files and folders in shares that are on a separate partition, such as /home if it is a mountpoint. It will back up the folder name that defines the mount but will not descend. You have to define /home as a separate share to back it up. This is done to avoid the case where the same network share is mounted on multiple machines or in multiple user directories, thus creating unnecessary churn backing up the same directory multiple times.

-- 
Jim KR |
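For anyone configuring this, a hedged sketch of what that looks like in a per-host configuration file; the file path, host name and share list are illustrative rather than taken from the thread. Each mountpoint you want descended into is listed as its own share.

# /etc/backuppc/pc/lt14.pl - example per-host file; list every mounted
# filesystem that should be backed up as a separate share.
$Conf{XferMethod}     = 'rsync';
$Conf{RsyncShareName} = ['/', '/home', '/data'];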
|
From: jbk <jb...@kj...> - 2026-04-07 01:42:12
|
On 4/5/26 12:02 PM, Emmett Culley wrote:

> Recently I attempted to restore a directory and all its files to a host.
> All that was restored was the top directory. None of the files or
> sub-directories were restored.
>
> The log indicated that zero files were restored.
>
> If I attempt to restore a single file, the file gets restored and the
> log tells me one file was restored.
>
> I am able to restore to a tar file then restore the files in a
> directory, so I know the files are in the archive.
>
> It doesn't matter what host I try to restore, I can only restore one
> file into an existing directory. I use both rsyncd and rsync methods
> and neither one can restore multiple files.
>
> I searched via Google and can find nothing that clues me into where to
> look.
>
> Any suggestion would be appreciated.
>
> Emmett

I only get digests of replies to the list, so you may have gotten some help already. It would be good to know what version of BackupPC you are using. I'm using 4.4.0 on a rocky9.7 server.

I just did a test restore of a month-old user directory of my laptop to an alternate path on this same machine so as not to mix existing data. In the CGI I browsed to an existing backup # for the machine and found the top-level share /home, which is a mountpoint, and clicked on the square to reveal the user directories below that, with the option to select individual directories or all. In this case there is only one directory because I'm the only user, but I selected all then clicked Restore. This brought up another dialog menu that allowed me to either restore directly back to the source destination or pick an alternate path, which I chose to do. Following the prompts I successfully restored the user directory to the sub-directory, as per the log output here:

.......................................................................

Trimming / from remoteDir -> /data/jbkdat/tmp
Wrote source file list to /var/lib/BackupPC//pc/lt14/.rsyncFilesFrom69427: /
Running: /usr/bin/rsync_bpc --bpc-top-dir /var/lib/BackupPC/ --bpc-host-name lt14 --bpc-share-name /home --bpc-bkup-num 48 --bpc-bkup-comp 3 --bpc-bkup-merge 48/3/4 --bpc-log-level 1 --bpc-attrib-new -e /usr/bin/ssh\ -l\ backuppc --rsync-path=/usr/bin/sudo\ /usr/bin/rsync --recursive --super --protect-args --numeric-ids --perms --owner --group -D --times --links --hard-links --delete --partial --log-format=log:\ %o\ %i\ %B\ %8U,%8G\ %9l\ %f%L --stats --files-from=/var/lib/BackupPC//pc/lt14/.rsyncFilesFrom69427 / lt14:/data/jbkdat/tmp
This is the rsync child about to exec /usr/bin/rsync_bpc
[ skipped 5526 lines ]
Number of files: 5,526 (reg: 4,515, dir: 996, link: 13, special: 2)
Number of created files: 5,525 (reg: 4,515, dir: 995, link: 13, special: 2)
Number of deleted files: 0
Number of regular files transferred: 4,515
Total file size: 1,532,213,410 bytes

.............................................................................................

To me it seems to work as intended. I would never do a full restore to the source directory because there is too great a chance of applications getting out of sync. Where I have done full restores is when I move users to another machine, and it has worked well as long as UIDs are correctly created. You can't restore a top-level share in one step, but as demonstrated you can restore whole subdirectories below them, at least in this small test.

-- 
Jim KR |
|
From: Emmett C. <lst...@we...> - 2026-04-06 14:37:43
|
On 4/6/26 7:26 AM, G.W. Haywood wrote:

> Hi there,
>
> On Mon, 6 Apr 2026, Emmett Culley wrote:
>
> > ...
> > If I attempt to restore a single file, the file gets restored and
> > the log tells me one file was restored.
>
> :)
>
> > I am able to restore to a tar file then restore the files in a
> > directory, so I know the files are in the archive.
>
> :)
>
> > It doesn't matter what host I try to restore, I can only restore one
> > file into an existing directory. I use both rsyncd and rsync
> > methods and neither one can restore multiple files.
> >
> > ... google ... nothing ...
> >
> > Any suggestion would be appreciated.
>
> The place to start is probably the BackupPC documentation:
>
> https://backuppc.github.io/backuppc/BackupPC.html
>
> especially this part:
>
> 8<----------------------------------------------------------------------
>
>   * Flexible restore options. Single files can be downloaded from any
>     backup directly from the CGI interface. Zip or Tar archives for
>     selected files or directories from any backup can also be downloaded
>     from the CGI interface. Finally, direct restore to the client
>     machine (using smb or tar) for selected files or directories is also
>     supported from the CGI interface.
>
> 8<----------------------------------------------------------------------
>
> It looks to me like things are behaving as documented. It says that
> you can restore directories using SMB or tar, but it doesn't say that
> you can do it using rsync/rsyncd. My personal feeling is that in any
> case rather than trying to restore directly from the Web interface, to
> avoid nasty surprises I'd *always* grab an archive and save it in some
> temporary directory. Then I could extract whatever files I needed and
> put them (carefully) wherever they needed to be, and then check that
> the ownerships, permissions etc. are as needed.

This used to work flawlessly. Select a directory to restore, click on the restore buttons, files are restored on the host as expected. What changed?

I will read the docs, but I doubt they will explain why restore from the web UI no longer works as it once did. Why would they support restore via SMB but not rsync? Makes no sense to me.

Emmett |
|
From: G.W. H. <ba...@ju...> - 2026-04-06 14:26:54
|
Hi there,

On Mon, 6 Apr 2026, Emmett Culley wrote:

> ...
> If I attempt to restore a single file, the file gets restored and
> the log tells me one file was restored.

:)

> I am able to restore to a tar file then restore the files in a
> directory, so I know the files are in the archive.

:)

> It doesn't matter what host I try to restore, I can only restore one
> file into an existing directory. I use both rsyncd and rsync
> methods and neither one can restore multiple files.
>
> ... google ... nothing ...
>
> Any suggestion would be appreciated.

The place to start is probably the BackupPC documentation:

https://backuppc.github.io/backuppc/BackupPC.html

especially this part:

8<----------------------------------------------------------------------

  * Flexible restore options. Single files can be downloaded from any
    backup directly from the CGI interface. Zip or Tar archives for
    selected files or directories from any backup can also be downloaded
    from the CGI interface. Finally, direct restore to the client
    machine (using smb or tar) for selected files or directories is also
    supported from the CGI interface.

8<----------------------------------------------------------------------

It looks to me like things are behaving as documented. It says that you can restore directories using SMB or tar, but it doesn't say that you can do it using rsync/rsyncd. My personal feeling is that in any case, rather than trying to restore directly from the Web interface, to avoid nasty surprises I'd *always* grab an archive and save it in some temporary directory. Then I could extract whatever files I needed and put them (carefully) wherever they needed to be, and then check that the ownerships, permissions etc. are as needed.

-- 

73,
Ged. |
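The same "grab an archive first" approach also works from the command line using BackupPC_tarCreate. This is only a rough sketch; the tool path, host name, backup number, share name and directory are illustrative and need adjusting to the local installation.

# -n -1 selects the most recent backup (an explicit backup number works too);
# the output is a tar stream on stdout, so redirect it to a staging file.
su -c "/usr/local/BackupPC/bin/BackupPC_tarCreate -h somehost -n -1 -s /home /emmett/site" backuppc > /tmp/restore.tar

# Unpack into a temporary directory, then move files into place by hand
# after checking ownerships and permissions.
mkdir -p /tmp/restore-staging && tar -xf /tmp/restore.tar -C /tmp/restore-staging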
|
From: Emmett C. <lst...@we...> - 2026-04-05 16:20:30
|
Recently I attempted to restore a directory and all its files to a host. All that was restored was the top directory. None of the files or sub-directories were restored.

The log indicated that zero files were restored.

If I attempt to restore a single file, the file gets restored and the log tells me one file was restored.

I am able to restore to a tar file then restore the files in a directory, so I know the files are in the archive.

It doesn't matter what host I try to restore, I can only restore one file into an existing directory. I use both rsyncd and rsync methods and neither one can restore multiple files.

I searched via Google and can find nothing that clues me into where to look.

Any suggestion would be appreciated.

Emmett |
|
From: G.W. H. <ba...@ju...> - 2026-03-23 14:43:53
|
Hi there,

On Sun, 22 Mar 2026, George King wrote:

> I'm not sure if it's considered a good approach compared with the
> other approaches mentioned but I find DeltaCopy Server very easy and
> effective to use on my Windows box. Rsyncd on BackupPC.

Good point. Some observations:

1. For a number of reasons, as far as BackupPC is concerned, running rsync on Windows clients is far superior to using what effectively amount to kludges like SMB. But it *does* mean that you have to load software onto the Windows clients, which you don't have to do if you use smbclient and you can accept some (considerable) tradeoffs, for example the very dodgy file excludes in smbclient:

https://github.com/backuppc/backuppc/issues/252

----------------------------------------------------------------------

2. As I've never even downloaded DeltaCopy I don't feel qualified to comment on its usability or effectiveness. Basically I gather that it is versions (I don't know which versions, and that might be important) of Cygwin and rsync packaged for easy installation on Windows. These tools have commonly been installed by people here on the mailing list for backing up Windows boxes and they seem to do the job. ISTR using them decades ago, most likely Cygwin version 1.something, but I can't remember where nor why. I can't remember any big problems with them, so I guess they worked well enough at the time.

----------------------------------------------------------------------

3. I quickly looked through the DeltaCopy documentation just now:

http://www.aboutmyip.com/files/DeltaCopyManual.pdf

On page 7 of the PDF manual there's a suggestion that you may need to open port 873 on your firewall if you're backing up over the Internet. This recommendation alone makes me question the use of anything from the DeltaCopy stable. Nobody should ever do that because (a) it opens up rsync to the world, which is a security nightmare you really don't want to experience, see for example CVE-2024-12084, and (b) the data transfers over the Internet would be plain text, which is of course another security nightmare, although possibly not as serious as some criminal getting remote code execution on your rsync server.

If you need backups to traverse the Internet, some form of encryption is in my view essential. My preference is for a VPN such as OpenVPN (which I use routinely for remote backups and which provides much more than just a backup path), but you can for example use rsync over ssh by setting it up in the BackupPC configuration. The encryption overhead may be significant. Using a VPN gives you the flexibility to offload that work to machines other than the backup server and its clients if it turns out to be necessary to make the backups proceed more quickly. It could be the difference between having a backup and not having one.

----------------------------------------------------------------------

4. The documentation gives me little idea which versions of Windows are supported by DeltaCopy. The page at

http://www.aboutmyip.com/AboutMyXApp/DeltaCopy.jsp

gives under "System Requirements":

* XP, 2000, 2003, 2008, Vista and Windows 7. We have not tested DeltaCopy on Win9x.
* 10 MB hard disk
* 64 MB ram
* 1 GHz processor or better

which looks like it could use an update. It doesn't say, for example, that Cygwin dropped support for XP in 2015:

https://sourceware.org/pipermail/cygwin-announce/2015-August/006392.html

that Cygwin dropped 32-bit support altogether in 2022:

https://cygwin.com/pipermail/cygwin/2022-November/252542.html

nor that Cygwin support of Windows 7, Windows 8, Server 2008 R2 and Server 2012 was dropped in version 3.5 early in 2024:

https://cygwin.com/pipermail/cygwin-announce/2024-February/011524.html

----------------------------------------------------------------------

5. One of the issues people come up against with Cygwin/rsync is that backing up open files can be problematic because Windows restricts access to them. If it's essential to back up open files then people usually get around this using "shadow copies" of the data, see e.g.

http://www.michaelstowe.com/backuppc/

which links to

http://www.goodjobsucking.com/?p=62

giving a very clear account of one implementation. This seems to be a problem that has been fairly well put to bed.

----------------------------------------------------------------------

Other observations invited and very welcome. HTH

-- 

73,
Ged. |
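For completeness, a sketch of what "setting it up in the BackupPC configuration" might look like for rsync over ssh. The values are illustrative only and assume a v4 server talking to a client that has an ssh-reachable rsync; adjust the login and paths to your own setup.

# In config.pl or a per-host pc/<host>.pl file:
$Conf{XferMethod}      = 'rsync';
$Conf{RsyncClientPath} = '/usr/bin/rsync';               # rsync binary on the client
$Conf{RsyncSshArgs}    = ['-e', '$sshPath -l backuppc']; # run rsync over ssh as this user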
|
From: George K. <ge...@ki...> - 2026-03-21 19:38:09
|
I'm not sure if it's considered a good approach compared with the other approaches mentioned but I find DeltaCopy Server very easy and effective to use on my Windows box. Rsyncd on BackupPC. George On 21/03/2026 15:27, Dan Johansson via BackupPC-users wrote: > Hello Falko and Ged, > > > Thanks for your feedback, that was the info I was looking for. > I can now try to configure my backuppc to backup my sole wintendo > laptop as well. π > Lets see how it goes... > > > Regards, > > Dan > > > On 18-03-2026 18:32, Falko Trojahn wrote: >> Hello Dan, >> >> Dan Johansson via BackupPC-users schrieb am 18.03.26 um 16:02: >> >>> Some time ago, someone posted a script (or part thereof) for backing >>> up a Windows 11 host with BackupPC using native Windows tools. >>> >>> I can not find the post/script among my emails π >>> >>> Could someone please tell me where I can find those scripts? >>> >> >> may be you mean this? >> >> >> > That's what I did more than five years ago, too - but we didn't use >> > samba client for Windows backups with BackupPC, but this: >> > >> > https://www.michaelstowe.com/backuppc/ >> > >> > and I remember it working very well. Especially that it did >> > use shadow copies. I think we used Win10 these days. >> > >> > But later we switched to Urbackup client. >> > >> > If you don't have recent Windows versions at hand ... >> > Usually it is possible to get some Windows developer edition VMs. >> > Ah, we did use this: https://github.com/xdissent/ievms/ >> > >> > Maybe some newer fixes are in the forks or pull requests like: >> > https://github.com/xdissent/ievms/pull/343 >> > >> > Just my 2Β’ >> > Falko >> >> Link to the archive is here: >> https://sourceforge.net/p/backuppc/mailman/backuppc-users/?viewmonth=202509&viewday=28 >> >> >> Another thread about Win 11 Backup was here: >> https://sourceforge.net/p/backuppc/mailman/backuppc-users/thread/DM6PR20MB344338A189BF8697A04E4CC8DD242%40DM6PR20MB3443.namprd20.prod.outlook.com/#msg58747558 >> >> >> Best regards >> Falko >> >> |
|
From: Dan J. <bac...@dm...> - 2026-03-21 15:27:32
|
Hello Falko and Ged, Thanks for your feedback, that was the info I was looking for. I can now try to configure my backuppc to backup my sole wintendo laptop as well. π Lets see how it goes... Regards, Dan On 18-03-2026 18:32, Falko Trojahn wrote: > Hello Dan, > > Dan Johansson via BackupPC-users schrieb am 18.03.26 um 16:02: > >> Some time ago, someone posted a script (or part thereof) for backing up a Windows 11 host with BackupPC using native Windows tools. >> >> I can not find the post/script among my emails π >> >> Could someone please tell me where I can find those scripts? >> > > may be you mean this? > > > > That's what I did more than five years ago, too - but we didn't use > > samba client for Windows backups with BackupPC, but this: > > > > https://www.michaelstowe.com/backuppc/ > > > > and I remember it working very well. Especially that it did > > use shadow copies. I think we used Win10 these days. > > > > But later we switched to Urbackup client. > > > > If you don't have recent Windows versions at hand ... > > Usually it is possible to get some Windows developer edition VMs. > > Ah, we did use this: https://github.com/xdissent/ievms/ > > > > Maybe some newer fixes are in the forks or pull requests like: > > https://github.com/xdissent/ievms/pull/343 > > > > Just my 2Β’ > > Falko > > Link to the archive is here: > https://sourceforge.net/p/backuppc/mailman/backuppc-users/?viewmonth=202509&viewday=28 > > Another thread about Win 11 Backup was here: > https://sourceforge.net/p/backuppc/mailman/backuppc-users/thread/DM6PR20MB344338A189BF8697A04E4CC8DD242%40DM6PR20MB3443.namprd20.prod.outlook.com/#msg58747558 > > Best regards > Falko > > -- Dan Johansson, *************************************************** This message is printed on 100% recycled electrons! *************************************************** |
|
From: G.W. H. <ba...@ju...> - 2026-03-19 13:33:29
|
Hi there,

On Thu, 19 Mar 2026, Dan Johansson wrote:

> Some time ago, someone posted a script (or part thereof) for backing
> up a Windows 11 host with BackupPC using native Windows tools.
>
> I can not find the post/script among my emails.
>
> Could someone please tell me where I can find those scripts?

I'm not sure that you've given us enough information to identify any particular post, but in addition to the threads to which others have pointed (from March 2024 and September 2025), in July 2025 there was:

https://sourceforge.net/p/backuppc/mailman/message/59205241/

In addition there are a (very) few articles in the BackupPC Wiki at

https://github.com/backuppc/backuppc/wiki

for example

https://github.com/backuppc/backuppc/wiki/How-to-backup-a-Windows-machine-using-rsync-over-ssh-using-ssh-keys-for-authentication

If you find ways of backing up Win 11 which are easier and/or better than anything in the documentation so far mentioned in this thread, I'd be grateful if you could either add something to the Wiki or drop us a line on the list to let us know what information could usefully be added. I'd like to think that the Wiki will eventually become the go-to resource for things like this.

-- 

73,
Ged. |
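The wiki page linked above covers the key-based setup in detail; its general shape, in a rough sketch with illustrative user names, host names and paths (the backuppc user's home directory varies between installations), is:

# On the BackupPC server, as the backup user, create a key with no passphrase:
sudo -u backuppc ssh-keygen -t ed25519 -N '' -f /var/lib/backuppc/.ssh/id_ed25519

# Install the public key on the Windows client's OpenSSH server
# (administrators_authorized_keys or the account's authorized_keys), then test
# that a non-interactive login can reach rsync on the client:
sudo -u backuppc ssh user@winclient rsync --version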
|
From: Paul L. <pau...@gm...> - 2026-03-18 19:52:29
|
Exactly what I did. Worked without any issues. Paul On 18/03/2026 17:46, Falko Trojahn via BackupPC-users wrote: > Hello Matthias, > > Matthias--- via BackupPC-users schrieb am 09.03.26 um 18:48: >> chown -R backuppc:backuppc /var/log/backuppc was the missing linkπ οΈ >> >> Sorry for asking a few minutes too early. > I'm a bit late, but ... next time: for experienced users > it should be possible, too, to just change the uid/gid > of the backuppc user in the new system's /etc/passwd and /etc/group > files. > Then the chown is just needed for the config files in e.g. > /etc/backuppc. AFAIR I did it like this some time ago. > > Since, for large volumes of the backuppc pool, the chown could take > some time. And it is not always a good idea to touch all the files in > the pool. > > F. > > > > _______________________________________________ > BackupPC-users mailing list > Bac...@li... > List: https://lists.sourceforge.net/lists/listinfo/backuppc-users > Wiki: https://github.com/backuppc/backuppc/wiki > Project: https://backuppc.github.io/backuppc/ |
|
From: Falko T. <ne...@tr...> - 2026-03-18 17:52:11
|
Hello Dan, Dan Johansson via BackupPC-users schrieb am 18.03.26 um 16:02: > Some time ago, someone posted a script (or part thereof) for backing up > a Windows 11 host with BackupPC using native Windows tools. > > I can not find the post/script among my emails π > > Could someone please tell me where I can find those scripts? > may be you mean this? > That's what I did more than five years ago, too - but we didn't use > samba client for Windows backups with BackupPC, but this: > > https://www.michaelstowe.com/backuppc/ > > and I remember it working very well. Especially that it did > use shadow copies. I think we used Win10 these days. > > But later we switched to Urbackup client. > > If you don't have recent Windows versions at hand ... > Usually it is possible to get some Windows developer edition VMs. > Ah, we did use this: https://github.com/xdissent/ievms/ > > Maybe some newer fixes are in the forks or pull requests like: > https://github.com/xdissent/ievms/pull/343 > > Just my 2Β’ > Falko Link to the archive is here: https://sourceforge.net/p/backuppc/mailman/backuppc-users/?viewmonth=202509&viewday=28 Another thread about Win 11 Backup was here: https://sourceforge.net/p/backuppc/mailman/backuppc-users/thread/DM6PR20MB344338A189BF8697A04E4CC8DD242%40DM6PR20MB3443.namprd20.prod.outlook.com/#msg58747558 Best regards Falko |
|
From: Falko T. <ne...@tr...> - 2026-03-18 17:47:09
|
G.W. Haywood schrieb am 18.03.26 um 14:41: > The fuss about malicious changes to code on Github has probably not > escaped your attention. > > ... > > In addition I have also scanned these repositories for indicators of > compromise that have been published in some of the incident reports, > plus a couple more of my own devising. > > You'll be pleased to know that no indicator of compromise was found. > Thank you very much for your attention to these threats and that you care about security of this application. I really appreciate this. Have a good time F. |
|
From: Falko T. <ne...@tr...> - 2026-03-18 17:46:21
|
Hello Matthias, Matthias--- via BackupPC-users schrieb am 09.03.26 um 18:48: > chown -R backuppc:backuppc /var/log/backuppc was the missing linkπ οΈ > > Sorry for asking a few minutes too early. I'm a bit late, but ... next time: for experienced users it should be possible, too, to just change the uid/gid of the backuppc user in the new system's /etc/passwd and /etc/group files. Then the chown is just needed for the config files in e.g. /etc/backuppc. AFAIR I did it like this some time ago. Since, for large volumes of the backuppc pool, the chown could take some time. And it is not always a good idea to touch all the files in the pool. F. |
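A rough sketch of that approach, with example uid/gid numbers only; read the real values off the existing pool with 'ls -ln' before changing anything.

# Suppose 'ls -ln' on the old pool shows files owned by 112:117 - make the
# new system's backuppc user and group match those numbers.
usermod  -u 112 backuppc
groupmod -g 117 backuppc

# Only the configuration files still need re-owning; the pool itself is untouched.
chown -R backuppc:backuppc /etc/backuppc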
|
From: Dan J. <bac...@dm...> - 2026-03-18 15:02:53
|
Hello,

Some time ago, someone posted a script (or part thereof) for backing up a Windows 11 host with BackupPC using native Windows tools.

I can not find the post/script among my emails. Could someone please tell me where I can find those scripts?

Regards,
-- 
Dan Johansson,
***************************************************
This message is printed on 100% recycled electrons!
*************************************************** |