From: Les M. <les...@gm...> - 2013-11-15 22:59:16
On Thu, Nov 14, 2013 at 7:23 AM, Henry Burroughs <hbu...@st...> wrote:
> I am a new BackupPC user and very impressed with it. I am now trying
> to figure out if I should just let BackupPC back up all files and
> folders on my Windows and Linux boxes, or just subsets.
>
> One constraint is I need to keep the cpool manageable to replicate it
> offsite. (I am aware of the dangers with the pc directory, so I am
> planning on using the BackupPC tar command to create gz files of the
> pc directory.)
>
> Local disk space is not an issue, but bandwidth and replication time
> are. Thanks!

What is the actual plan for the offsite set? The really simple solution
to this problem is to run an independent instance of BackupPC at a
different location, perhaps using OpenVPN to get remote access to the
network. You might need to bring that server onsite for the initial full
runs, but subsequent runs over rsync are very efficient.

If you are planning to have one instance and rsync the pool, there are
some size constraints - but I'm not sure anyone knows exactly what is
'too big' or 'too many links'.

--
Les Mikesell
les...@gm...
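P.S. As an illustration of the tar-to-gz plan mentioned above, here is a
minimal sketch. The install path of BackupPC_tarCreate, the host and share
names, and the output location are all assumptions for your setup; it prints
the pipeline as a dry run rather than executing it.

```shell
# Sketch: archive the most recent backup of one host as a compressed tarball
# for offsite shipping. HOST, SHARE, OUT and the BackupPC_tarCreate path are
# assumptions -- adjust them for your install.
HOST=winbox                     # hypothetical host name
SHARE=/                         # share to dump
# -h selects the host, -n -1 selects the most recent backup, -s the share.
CMD="/usr/share/backuppc/bin/BackupPC_tarCreate -h $HOST -n -1 -s $SHARE ."
OUT="/archive/${HOST}-latest.tar.gz"

# Dry run: print the pipeline; drop the echo to actually run it (as the
# backuppc user, so it can read the pool).
echo "$CMD | gzip > $OUT"
```

Keep in mind this produces a plain tar stream, so you lose the pooling and
hard links - which is exactly why it is safer to replicate than the pc
directory itself.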