From: Les M. <le...@fu...> - 2006-11-27 02:12:20
On Sun, 2006-11-26 at 19:48, Adam Goryachev wrote:
> I've recently installed backuppc on my backup server, and slowly started
> to try and back up various machines. I started with my windows laptop via
> rsyncd, and recently added a couple of linux servers via rsync over ssh.
>
> I've had a lot of problems attempting to back up the windows machine, and
> sort of decided that it might just be a windows-related problem, and
> asked a few questions on this list for help (and not had a reply yet).
>
> Anyway, while trying to debug why it was also having problems backing up
> a linux server, I noticed that the backup processes were using a LOT of
> memory (RAM), and that they seemed to be dying due to insufficient memory:
>
> Out of Memory: Killed process 7339 (BackupPC_dump).
>
> So, I am now wondering how to calculate the memory requirements of
> backuppc. The machine I am using only has 128MB RAM, but I assume that
> we don't need enough memory to hold the contents of the largest file, or
> else most people wouldn't be able to back up large files (i.e. 2GB or
> more)...
>
> Is it based on the number of files needing to be backed up, or some
> combination? Perhaps a certain amount of memory for each 'block' of the
> file that is being transferred?

Rsync builds the entire directory tree in memory and sends it to the
remote side, with a certain amount of RAM overhead taken for each file.
I'm not sure about the exact size, but you can work around the problem by
breaking large runs up into separate filesystems or subdirectories, or by
switching to the tar or smb methods.

-- 
  Les Mikesell
  le...@fu...
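[A rough back-of-the-envelope sketch of the point above: pre-3.0 rsync
holds the whole file list in RAM, commonly estimated at very roughly 100
bytes per entry (the real figure varies with path lengths and options).
The target path and the 100-byte figure here are assumptions for
illustration, not measurements.]

```shell
# Estimate rsync's file-list memory for a backup target.
# TARGET is an assumed example path; adjust to your share.
TARGET=${TARGET:-/home}
# Count all entries (files, dirs, links) on this filesystem only,
# mirroring what rsync would enumerate with --one-file-system.
NUM_ENTRIES=$(find "$TARGET" -xdev 2>/dev/null | wc -l)
# ~100 bytes per entry is a rough rule of thumb, not an exact figure.
EST_MB=$(( NUM_ENTRIES * 100 / 1024 / 1024 ))
echo "$NUM_ENTRIES entries -> roughly ${EST_MB} MB of rsync file-list RAM"
```

If the estimate approaches the 128MB on the machine in question, that is
a sign to split the run into smaller shares as suggested above.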