From: Craig B. <cba...@us...> - 2010-03-11 17:24:05
shalauras writes:

> I'm making backups from some servers daily, but I have a problem with one server. The client takes up around 50G, almost all of it in one directory (49G).
>
> When I try to make a backup, I get an error while it is copying the 49G directory:
> "Out of memory during "large" request .... /usr/...backuppc/.../fileZIO.pm"

Can you include the full error? Also, what versions of zlib and Compress::Zlib are you using? How many files are in this directory?

BackupPC stores all the file attribute information for each directory in a single file, and it needs to be able to decompress and fit that information in memory. However, it should only be at most a few hundred bytes per file, so one directory would have to have a huge number of files for this to be a problem.

Another possibility is that some compressed file is corrupted and it is causing uncompress to break.

Craig
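P.S. For a rough sanity check of the file count, something along these lines should work (the path is only a placeholder for the actual 49G directory on the client):

    perl -e 'opendir(my $d, "/path/to/that/directory") or die; print scalar(grep { !/^\.\.?$/ } readdir($d)), "\n";'

At a few hundred bytes of attribute data per entry, that directory would need on the order of several million files before the decompressed attribute data reached the gigabyte range.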