From: dan <dan...@gm...> - 2009-01-11 17:03:50
"Trivial" amounts become important amounts when large numbers are involved. I am not sure how long the memory remains allocated, but at 44kB per file this will add up quickly. Regardless, given the space requirements of the OP and the drives they are using, just skip compression.

I would mention that compression can in some circumstances IMPROVE disk performance. If you have a fast CPU and a slow disk, it can be faster to compress the data so that fewer writes go to the disk. Also, since they will have fairly slow data growth, the first backup is likely to take quite a while, but the following backups will be pretty quick if using rsync.

That 512MB of RAM is more than enough. Just watch your memory use with 'free' while doing a backup and you will see the usage. If you don't run a GUI on the Linux install, you will likely never use more than 128MB of RAM. Your Linux system will use the balance for cache, but that isn't 'required' usage and is flushed as needed.

On Sun, Jan 11, 2009 at 4:36 AM, <tm...@ob...> wrote:
> dan <dan...@gm...> wrote on 01/10/2009 11:50:57 PM:
>
> > In reply to tmassey's comments. IF you do NOT use compression, your
> > RAM requirements shrink significantly along with your CPU needs.
> > Remember, BackupPC is all about I/O. If you take away the one thing
> > that really needs a fast CPU (compression), you are only I/O bound.
>
> I have a philosophical problem with compression and backups (my goal is to
> have my backups as simple as possible), so I've never used compression. So
> take what I'm going to say with a grain of salt. But:
>
> My understanding of the zlib (gzip) algorithm is that it requires a fixed
> (and small) amount of memory for both compression and decompression.
> Compression requires more memory than decompression, but both of them are
> trivial amounts by today's standards (measured in *kilobytes*). Therefore,
> compression should not affect memory requirements in a measurable way.
> (Reference: http://www.gzip.org/zlib/zlib_tech.html)
>
> CPU, of course, is obviously needed, but I mentioned that in the original
> mail.
>
> In short, the only thing that requires RAM is the file list, whether
> compression is used or not. Like I said, I've done servers with ~250GB of
> data and half a million files with 512MB of RAM, of which more than 100MB
> was still being used as cache, and exactly zero bytes of swap.
>
> So unless you're going to have servers with literally millions of files
> (e.g. mail or Usenet servers that store each message as a separate file),
> 512MB of RAM is plenty. It's not like disk cache is going to help you on
> your backup server: it's merely streaming data to disk. 100MB of disk
> cache is *plenty* to hold the metadata it'll need...
>
> Tim Massey
>
> _______________________________________________
> BackupPC-users mailing list
> Bac...@li...
> List: https://lists.sourceforge.net/lists/listinfo/backuppc-users
> Wiki: http://backuppc.wiki.sourceforge.net
> Project: http://backuppc.sourceforge.net/
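A footnote on the zlib memory claim above, for anyone who wants to verify it: per the zlib tech notes Tim references, deflate uses roughly `(1 << (wbits + 2)) + (1 << (memLevel + 9))` bytes (~256 KB at the defaults `wbits=15`, `memLevel=8`) and inflate about `1 << wbits` (~32 KB), independent of how much data flows through. A minimal Python sketch (the sample data is made up, and the comments restate the zlib documentation's figures, not measurements):

```python
import zlib

# Hypothetical stand-in for a file's contents; repetitive, so it compresses well.
data = b"BackupPC pool file contents\n" * 4096  # ~112 KB

# Compress with explicit window/memory settings. Per the zlib tech notes,
# deflate needs about (1 << (wbits + 2)) + (1 << (memLevel + 9)) bytes,
# i.e. roughly 256 KB at these defaults, no matter how large the stream is.
co = zlib.compressobj(level=6, wbits=15, memLevel=8)
compressed = co.compress(data) + co.flush()

# Decompress; inflate needs only about 1 << wbits (~32 KB) plus a small
# fixed overhead, again independent of stream size.
restored = zlib.decompress(compressed, wbits=15)

assert restored == data
print(len(data), len(compressed))
```

So even a few dozen concurrent backup streams would need only a handful of megabytes for compression state; the real memory consumer, as the thread says, is the rsync file list.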