From: Michał B. <mic...@ge...> - 2010-06-16 11:50:21
> > We plan to use MooseFS in our production environment as the storage
> > for our online photo service.
> >
> > But since the master server stores all the metadata in memory, and we
> > will store about a hundred million photo files, I wonder how much
> > memory we should prepare for the master server, and how to calculate
> > this number.
>
> According to the FAQ, http://www.moosefs.org/moosefs-faq.html#cpu , at
> Gemius 8GB of RAM is used by the master for 25 million files. So, for a
> hundred million, you'd need 32GB.

[MB] Yes, that's right. 32GB would be enough to keep the metadata in RAM,
but for the whole system to work smoothly you would preferably need
40-48GB of RAM in the master server.

> > If the memory is not enough, what will happen to the master server?
>
> I guess it'll swap.

[MB] The performance of the whole system would be substantially lower.

> > I also wonder about the performance of the master server when using
> > MooseFS to store a hundred million photo files. Can anyone give me
> > some more information?
>
> I've no experience with such a large setup. I guess the memory caching
> used to prevent a bottleneck on the master will still do the trick.

[MB] With 40-48GB of RAM in the master server the system would have no
problems with performance or stability.

If you need any further assistance please let us know.

Kind regards

Michał Borychowski
MooseFS Support Manager
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
Gemius S.A.
ul. Wołoska 7, 02-672 Warszawa
Budynek MARS, klatka D
Tel.: +4822 874-41-00
Fax : +4822 874-41-01

> regards,
> --
> Laurent Wandrebeck
> HYGEOS, Earth Observation Department / Observation de la Terre
> Euratechnologies
> 165 Avenue de Bretagne
> 59000 Lille, France
> tel: +33 3 20 08 24 98
> http://www.hygeos.com
> GPG fingerprint/Empreinte GPG: F5CA 37A4 6D03 A90C 7A1D 2A62 54E6 EF2C
> D17C F64C
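[Editor's note: the sizing in this thread is a linear extrapolation from the
FAQ figure of 8GB per 25 million files (roughly 340 bytes of master RAM per
file). A minimal sketch of that arithmetic follows, assuming linear scaling
and a headroom factor; the 1.4x headroom and the helper name are illustrative
only, and real per-file cost varies with path lengths and chunk counts.]

    # Rough MooseFS master RAM estimate, assuming metadata cost scales
    # linearly with file count (8 GiB per 25 million files, per the FAQ).
    BYTES_PER_FILE = 8 * 1024**3 / 25000000.0   # ~344 bytes per file

    def master_ram_gib(num_files, headroom=1.4):
        # headroom (assumed value) covers growth and working memory
        # beyond the bare metadata, e.g. during metadata dumps.
        return num_files * BYTES_PER_FILE * headroom / 1024**3

    # 100 million files: ~32 GiB of metadata, ~45 GiB with headroom,
    # consistent with the 40-48GB recommendation above.
    print(master_ram_gib(100000000))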