From: Bill M. <wm...@co...> - 2006-06-30 20:34:04
In response to jer...@un...:

> I have a particular host/job that has roughly 1.2 million files @
> 18 GB... that's one full backup.
>
> When running a restore, it took almost 28 hours for bacula to finish the
> "building directory for jobid blah", using 100% of one CPU. Once this is
> done, the actual file selection and restore operations are quite fast and
> normal. Note I was using the #5 option "most recent full backup".
>
> My question is: how can this initial building of the directory tree be
> sped up? A million files doesn't seem like a lot (I've seen folks post in
> the archives here about 6-million-file jobs)....

You say neither what kind of database back end you're using, nor do you
provide information on memory usage, IO, or other system utilization. These
factors play a big part in this kind of performance.

--
Bill Moran
Collaborative Fusion Inc.
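For anyone gathering the data Bill asks for, a quick snapshot of memory, load, and IO activity on a Linux director host might look like the following. This is only a sketch; tool availability varies by distro (iostat comes from the sysstat package), so each command falls back gracefully if missing:

```shell
#!/bin/sh
# Snapshot of system utilization while the "building directory tree"
# step runs. Run this on the Bacula director / database host.

# Memory and swap headroom (the tree build can be memory-hungry):
grep -E 'MemTotal|MemFree|SwapTotal|SwapFree' /proc/meminfo

# Load averages:
uptime

# CPU, swap, and block IO over a short sampling window, if available:
vmstat 1 3 2>/dev/null || echo "vmstat not available"

# Per-device IO utilization, if sysstat is installed:
iostat -x 1 3 2>/dev/null || echo "iostat not installed (sysstat package)"
```

Watching whether the database process is CPU-bound, swapping, or waiting on disk during the 28-hour build narrows down where the tuning effort should go.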