From: Mingus D. <sho...@gm...> - 2010-10-07 21:03:48
All, I am running Bacula 5.0.1 on Solaris 10 x86, with MySQL 4.1.22 as the database server. I do plan on upgrading to a compatible version of MySQL 5, but migrating to PostgreSQL isn't an option at this time.

I am trying to back up a very large number of files to tape for a client. While the data size is manageable at around 2TB, the number of files is incredibly large. The first of the jobs had 27 million files and initially failed because the batch table became "Full". I set myisam_data_pointer_size to 6 in the config, after which the job ran successfully and did not take too long.

I have another job with 42 million files. I'm not sure what that equates to in rows that need to be inserted, but I can say that I've not been able to run the job successfully: it hangs for over 30 hours in a "Dir inserting attributes" status. This causes other jobs to back up in the queue, and once it is canceled I have to restart Bacula. I'm looking for a way to boost the performance of MySQL or Bacula (or both) to get this job completed.

Thanks,
Shon
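For context, the myisam_data_pointer_size change mentioned above lives in my.cnf, and the same section is where MyISAM bulk-insert buffers can be raised for large attribute inserts like this. A minimal sketch; the buffer sizes are illustrative assumptions to be tuned to the server's RAM, not recommendations:

```ini
# my.cnf -- sketch only; sizes below are assumptions, size them to your hardware
[mysqld]
# 6-byte data pointers let MyISAM tables (including Bacula's batch table)
# grow well past the default size limit that caused the "Full" error.
myisam_data_pointer_size = 6

# Larger key buffer helps the index work done while the Director
# sits in "Dir inserting attributes".
key_buffer_size = 512M

# More buffering for bulk/batch inserts and the sorts behind index builds.
bulk_insert_buffer_size = 256M
myisam_sort_buffer_size = 256M
sort_buffer_size = 64M
```

After editing, the MySQL server has to be restarted for these to take effect, and note that changing myisam_data_pointer_size only affects tables created afterward (such as the temporary batch table), not existing ones.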