From: Jonathan H. <jh...@sk...> - 2012-11-29 22:23:35
|
If I have, say, 2 database servers, can I set Bacula to ensure they are not
backed up at the same time? Even if they are the last 2 jobs running, I'd
like to not back them both up simultaneously. Is this possible?

Thanks,
Jonathan
|
From: Dan L. <da...@la...> - 2012-11-29 22:48:44
|
On Nov 29, 2012, at 4:56 PM, Jonathan Horne wrote:
> If i have say, 2 data base servers, can i set bacula to ensure they are
> not being backed up at the same time? Even if they are the last 2 jobs
> running, id like to not back them both up simultaneously.

First guess: Set them as different priorities.

> Is this possible?

Read up on Job Priority.

--
Dan Langille - http://langille.org
|
From: Rodrigo R. B. <rod...@gm...> - 2012-12-05 11:27:04
|
2012/11/29 Dan Langille <da...@la...>
> On Nov 29, 2012, at 4:56 PM, Jonathan Horne wrote:
>> If i have say, 2 data base servers, can i set bacula to ensure they are
>> not being backed up at the same time? Even if they are the last 2 jobs
>> running, id like to not back them both up simultaneously.
>
> First guess: Set them as different priorities.

That's what I'd do. If I'm not mistaken, the default Priority is 10, so I'd
set the priority for those 2 DB servers to 9 and 8 respectively (or 11 and
12, depending on when you want these jobs to run -- before or after everyone
else).
|
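[Editor's note: Rodrigo's suggestion could be sketched in bacula-dir.conf roughly as below. The Job and Client names and the JobDefs are illustrative assumptions, not from the thread; the key behavior is that Bacula runs lower Priority numbers first and by default does not run jobs of different priorities concurrently, so these two jobs can never overlap.]

```conf
# Hypothetical Job resources (names are illustrative assumptions).
# Bacula executes lower Priority values first and, by default, will not
# mix jobs of different priorities, so these two never run simultaneously.
Job {
  Name = "backup-db1"
  JobDefs = "DefaultJob"     # assumed shared defaults
  Client = "db1-fd"
  Priority = 8               # runs before anything at priority 9 or higher
}
Job {
  Name = "backup-db2"
  JobDefs = "DefaultJob"
  Client = "db2-fd"
  Priority = 9               # starts only after all priority-8 jobs finish
}
```

Note that setting Allow Mixed Priority = yes on a job changes this behavior and would let jobs of different priorities run together.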
From: ccspro <bac...@ba...> - 2012-12-11 21:47:32
|
2012/11/29 Dan Langille <da...@la...>
> On Nov 29, 2012, at 4:56 PM, Jonathan Horne wrote:
>> If i have say, 2 data base servers, can i set bacula to ensure they are
>> not being backed up at the same time? Even if they are the last 2 jobs
>> running, id like to not back them both up simultaneously.
>
> First guess: Set them as different priorities.

> That's what I'd do. If I'm not mistaken, the default Priority is 10, so
> I'd set the priority for those 2 DB servers to 9 and 8 respectively (or
> 11 and 12, depending on when you want these jobs to run -- before or
> after everyone else).

And make sure your Catalog backup has the lowest priority (say 99) so it
will be completed after all Jobs are done for the day.
|
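[Editor's note: a sketch of what that looks like, modeled on the stock sample BackupCatalog job shipped with Bacula; the FileSet name and the catalog-dump script paths vary by install and are assumptions here.]

```conf
# Hedged sketch based on Bacula's sample BackupCatalog job; script paths
# and FileSet name depend on your install and are assumptions.
Job {
  Name = "BackupCatalog"
  JobDefs = "DefaultJob"
  Level = Full
  FileSet = "Catalog"
  # Dump the catalog to a file just before backing it up, then clean up.
  RunBeforeJob = "/etc/bacula/scripts/make_catalog_backup.pl MyCatalog"
  RunAfterJob  = "/etc/bacula/scripts/delete_catalog_backup"
  Priority = 99              # numerically lowest priority: runs last
}
```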
From: Phil S. <al...@me...> - 2012-12-12 00:11:45
|
On 12/11/12 16:47, ccspro wrote:
> And make sure your Catalog backup has the lowest priority (say 99) so
> it will be completed after all Jobs are done for the day.

Personally, I don't bother with a catalog backup job separate from my DB
backups, since my Bacula catalog is by far the largest schema in my DB
anyway (96% of both total application data and total application data
rows). I just back up the DB last and have redundant, replicated DB
servers.

--
Phil Stracchino, CDK#2 DoD#299792458 ICBM: 43.5607, -71.355
al...@ca... al...@me... ph...@co...
Renaissance Man, Unix ronin, Perl hacker, SQL wrangler, Free Stater
It's not the years, it's the mileage.
|
From: Dan L. <da...@la...> - 2012-12-12 00:44:52
|
On Dec 11, 2012, at 7:11 PM, Phil Stracchino wrote:
> On 12/11/12 16:47, ccspro wrote:
>> And make sure your Catalog backup has the lowest priority (say 99) so
>> it will be completed after all Jobs are done for the day.
>
> Personally, I don't bother with a catalog backup job separate from my DB
> backups, since my Bacula catalog is by far the largest schema in my DB
> anyway (96% of both total application data and total application data
> rows). I just back up the DB last and have redundant, replicated DB
> servers.

I dump my DB to a plain text file daily. I then rsync that file, and the
*.conf files, to three other servers. Two offsite, one on-site.

That file also gets backed up.

--
Dan Langille - http://langille.org
|
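[Editor's note: Dan's routine could be scripted roughly as follows. The paths, host names, and use of pg_dump are assumptions (mysqldump works the same way); the rsync commands are echoed rather than executed, and a stub dump is written when no live database is reachable, so the sketch runs anywhere.]

```shell
# Rough sketch of a daily catalog-dump-and-replicate job.
# DUMP_DIR, host names, and the database name "bacula" are assumptions.
set -eu
DUMP_DIR="${DUMP_DIR:-$(mktemp -d)}"
DUMP_FILE="$DUMP_DIR/bacula-catalog-$(date +%F).sql"

# Dump the catalog to plain text; fall back to a stub so the sketch
# still runs on a host without a live database.
pg_dump bacula > "$DUMP_FILE" 2>/dev/null || \
    echo "-- placeholder: no live catalog on this host" > "$DUMP_FILE"

# Push the dump and the Bacula configs to one on-site and two off-site
# mirrors (hypothetical host names); echoed here instead of executed.
for target in onsite1 offsite1 offsite2; do
    echo rsync -az "$DUMP_FILE" /etc/bacula/*.conf "$target:/srv/bacula-dr/"
done
```

Run from cron once a day, this gives the same result Dan describes: a plain-text catalog and the configs mirrored in three places.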
From: Phil S. <al...@me...> - 2012-12-12 02:34:35
|
On 12/11/12 19:44, Dan Langille wrote:
> On Dec 11, 2012, at 7:11 PM, Phil Stracchino wrote:
>> On 12/11/12 16:47, ccspro wrote:
>>> And make sure your Catalog backup has the lowest priority (say 99) so
>>> it will be completed after all Jobs are done for the day.
>>
>> Personally, I don't bother with a catalog backup job separate from my
>> DB backups, since my Bacula catalog is by far the largest schema in my
>> DB anyway (96% of both total application data and total application
>> data rows). I just back up the DB last and have redundant, replicated
>> DB servers.
>
> I dump my DB to a plain text file daily. I then rsync that file, and the
> *.conf files to three others servers. Two offsite, one on-site.

Yup. Dump the DB to text that's retained for 15 days, back up the dumps
directory, compress and archive offsite.

Including the Bacula .conf files with the DB dumps is a good idea though.
I'll have to add that to my system.

--
Phil Stracchino, CDK#2 DoD#299792458 ICBM: 43.5607, -71.355
al...@ca... al...@me... ph...@co...
Renaissance Man, Unix ronin, Perl hacker, SQL wrangler, Free Stater
It's not the years, it's the mileage.
|
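[Editor's note: the 15-day retention can be handled with dated filenames plus `find -mtime`. A minimal sketch; the directory and naming scheme are assumptions, and an empty stub stands in for the real dump so the sketch runs anywhere.]

```shell
# Keep 15 days of dated catalog dumps, pruning anything older.
# DUMP_DIR and the file-naming scheme are assumptions.
set -eu
DUMP_DIR="${DUMP_DIR:-$(mktemp -d)}"
TODAY="$DUMP_DIR/bacula-$(date +%F).sql.gz"

# In practice today's dump would be produced with something like:
#   pg_dump bacula | gzip > "$TODAY"
: > "$TODAY"                                  # stand-in for today's dump

# Remove dumps whose modification time is older than 15 days.
find "$DUMP_DIR" -name 'bacula-*.sql.gz' -mtime +15 -delete
```

Backing up $DUMP_DIR then covers the whole 15-day window in one FileSet entry.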
From: Josh F. <jf...@pv...> - 2012-12-12 16:12:17
|
On 12/11/2012 7:44 PM, Dan Langille wrote:
> I dump my DB to a plain text file daily. I then rsync that file, and the
> *.conf files to three others servers. Two offsite, one on-site.
>
> That file also gets backed up.

I run the DB, as well as bacula-dir and bacula-sd, as KVM VMs in a 2-node
Corosync/Pacemaker cluster, and so use a different approach. Each VM is
installed on a separate DRBD device and uses local LVM volumes on each node
for /tmp, spool area, etc.

Periodically, the VMs are taken down, the DRBD devices are dd'd to USB hard
drives, and the VMs' libvirt XML definitions are copied alongside them --
one copy in a local fire safe and an offsite copy. These VM backups
correspond to the offsite tapes and catalog as of the date/time the offsite
backups are written. So, for true disaster recovery (i.e. fire, tornado,
etc.), it is simply a matter of dd'ing the VM backups to DRBD devices on the
new hardware and recreating the VMs using the backed-up libvirt XML files.
The catalog and everything else will be current with the offsite backups.

For recovery in the event of less disastrous problems, like corruption of
the database, I daily copy a text dump of the DB to another local server
using NFS.
|
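[Editor's note: the cold-backup step Josh describes might look roughly like this for one VM. The device path, domain name, and destination are all placeholder assumptions, the virsh calls are left commented out, and a fallback lets the sketch run on a machine with no DRBD device.]

```shell
# Illustrative cold backup of one VM: dd its DRBD backing device to a USB
# drive and save the libvirt XML next to it. All names are placeholders.
set -eu
DEST="${DEST:-$(mktemp -d)}"        # stands in for the mounted USB drive
DEV="${DEV:-/dev/drbd0}"            # the VM's backing device (assumption)
VM="${VM:-bacula-dir}"              # libvirt domain name (assumption)

# virsh shutdown "$VM"              # quiesce the guest before copying
dd if="$DEV" of="$DEST/$VM.img" bs=4M conv=fsync 2>/dev/null || \
    echo "demo: $DEV not readable on this host" > "$DEST/$VM.img"
# virsh dumpxml "$VM" > "$DEST/$VM.xml"   # definition needed to recreate it
# virsh start "$VM"
```

Restoring is the reverse: dd the image back onto a DRBD device on the new hardware, then `virsh define` the saved XML.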
From: ccspro <bac...@ba...> - 2012-12-12 00:37:20
|
I agree with you here -- replication is the better way to go, but not
everyone does that. In the next few weeks I'll be doing the same thing as
you, since I've finally decreased my DB size (I started to become more
strict about client systems and keeping things clean, while at the same
time upgrading everything to 5.2+ [5.2.12 on the director/sd side]).

Also, I piggyback on the DB dump script to issue reports when run -- so if
I come into the office and see the report(s) didn't come in ... time to
investigate :P

> On 12/11/12 16:47, ccspro wrote:
>> And make sure your Catalog backup has the lowest priority (say 99) so
>> it will be completed after all Jobs are done for the day.
>
> Personally, I don't bother with a catalog backup job separate from my DB
> backups, since my Bacula catalog is by far the largest schema in my DB
> anyway (96% of both total application data and total application data
> rows). I just back up the DB last and have redundant, replicated DB
> servers.
|
From: ccspro <bac...@ba...> - 2012-12-12 02:59:16
|
Or better yet, set up an SVN or Git repository for your config file
changes? Just an idea. We do this for some of our router, switch, and
network configs.

> Yup. Dump the DB to text that's retained for 15 days, back up the dumps
> directory, compress and archive offsite.
>
> Including the Bacula .conf files with the DB dumps is a good idea though.
> I'll have to add that to my system.
|
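[Editor's note: putting the config directory under Git takes only a few commands. A minimal sketch, assuming the configs live in $BACULA_ETC; here it defaults to a throwaway demo directory with a stub file so the sketch runs anywhere, but in practice it would point at /etc/bacula.]

```shell
# Version-control Bacula *.conf files in a local Git repository.
# BACULA_ETC defaults to a demo directory; the identity is a placeholder.
set -eu
BACULA_ETC="${BACULA_ETC:-$(mktemp -d)}"
[ -e "$BACULA_ETC/bacula-dir.conf" ] || : > "$BACULA_ETC/bacula-dir.conf"

cd "$BACULA_ETC"
git init -q
git config user.name  "Config Archiver"      # placeholder identity
git config user.email "backup@example.org"
git add -- *.conf
git commit -q -m "config snapshot $(date +%F)"
```

A cron entry (or the DB dump script itself) could run the add/commit step daily; `git log` then shows exactly when each config change landed.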