Bacula-users

Re: [Bacula-users] Catalog backup while job running?

From: Stephen Thompson <stephen AT seismo.berkeley DOT edu>
To: Phil Stracchino <alaric AT metrocast DOT net>
Date: Mon, 02 Apr 2012 10:49:38 -0700
On 02/06/2012 02:45 PM, Phil Stracchino wrote:
> On 02/06/2012 05:02 PM, Stephen Thompson wrote:
>> So, my question is whether anyone had any ideas about the feasibility of
>> getting a backup of the Catalog while a single "long-running" job is
>> active?  This could be in-band (database dump) or out-of-band (copy of
>> database directory on filesystem or slave database server taken
>> offline).  We are using MySQL, but would not be opposed to switching to
>> PostGRES if it buys us anything in this regard.
>>
>> What I wonder specifically (in creating my own solution) is:
>> 1) If I backup the MySQL database directory, or sync to a slave server
>> and create a dump from that, am I simply putting the active
>> "long-running" job records at risk of being incoherent, or am I risking
>> the integrity of the whole Catalog in doing so?
>> 2) If I attempt a dump of the MySQL catalog and lock the tables while
>> doing so, what will the results be to the active "long-running" job?
>> Will it crap out or simply pause and wait for database access when it
>> needs to read/write to the database?  And if so, how long will it wait?
>
> Stephen,
> Three suggestions here.
>
> Route 1:
> Set up a replication slave and perform your backups from the slave.  If
> the slave falls behind the master while you're dumping the DB, you don't
> really care all that much.  It doesn't impact your production DB.
>
> Route 2:
> If you're not using InnoDB in MySQL, you should be by now.  So look into
> the --skip-opt and --single-transaction options to mysqldump to dump all
> of the transactional tables consistently without locking them.  Your
> grant tables will still need a read lock, but hey, you weren't planning
> on rewriting your grant tables every day, were you...?
>
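(For the archives: a Route 1 slave dump can be scripted fairly simply. The 
sketch below is mine, not Phil's; the host name and backup path are 
placeholders, not anything from this thread.)

```shell
#!/bin/sh
# Sketch: dump the Bacula catalog from a replication slave so the
# production master is never locked.  Pausing the slave's SQL thread
# makes the dump a consistent point-in-time copy; replication simply
# catches up again afterwards.
SLAVE=db-slave.example.com   # placeholder host

mysql -h "$SLAVE" -e 'STOP SLAVE SQL_THREAD;'
mysqldump -h "$SLAVE" --single-transaction bacula > /backup/bacula.sql
mysql -h "$SLAVE" -e 'START SLAVE SQL_THREAD;'
```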


Well, we've made the leap from MyISAM to InnoDB; it seems like we win on 
transactions, but lose on read speed.

That aside, I'm seeing something unexpected.  I am now able to 
successfully run jobs while I use mysqldump to dump the bacula Catalog, 
except at the very end of the dump there is some sort of contention.  A 
few of my jobs (3-4 out of 150) that are attempting to despool 
attributes at the tail end of the dump yield this error:

Fatal error: sql_create.c:860 Fill File table Query failed: INSERT INTO 
File (FileIndex, JobId, PathId, FilenameId, LStat, MD5, DeltaSeq) SELECT 
batch.FileIndex, batch.JobId, Path.PathId, 
Filename.FilenameId,batch.LStat, batch.MD5, batch.DeltaSeq FROM batch 
JOIN Path ON (batch.Path = Path.Path) JOIN Filename ON (batch.Name = 
Filename.Name): ERR=Lock wait timeout exceeded; try restarting transaction

I have successful jobs before and after this 'end of the dump' timeframe.

It looks like I might be able to "fix" this by increasing my 
innodb_lock_wait_timeout, but I'd like to understand WHY I need to 
increase it.  Anyone know what's happening at the end of a dump like 
this that would cause the above error?
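(For reference, the timeout is a [mysqld] server setting; the stock default 
is 50 seconds, and the 120 below is just an illustrative value, not a 
recommendation from this thread.)

```ini
[mysqld]
# Seconds an InnoDB transaction waits for a row lock before failing
# with "Lock wait timeout exceeded"; the default is 50.
innodb_lock_wait_timeout = 120
```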

mysqldump -f --opt --skip-lock-tables --single-transaction bacula >> bacula.sql

Is it the commit on this 'dump' transaction?

thanks!
Stephen





> Route 3:
> Look into an alternate DB backup solution like mydumper or Percona
> XtraBackup.
>
> Route 4:
> Do you have the option of taking a snapshot of your MySQL datadir and
> backing up the snapshot?  This can be viable if you have a small DB and
> fast copy-on-write snapshots.  (It's the technique I'm using at the
> moment, though I'm considering a switch to mydumper.)
>
>
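(For anyone weighing Route 4, here's an LVM-flavored sketch; the volume 
group, LV, mount point, and backup path are all made up.  Note that with 
InnoDB a bare filesystem snapshot is only crash-consistent: the restored 
copy recovers as if from a power loss.)

```shell
#!/bin/sh
# Sketch: snapshot the volume holding the MySQL datadir, then back up
# the snapshot at leisure and drop it.  All names are placeholders.
lvcreate --snapshot --size 2G --name mysql-snap /dev/vg0/mysql
mount -o ro /dev/vg0/mysql-snap /mnt/mysql-snap
tar czf /backup/mysql-datadir.tar.gz -C /mnt/mysql-snap .
umount /mnt/mysql-snap
lvremove -f /dev/vg0/mysql-snap
```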


-- 
Stephen Thompson               Berkeley Seismological Laboratory
stephen AT seismo.berkeley DOT edu    215 McCone Hall # 4760
404.538.7077 (phone)           University of California, Berkeley
510.643.5811 (fax)             Berkeley, CA 94720-4760

------------------------------------------------------------------------------
_______________________________________________
Bacula-users mailing list
Bacula-users AT lists.sourceforge DOT net
https://lists.sourceforge.net/lists/listinfo/bacula-users