Subject: Re: [Bacula-users] Bacula MySQL Catalog binlog restore
From: Joe Nyland <joe AT joenyland.co DOT uk>
To: Bacula Users <Bacula-users AT lists.sourceforge DOT net>
Date: Thu, 5 Apr 2012 23:19:00 +0100
On 5 Apr 2012, at 22:37, Stephen Thompson wrote:

> On 04/05/2012 02:27 PM, Joe Nyland wrote:
>> Hi,
>> 
>> I've been using Bacula for a while now and I have a backup procedure in 
>> place for my MySQL databases, where I perform a full (dump) backup nightly, 
>> then incremental (bin log) backups every hour through the day to capture 
>> changes.
>> 
>> I basically have a script I've written which is run as a 'RunBeforeJob' by 
>> the backup job, and it runs a mysqldump if the backup level is full, or 
>> flushes the bin logs if the level is incremental.
>> 
>> I'm in the process of performing some test restores from these backups, as I 
>> would like to confirm that the procedure is working correctly.
>> 
>> I have no issue restoring the files from Bacula, however I'm having some 
>> issues restoring my catalog MySQL database from the binary logs created by 
>> MySQL. Specifically, I am getting messages like:
>> 
>>      ERROR 1146 (42S02) at line 105: Table 'bacula.batch' doesn't exist
>> 
>> when I try to replay my log files against the database after it's been 
>> restored from the dump file. As far as I know, the batch table is a temporary 
>> table created when inserting file attributes into the catalog during/after a 
>> backup job. I would have hoped, however, that the creation of this table 
>> would have been included either in my database dump or earlier in my bin log.
>> 
>> I believe this may be related to another thread on the list at the moment, 
>> titled "Catalog backup while job running?", as this is, in effect, what I am 
>> doing - a full database dump whilst other jobs are running. My reason for 
>> creating a new thread is that, unlike the OP of that thread, I am not getting 
>> any errors in my backup jobs - I'm simply having issues rebuilding my 
>> database after restoring said full dump.
>> 
>> I would like to know if anyone is currently backing up their catalog 
>> database in such a way, and if so how they are overcoming this issue when 
>> restoring. My reason for backing up my catalog using binary logging is so 
>> that I can perform a point-in-time recovery of the catalog, should I lose 
>> it.
>> 
> 
> 
> I am not running a catalog backup in that way, but have thought about it.
> 
> You're correct that the batch tables are temporary tables created so that 
> jobs can do batch inserts of the file attributes.
> 
> I did run into a similar problem to yours when I had a MySQL slave server out 
> of sync with the master.  The slave (much like your restore) was reading 
> through binlogs to catch up and ran into a line that referred to a batch 
> table, which didn't exist.  In my case, it didn't exist because the slave 
> never saw an earlier line that created the temporary batch table.
> 
> I would imagine something similar is going on with your restore, where you 
> are not actually applying all the changes since the Full dump (or did not 
> capture all the changes since the Full dump), because somewhere in your 
> binlogs there should be a line that creates the batch table before other 
> lines refer to and try to use it.
> 
> Also, keep in mind that these temporary batch tables are owned by threads, 
> so if you start looking through your binlogs, you'll see many references to 
> bacula.batch, but they are not all referring to the same table.  Each thread 
> is able to have its own bacula.batch table.
> 
> 
> Stephen
> 
> 
>> Any input anyone can offer would be greatly appreciated.
>> 
>> Thanks,
>> 
>> Joe
> 
> 
> -- 
> Stephen Thompson               Berkeley Seismological Laboratory
> stephen AT seismo.berkeley DOT edu    215 McCone Hall # 4760
> 404.538.7077 (phone)           University of California, Berkeley
> 510.643.5811 (fax)             Berkeley, CA 94720-4760

Hi Stephen,

Thank you very much for your reply.

I agree that it seems the creation of the batch table is not being captured, 
for some reason.
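
For what it's worth, the statement I'd expect to find somewhere in the bin log 
is the creation of the temporary table itself. From what I understand of 
Bacula's batch inserts, it would look very roughly like the following (the 
column list here is only illustrative - the exact schema will depend on the 
Bacula version and catalog backend):

        -- Illustrative sketch only: approximate shape of the temporary table
        -- that Bacula's batch-insert code creates per connection/thread.
        CREATE TEMPORARY TABLE batch (
            FileIndex INTEGER,
            JobId     INTEGER,
            Path      BLOB,
            Name      BLOB,
            LStat     TINYBLOB,
            MD5       TINYBLOB
        );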

As I think it may be useful, here's the line from my MySQL 'RunBeforeJob' 
script that runs when a full backup is taken:

        mysqldump --all-databases --single-transaction --delete-master-logs \
            --flush-logs --master-data --opt \
            -u ${DBUSER} -p${DBPASS} > ${DST}/${HOST}_${DATE}_${TIME}.sql.dmp

Can you spot anything there that could cause the creation of this/these 
temporary tables not to be included in the bin log? I've spent a while getting 
this list of options right and I'm not 100% sure I've got the correct 
combination, so it's possible I've missed something here.
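
In case it's useful, this is roughly how I've been checking and replaying the 
bin logs on the restored server (the binlog path and timestamp below are just 
examples from my setup, not anything standard):

        # Check whether the CREATE TEMPORARY TABLE statement ever made it into
        # the bin log (adjust the path/index to your own log directory).
        mysqlbinlog /var/lib/mysql/mysql-bin.000042 | grep -i 'CREATE TEMPORARY TABLE'

        # Point-in-time replay: apply everything logged up to a chosen
        # timestamp against the database restored from the full dump.
        mysqlbinlog --stop-datetime="2012-04-05 12:00:00" \
            /var/lib/mysql/mysql-bin.0000* | mysql -u ${DBUSER} -p${DBPASS}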

Thanks,

Joe