Subject: Re: [Bacula-users] Filling Database Table - very slow
From: Christian Manal <moenoel AT informatik.uni-bremen DOT de>
To: bacula-users AT lists.sourceforge DOT net
Date: Tue, 11 Oct 2011 15:06:18 +0200
On 11.10.2011 14:04, Jarrod Holder wrote:
> Bacula version 5.0.3
>  
> In BAT, when trying to restore a directory (roughly 31,000 files in 560 sub
> folders), the "Filling Database Table" step takes an extremely long time to
> complete (about an hour or so).
>  
> I've been looking around for a way to speed this up.  I found a post on here
> that referred to an article which basically said PostgreSQL was the way to go
> for speed
> (http://wiki.bacula.org/doku.php?id=faq#restore_takes_a_long_time_to_retrieve_sql_results_from_mysql_catalog).
>   So I converted from MySQL to PostgreSQL using the conversion procedure in 
> the Bacula documentation.  We are now on PostgreSQL, but the speed seems just 
> as slow (if not slower).  Is there anything else that can be done to speed 
> this process up?
>  
> I've also tried running the DB under MySQL with MyISAM and InnoDB tables. 
>  Both had the same slow performance here.  With MySQL, I also tried using the 
> my-large.cnf and my-huge.cnf files.  Neither helped.
>  
> Server load is very low during this process (0.06).  The BAT process is at
> about 3% CPU and 1.6% memory; the Postgres service is at about 1% CPU and
> 0.6% memory.  The drive array is pretty quiet also.
>  
> Any help would be greatly appreciated.  If any extra info is needed, I will 
> gladly provide it.


Hi,

What OS are you running on, and did you build Bacula from the tarball? I had
a similar problem on Solaris 10 with the stock Postgres 8.3: Bacula's
'configure' didn't detect that Postgres was thread safe, so it omitted
"--enable-batch-insert".

Without batch-insert, a full backup of my biggest fileset took roughly
24 hours. The backup of the data itself was (and still is) only 4 to 5
hours; the rest was despooling attributes into the database (I only
noticed this when I enabled attribute spooling).
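
Attribute spooling itself is just one directive in the Job (or JobDefs)
resource of bacula-dir.conf, e.g. (the resource name is only an example):

  Job {
    Name = "big-fileset-job"
    # ... existing directives unchanged ...
    Spool Attributes = yes
  }

With it enabled you can see the attribute despooling in the job log as a
step of its own, which is how I noticed where the time was actually going.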

With batch-insert (I had to hack around in the 'configure' script a
little), the time for attribute despooling shrank to maybe 20
_minutes_. It helps *a lot*.
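
My rough understanding of why it makes such a difference: without batch
insert the catalog gets one INSERT per file, while the batch code bulk-loads
all attributes into a temporary table and merges them with a handful of
statements. As an illustration only (not Bacula's literal SQL), the
difference looks something like:

  -- non-batch: one statement (and one round trip) per file
  INSERT INTO File (FileIndex, JobId, PathId, FilenameId, LStat, MD5)
    VALUES (1, 1234, 1, 1, 'lstat-blob', 'digest');
  -- ...repeated tens of thousands of times...

  -- batch: load everything at once, then merge
  CREATE TEMPORARY TABLE batch (FileIndex int, JobId int, Path text,
                                Name text, LStat text, MD5 text);
  COPY batch FROM STDIN;  -- all file attributes in a single stream
  INSERT INTO File (FileIndex, JobId, PathId, FilenameId, LStat, MD5)
    SELECT b.FileIndex, b.JobId, p.PathId, f.FilenameId, b.LStat, b.MD5
      FROM batch b
      JOIN Path p ON p.Path = b.Path
      JOIN Filename f ON f.Name = b.Name;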


Regards,
Christian Manal
