Subject: Re: [Bacula-users] Filling Database Table - very slow
From: "Jarrod Holder" <jholder AT catoosa.k12.ga DOT us>
To: <bacula-users AT lists.sourceforge DOT net>
Date: Thu, 13 Oct 2011 09:35:52 -0400
Thanks for the info guys.  I decided to start over from scratch with the server (running Suse Server 11 BTW).
 
Batch insert was NOT on before (it is now), and I used the InnoDB settings below.  The restore process is 1000 times faster now.  THANK YOU!
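For anyone else landing on this thread: batch insert is a compile-time option in Bacula, so it has to be enabled when the Director is built, not just toggled in a config file. A sketch of the relevant build steps (the source directory name is an assumption; confirm the available flags with ./configure --help in your own source tree):

```shell
# Rebuild the Director with batch-insert support (build-time option).
# Directory name below is illustrative; adjust to your source tree.
cd bacula-5.0.3
./configure \
    --with-mysql \
    --enable-batch-insert
make && make install
```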
 
Version browser is still a bit slow when picking an entire directory, but I can't really see using that to restore an entire structure anyway, just a few files.
 
So again, thanks for your help!  :)
 


>>> Brian Debelius <bdebelius AT intelesyscorp DOT com> 10/11/2011 11:09 AM >>>
Hi,

I have a 5GB database.  The server has 6GB RAM.  These are the settings I am using right now.

default-storage-engine=innodb
default-table-type=innodb
query_cache_limit=16M
query_cache_size=256M
innodb_log_file_size=384M
innodb_buffer_pool_size=3G
innodb_log_buffer_size=2M
innodb_flush_log_at_trx_commit=2
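The 3G buffer pool on a 6GB box follows the usual sizing rule of thumb. A tiny sketch of that arithmetic (the 50% factor and the cap at the database's working-set size are common guidelines, not something stated in this thread):

```python
# Rule-of-thumb InnoDB buffer pool sizing (illustrative only).
ram_gb = 6   # total RAM on the database host
db_gb = 5    # on-disk size of the catalog database

# Common guidance: ~50% of RAM on a shared host, capped near the
# database size so memory isn't reserved for pages that don't exist.
pool_gb = min(db_gb, int(ram_gb * 0.5))
print(pool_gb)
```

With these numbers the result is 3, matching Brian's innodb_buffer_pool_size=3G above.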

Your mileage may vary,
Brian-


On 10/11/2011 8:04 AM, Jarrod Holder wrote:
Bacula version 5.0.3
 
In BAT, when trying to restore a directory (roughly 31,000 files in 560 sub-folders), the "Filling Database Table" step takes an extremely long time to complete (about an hour or so).
 
I've been looking around for a way to speed this up and found a post here referring to an article that basically said PostgreSQL was the way to go as far as speed (http://wiki.bacula.org/doku.php?id=faq#restore_takes_a_long_time_to_retrieve_sql_results_from_mysql_catalog).  So I converted from MySQL to PostgreSQL using the conversion procedure in the Bacula documentation.  We are now on PostgreSQL, but the speed seems just as slow (if not slower).  Is there anything else that can be done to speed this process up?
 
I've also tried running the DB under MySQL with both MyISAM and InnoDB tables; both had the same slow performance.  With MySQL, I also tried the my-large.cnf and my-huge.cnf sample files.  Neither helped.
 
Server load is very low during this process (0.06).  BAT process is at about 3% cpu and 1.6% memory.  Postgres service is about 1%cpu, 0.6% memory.  Drive array is pretty quiet also.
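The low CPU and quiet disk array reported here are consistent with a latency-bound workload rather than a throughput-bound one: without batch insert, every file record can cost a separate transaction commit, each waiting on a log flush. Back-of-the-envelope arithmetic (the 10 ms per-flush cost and 1,000-row batch size are assumed numbers, purely illustrative):

```python
# Illustrative arithmetic: one commit per row vs. batched commits.
files = 31_000        # file records for the restore tree
flush_s = 0.010       # assumed per-commit log flush (~10 ms disk sync)
batch_rows = 1_000    # assumed rows per batched transaction

one_row_per_commit = files * flush_s
batched_commits = (files / batch_rows) * flush_s

print(f"per-row commits: ~{one_row_per_commit:.0f} s waiting on flushes")
print(f"batched commits: ~{batched_commits:.2f} s waiting on flushes")
```

Under these assumed numbers the per-row case spends minutes just waiting on log syncs while CPU and disk throughput stay near idle, which is why batching (and relaxing innodb_flush_log_at_trx_commit, as in Brian's settings) helps so dramatically.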
 
Any help would be greatly appreciated.  If any extra info is needed, I will gladly provide it.


------------------------------------------------------------------------------
All the data continuously generated in your IT infrastructure contains a
definitive record of customers, application performance, security
threats, fraudulent activity and more. Splunk takes this data and makes
sense of it. Business sense. IT sense. Common sense.
http://p.sf.net/sfu/splunk-d2d-oct


_______________________________________________
Bacula-users mailing list
Bacula-users AT lists.sourceforge DOT net
https://lists.sourceforge.net/lists/listinfo/bacula-users
