[Bacula-users] Tuning for large (millions of files) backups?

Subject: [Bacula-users] Tuning for large (millions of files) backups?
From: Mingus Dew <shon.stephens AT gmail DOT com>
To: bacula-users <bacula-users AT lists.sourceforge DOT net>
Date: Thu, 7 Oct 2010 17:03:41 -0400
All,
     I am running Bacula 5.0.1 on Solaris 10 x86, with MySQL 4.1.22 as the database server. I do plan to upgrade to a compatible MySQL 5 release, but migrating to PostgreSQL isn't an option at this time.

     I am trying to back up to tape a very large number of files for a client. While the data size is manageable at around 2TB, the number of files is incredibly large.
The first of the jobs had 27 million files and initially failed because the batch table became "Full". I changed myisam_data_pointer_size to a value of 6 in the MySQL config.
The job was then able to run successfully and did not take too long.
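For reference, here is roughly how that change looks in the MySQL configuration. This is a sketch, not my exact file; the setting raises the default MyISAM pointer size so tables (including the batch table Bacula builds during attribute insertion) can grow past the ~4GB default limit:

```ini
# /etc/my.cnf -- illustrative fragment, assuming a standard my.cnf layout
[mysqld]
# 6-byte row pointers allow MyISAM tables to address far more data
# than the 4-byte default; applies to tables created after the change
myisam_data_pointer_size = 6
```

The variable can also be set at runtime with `SET GLOBAL myisam_data_pointer_size = 6;`, though only tables created afterwards pick it up.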

    I have another job which has 42 million files. I'm not sure how many rows that equates to, but I can say that I've not been
able to run the job successfully: it seems to hang for over 30 hours in a "Dir inserting attributes" status. This causes other jobs to stack up in the queue, and
once the job is canceled I have to restart Bacula.

    I'm looking for ways to boost the performance of MySQL or Bacula (or both) to get this job completed.
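In case it helps anyone replying: the knobs I've seen suggested for insert-heavy MyISAM workloads like Bacula's attribute spooling look something like the fragment below. The values are illustrative guesses to be sized against available RAM, not a tested recipe. It may also be worth confirming the Director was built with batch-insert support (the `--enable-batch-insert` configure option), since without it attributes are inserted row by row:

```ini
# my.cnf -- hedged tuning sketch for MyISAM bulk inserts; values are examples
[mysqld]
key_buffer_size = 256M         # cache for MyISAM index blocks
sort_buffer_size = 8M          # per-session buffer used when rebuilding indexes
bulk_insert_buffer_size = 64M  # speeds up multi-row INSERTs into MyISAM tables
```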

Thanks,
Shon
_______________________________________________
Bacula-users mailing list
Bacula-users AT lists.sourceforge DOT net
https://lists.sourceforge.net/lists/listinfo/bacula-users