Hi Ana,
Thanks again for the help!
Yes, the database is large. The File table has 370+ million records, and pruning is enabled in Bacula. I'll see what we can do with mysqltuner. I have a feeling I'm occasionally getting the timeout error because there are so many records. If I can't find a cure for the situation, I'm thinking of splitting our backups across two catalog servers -- one for production and one for test/dev. Do you think this is wise? Currently, our DBAs don't support Postgres, so that may not be an option. Would Oracle be an option instead?
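For context, here's the sort of thing I was planning to check and adjust in my.cnf once mysqltuner has run. The sizes are rough guesses for our box and assume the catalog tables are InnoDB -- nothing tested yet:

    # my.cnf sketch -- values are guesses, not tested; assumes InnoDB catalog tables
    [mysqld]
    innodb_buffer_pool_size = 8G    # let InnoDB cache more of the huge File table and its indexes
    innodb_log_file_size    = 512M  # larger redo log for big pruning/insert transactions
    max_allowed_packet      = 64M   # headroom for large inserts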
I'll have to check on the "--enable-batch-insert" option. Is there a Bacula command to see what options Bacula was built with? I did not set up our installation and am new to Bacula.
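If there's no bconsole command for it, I suppose I could dig through the original source/build tree, assuming it's still on the server, since configure records its command line. Something like (path is just a placeholder):

    # Only works if the original build tree still exists
    grep -i 'batch-insert' /path/to/bacula-source/config.log
    grep -i 'enable-batch' /path/to/bacula-source/config.status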
Would you have any idea why a new duplicate job starts when one is rescheduled? I looked over the configs I know about, but could not find an option that would start a new job in that situation. At least the job will have two more chances to run if I can prevent the new duplicate job from starting when the lock is detected. I know that won't cure the database problem, but I have a feeling the rescheduled tries might complete successfully (I suspect the lock will be gone by then) until the DB can be tuned.
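From the little I've read so far, something like this in the Job resource might be what I'm after -- the job name is made up, and I still need to verify these directives exist in our Bacula version:

    # bacula-dir.conf sketch -- untested; "Prod-Backup" is a hypothetical job name
    Job {
      Name = "Prod-Backup"
      # (existing FileSet/Schedule/Storage/Pool lines here)
      Allow Duplicate Jobs = no       # don't let a second copy of the same job start
      Cancel Queued Duplicates = yes  # drop the queued duplicate rather than the running job
      Reschedule On Error = yes       # keep the retry behavior
      Reschedule Interval = 1 hour
      Reschedule Times = 2            # the two extra chances mentioned above
    }

Does that look like the right direction, or am I misreading what those directives do?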
Warmest regards,
-craig