Subject: Re: [Bacula-users] seeking advice re. splitting up large backups -- dynamic filesets to prevent duplicate jobs and reduce backup time
From: Martin Simmons <martin AT lispworks DOT com>
To: bacula-users AT lists.sourceforge DOT net
Date: Thu, 13 Oct 2011 20:24:21 +0100
>>>>> On Wed, 12 Oct 2011 21:58:28 -0400, mark bergman said:
> 
> => If you limited the maximum jobs on the FD, it would only run one job at a time,
> 
> That doesn't work, as we back up ~20 small machines in addition to the large (4
> to 8 TB) filesystems.

Assuming you mean ~20 separate client machines (File Daemons), you can set
Maximum Concurrent Jobs on the large client's Client resource in the director's
config.  In fact, the default there is 1, so it is surprising that you get any
concurrency unless you've already increased it.
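
For illustration, a minimal sketch of what that Client resource might look
like in bacula-dir.conf (the client name, address and password below are
placeholders, not from your setup):

  Client {
    Name = bigfs-fd                    # hypothetical name for the large client
    Address = bigfs.example.com
    FDPort = 9102
    Catalog = MyCatalog
    Password = "notreal"
    Maximum Concurrent Jobs = 1        # default; director runs only one job on this FD at a time
  }

With that in place the director queues any further jobs for that client
instead of running them concurrently, while the small clients can still run
in parallel.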

__Martin
