[Bacula-users] Maximum Concurrent Jobs
From: Dirk Bartley <bartleyd2 AT chartermi DOT net>
To: bacula-users <bacula-users AT lists.sourceforge DOT net>
Date: Fri, 19 Jun 2009 18:29:15 -0400
Greetings

I'm trying to understand Maximum Concurrent Jobs.

Here is an example output from status dir



Running Jobs:
Console connected at 19-Jun-09 17:51
 JobId Level   Name                       Status
======================================================================
  9334 Full    Alum3JobMain.2009-06-19_17.01.00_47 is running
  9336 Full    Alum2RootJob.2009-06-19_17.01.00_49 is waiting on Storage Alum2FileStorage
  9337 Full    Alum1Job.2009-06-19_17.01.00_50 is waiting on Storage Alum2FileStorage
  9338 Full    Alum3root.2009-06-19_17.01.00_51 is waiting on Storage Alum2FileStorage
  9339 Full    Alum3sharedlvm.2009-06-19_17.01.00_52 is waiting on Storage Alum2FileStorage
  9340 Full    DatabaseJob.2009-06-19_17.01.00_53 is waiting execution
  9341 Full    SubversionJob.2009-06-19_17.01.00_54 is waiting for higher priority jobs to finish
====


These jobs are all being written to file storage.  Then, when they are all done, the low-priority jobs will trigger a run-after script that starts copying them all to tape, but that happens later.

So it looks as though only one job is running because some Maximum Concurrent Jobs directive relating to storage is limiting the number of jobs running concurrently.

Reading the storage daemon's configuration documentation, there is only one such directive, and it is on the storage daemon itself, not on the Device resource.  It is set to 50.
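For reference, here is roughly what that looks like in my bacula-sd.conf; this is just a sketch (the resource name and paths here are placeholders, not my actual values) to show where the directive sits:

```conf
# bacula-sd.conf -- Maximum Concurrent Jobs is set on the Storage
# resource of the storage daemon, not on a Device resource.
# Names and paths below are illustrative only.
Storage {
  Name = example-sd
  SDPort = 9103
  WorkingDirectory = /var/bacula/working
  Maximum Concurrent Jobs = 50
}
```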

This storage daemon has two tape devices and one FileStorage device.

I can see from the director's configuration that there is a Maximum Concurrent Jobs directive for just about every resource.  The Director resource is set to 5.  All jobs are set to 5, with the exception of the copy jobs.  All Client resources are set to 10.  Each Storage resource in the director's config is set to a maximum of 10.
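To make that concrete, here is a sketch of the Director-side limits I described (only the directives relevant to concurrency; the Client and Job names are made up, and my understanding is that the effective limit for a job is the smallest of the limits it passes through):

```conf
# bacula-dir.conf (sketch; Client/Job names are hypothetical)
Director {
  Name = example-dir
  Maximum Concurrent Jobs = 5      # director-wide cap
}
Client {
  Name = alum3-fd
  Maximum Concurrent Jobs = 10
}
Storage {
  Name = Alum2FileStorage
  Maximum Concurrent Jobs = 10     # per Storage resource in the Director
}
Job {
  Name = Alum3JobMain
  Maximum Concurrent Jobs = 5
}
```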

So I'm not understanding why I can't have more jobs running at the same time.  The Pool resource for the File-xxx pool sets Maximum Volume Jobs = 1 so that each job has its own volume (or file on the hard drive) that it is writing to.
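The Pool directive I mean is the one below; again just a sketch, keeping the File-xxx placeholder from above:

```conf
# bacula-dir.conf (sketch)
Pool {
  Name = File-xxx            # placeholder pool name
  Pool Type = Backup
  Maximum Volume Jobs = 1    # each volume (file on disk) accepts one job
}
```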

Because I am writing to file first, I am not setting up spooling.

Any assistance in helping me understand this is appreciated.  I guess I could be happy that everything is working; it's just that when someone asks me why I can't get more jobs running concurrently, I want to be able to answer.  Because I know one person will ask.

Dirk


