Subject: [Bacula-users] Problems doing concurrent jobs, and having lousy performance
From: Boudewijn Ector <boudewijn AT boudewijnector DOT nl>
To: bacula-users AT lists.sourceforge DOT net
Date: Sun, 25 Sep 2011 00:32:57 +0200
Hi guys,


For some time I've been trying to get concurrent jobs in Bacula to work.
To do so, I've created a pool for each client and made sure every part of
the setup has Maximum Concurrent Jobs set to a value greater than 1.
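To be concrete, by "every part" I mean the Director resource in 
bacula-dir.conf as well as the director-side Storage and Client 
resources, the SD and the FDs. Roughly like this (a sketch, not a 
verbatim copy of my files; the value 20 is just an example):

Director {                            # in bacula-dir.conf
   Name = leiden-dir
   # QueryFile, working/pid directories, password etc. left out here
   Maximum Concurrent Jobs = 20
}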

Please allow me to elaborate on my configuration:

This is part of my bacula-dir.conf (well, it's the file for the client 
'www', which is included in bacula-dir.conf along with files that are 
identical except for passwords/hostnames):

JobDefs {
   Name = "www-weekly"
   Type = Backup
   Level = Incremental
   Client = www
   FileSet = "Full Set"
   Schedule = "WeeklyCycle"
   Storage = leiden-filestorage
   Messages = Standard
   Pool = wwwPool
   Priority = 10
}



Job {
   Name = "wwwjob"
   JobDefs = "www-weekly"
   Write Bootstrap = "/var/lib/bacula/www.bsr"
}

Client {
   Name = www
   Address = www.KNIP
   FDPort = 9102
   Catalog = MyCatalog
   Password = "KNIP"          # password for FileDaemon
   File Retention = 30 days            # 30 days
   Job Retention = 6 months            # six months
   AutoPrune = yes                     # Prune expired Jobs/Files
}


Pool {
   Name = wwwPool
   LabelFormat = "wwwVol"
   Pool Type = Backup
   Recycle = yes                       # Bacula can automatically recycle Volumes
   AutoPrune = yes                     # Prune expired volumes
   Volume Retention = 365 days         # one year
   Volume Use Duration = 23h
}



As you can see, I've removed some sensitive information. A clone of this 
config is also used for 'mail' and a few more machines. Each of them has 
its own pool (because of concurrency).
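For example, the pool for 'mail' is literally the same thing with the 
names swapped, something like:

Pool {
   Name = mailPool
   LabelFormat = "mailVol"
   Pool Type = Backup
   Recycle = yes
   AutoPrune = yes
   Volume Retention = 365 days
   Volume Use Duration = 23h
}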


Now for the bacula-sd.conf:

Storage {                             # definition of myself
   Name = leiden-filestorage
   WorkingDirectory = "/var/lib/bacula"
   Pid Directory = "/var/run/bacula"
   Maximum Concurrent Jobs = 50
   SDAddresses = {
         ip = { addr = 192.168.1.44; port = 9103 }
         ip = { addr = 127.0.0.1; port = 9103 }
   }
}
Director {
   Name = leiden-dir
   Password = "*"
}
Director {
   Name = leiden-mon
   Password = "*"
   Monitor = yes
}
Device {
   Name = leiden-filestorage
   Media Type = File
   Archive Device = /bacula
   LabelMedia = yes;                   # lets Bacula label unlabeled media
   Random Access = Yes;
   AutomaticMount = yes;               # when device opened, read it
   RemovableMedia = no;
}

Messages {
   Name = Standard
   director = leiden-dir = all
}





Pretty standard; should I change something in here?
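One thing I did wonder about: if a single File device can only mount one 
Volume at a time, then with one pool per client the jobs would presumably 
queue on that device anyway. If that is the case, I assume I'd have to 
add extra Device resources along these lines (the name is made up, the 
directory is the same one):

Device {
   Name = leiden-filestorage-2
   Media Type = File
   Archive Device = /bacula
   LabelMedia = yes;                   # lets Bacula label unlabeled media
   Random Access = Yes;
   AutomaticMount = yes;               # when device opened, read it
   RemovableMedia = no;
}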



And my bacula-fd.conf:

Director {
   Name = leiden-dir
   Password = "*"
}

Director {
   Name = www.*-mon
   Password = "*"
   Monitor = yes
}

FileDaemon {                          # this is me
   Name = www.*-fd
   FDport = 9102                  # where we listen for the director
   WorkingDirectory = /var/lib/bacula
   Pid Directory = /var/run/bacula
   HeartBeat Interval = 15
   Maximum Concurrent Jobs = 20
   FDAddress = *
}

Messages {
   Name = Standard
   director = www.*-dir = all, !skipped, !restored
}
Also quite boring.




Can someone please explain to me why Bacula is still not able to run 
concurrent jobs? Do I have to create a storage for each client, for 
instance? And if so, why is that necessary?
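If the answer is that I do need one storage per client, I assume each 
extra director-side Storage would look roughly like this, pointing at an 
extra Device in the SD (name made up, address/password knipped as 
before), with each client's JobDefs then referencing its own Storage:

Storage {
   Name = leiden-filestorage-2
   Address = KNIP
   SDPort = 9103
   Password = "*"
   Device = leiden-filestorage-2       # must match a Device name in bacula-sd.conf
   Media Type = File
   Maximum Concurrent Jobs = 20
}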


Furthermore, I've enabled compression on some clients, but the system's 
performance still isn't very good. Throughput hovers around 1800 kB/s, 
even though both ends of the link are 100 Mbit and are barely utilised.
The director and SD run on the same machine, which writes to a NAS (which 
performs fine on its own); it's a dual-core Atom box with 2 GB of RAM 
running Debian. It runs nothing else apart from Nagios (which is not 
heavily loaded).
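For reference, the compression I enabled is just the usual FileSet 
option, roughly like this (the real include paths are knipped; the GZIP 
work is done on the client FD, not on the SD):

FileSet {
   Name = "Full Set"
   Include {
     Options {
       signature = MD5
       compression = GZIP              # software compression on the client side
     }
     File = /KNIP
   }
}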


Cheers,

Boudewijn Ector

_______________________________________________
Bacula-users mailing list
Bacula-users AT lists.sourceforge DOT net
https://lists.sourceforge.net/lists/listinfo/bacula-users
