Hello.
In Bacula 2.x I used a script of my own that ran in the Director before every
job. The script would query the Director to see whether the job was already
running (or still waiting for something), and if so, it would return 1 and
the job would be cancelled.
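For what it's worth, my check boils down to something like the sketch below (a minimal RunBeforeJob-style script; the helper name and the exact status parsing are illustrative, and it assumes bconsole is on PATH and configured to reach the Director):

```shell
#!/bin/sh
# Illustrative sketch of a "cancel duplicates" pre-job check.
# The job name is expected as the first argument.

# Succeed (exit 0) if the given job name appears in Director status
# text read from stdin.
job_listed() {
    grep -q "$1"
}

JOB="${1:-}"
if [ -n "$JOB" ] && echo "status dir" | bconsole 2>/dev/null | job_listed "$JOB"; then
    # Another instance is already running or waiting: return non-zero
    # so the Director treats the pre-job script as failed and the job
    # is cancelled.
    exit 1
fi
```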
I hoped "Allow Duplicate Jobs = no" would handle this in 3.0 (I'm running it
on FreeBSD), but the directive doesn't seem to change anything, at least in
my case:
1) I ran a job with Level=Full,
2) then started another instance of the same job with Level=Incremental.
3) The 2nd job's level was upgraded to Full ("No prior or suitable Full
backup found in catalog"), and then
4) the 2nd job started waiting for the storage (it uses the same device as
the 1st job, and a device can be used by only one job at a time, even when
the device is an HDD).
5) After the 1st job finished, the 2nd one started running.
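For reference, this is the kind of Job resource I mean, with the 3.0 duplicate-control directives (names as documented for the 3.0 Director Job resource; the values and job name here are illustrative, not my exact config):

```
Job {
  Name = "backup-home"
  # ... usual Job directives (Client, FileSet, Storage, Pool, ...)
  Allow Duplicate Jobs = no
  # Related 3.0 directives that interact with duplicate control:
  Cancel Queued Duplicates = yes
  Cancel Running Duplicates = no
}
```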
This is exactly the scenario I wrote my "cancel duplicate jobs" script for.
I don't understand the point of the new duplicate job control (I first
thought it was meant to solve the problem described above). Or does it
simply not work?
Is anyone having success with it?
--
Silver
_______________________________________________
Bacula-users mailing list
Bacula-users AT lists.sourceforge DOT net
https://lists.sourceforge.net/lists/listinfo/bacula-users