On Mon, Apr 02, 2012 at 08:20:17PM +0200, Dennis Hoppe wrote:
> Hello,
>
> is it possible to use chained copy jobs? For example, I would like to
> copy my full backups from local disk to a USB disk and after that to a
> NAS storage.
>
Hi Dennis,
I see your copy job definitions below are lacking a "Selection Type =
SQLQuery" directive; might this be part of the problem?
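Untested, and just adapted from your own config below, but I would
expect each copy job to carry both directives, something like:

  Job {
    Name = "backup-copy-monthly-usb"
    JobDefs = "DefaultCopy"
    ...
    Selection Type = SQLQuery
    Selection Pattern = "SELECT max(jobid) FROM job ..."
  }

Without the Selection Type set to SQLQuery, the director may never
consult your Selection Pattern at all, which could explain why it keeps
picking the jobid from "backup-all" instead.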
All the best,
Uwe
> Job {
> Name = "backup-all"
> JobDefs = "DefaultBackup"
> Client = backup-fd
> FileSet = "backup-all"
> Storage = backup
> Full Backup Pool = backup-monthly
> Incremental Backup Pool = backup-daily
> Differential Backup Pool = backup-weekly
> }
>
> Job {
> Name = "backup-copy-monthly-usb"
> JobDefs = "DefaultCopy"
> Client = backup-fd
> FileSet = "backup-all"
> Schedule = "MonthlyCopy"
> Storage = backup
> Pool = backup-monthly
> Selection Pattern = "
> SELECT max(jobid)
> FROM job
> WHERE name = 'backup-all'
> AND type = 'B'
> AND level = 'F'
> AND jobstatus = 'T';"
> }
>
> Job {
> Name = "backup-copy-monthly-nas"
> JobDefs = "DefaultCopy"
> Client = backup-fd
> FileSet = "backup-all"
> Schedule = "MonthlyCopy2"
> Storage = backup
> Pool = backup-monthly
> Selection Pattern = "
> SELECT max(jobid)
> FROM job
> WHERE name = 'backup-copy-monthly-usb'
> AND type = 'c'
> AND level = 'F'
> AND jobstatus = 'T';"
> }
>
> Pool {
> Name = backup-monthly
> Pool Type = Backup
> Recycle = yes
> RecyclePool = scratch
> AutoPrune = yes
> ActionOnPurge = Truncate
> Volume Retention = 2 months
> Volume Use Duration = 23 hours
> LabelFormat = "backup-full_${Year}-${Month:p/2/0/r}-${Day:p/2/0/r}-${Hour:p/2/0/r}:${Minute:p/2/0/r}"
> Next Pool = backup-monthly-usb
> }
>
> Pool {
> Name = backup-monthly-usb
> Storage = backup-usb
> Pool Type = Backup
> Recycle = yes
> RecyclePool = scratch
> AutoPrune = yes
> ActionOnPurge = Truncate
> Volume Retention = 2 months
> Volume Use Duration = 23 hours
> LabelFormat = "backup-full_${Year}-${Month:p/2/0/r}-${Day:p/2/0/r}-${Hour:p/2/0/r}:${Minute:p/2/0/r}"
> Next Pool = backup-monthly-nas
> }
>
> Pool {
> Name = backup-daily-nas
> Storage = backup-nas
> Pool Type = Backup
> Recycle = yes
> RecyclePool = scratch
> AutoPrune = yes
> ActionOnPurge = Truncate
> Volume Retention = 7 days
> Volume Use Duration = 23 hours
> LabelFormat = "backup-incr_${Year}-${Month:p/2/0/r}-${Day:p/2/0/r}-${Hour:p/2/0/r}:${Minute:p/2/0/r}"
> }
>
> If I run the SQL statement from "backup-copy-monthly-nas" by hand, the
> correct jobid is selected, which should get the "read storage", "write
> storage" and "next pool" from the job "backup-copy-monthly-usb".
> Unfortunately, Bacula ignores the SQL statement and gets the jobid
> from "backup-all", which ends in a duplicate copy at the storage
> "backup-usb". :(
>
> Did I do something wrong, or is Bacula not able to use two different
> "next pools" / "storages"?
>
> Regards, Dennis
>
> _______________________________________________
> Bacula-users mailing list
> Bacula-users AT lists.sourceforge DOT net
> https://lists.sourceforge.net/lists/listinfo/bacula-users
--
uwe.schuerkamp AT nionex DOT net fon: [+49] 5242.91 - 4740, fax:-69 72
Hauptsitz: Avenwedder Str. 55, D-33311 Gütersloh, Germany
Registergericht Gütersloh HRB 4196, Geschäftsführer: H. Gosewehr, D. Suda
NIONEX --- Ein Unternehmen der Bertelsmann AG