Bacula-users

Re: [Bacula-users] Backup to remote location

2010-06-28 21:35:01
Subject: Re: [Bacula-users] Backup to remote location
From: Daniel beas <beasdaniel AT hotmail DOT com>
To: <massimiliano AT perantoni DOT net>, <bacula-users AT lists.sourceforge DOT net>
Date: Tue, 29 Jun 2010 01:27:55 +0000
Yes it is; I'm just testing this config, but you can add as many devices 
to your bacula-sd as you want.

I don't know if it is a best practice; maybe someone else can help us,
considering we do not back up to tape.
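As a sketch of what "as many devices as you want" looks like in bacula-sd.conf, here are two file-based Device resources (names and paths hypothetical; each device gets its own Media Type so its volumes are not interchangeable):

```
Device {
  Name = FileLocal
  Media Type = FileLocal            # distinct media type per device
  Archive Device = /backup/local    # hypothetical path
  Label Media = yes
  Random Access = yes
  Automatic Mount = yes
  Removable Media = no
  Always Open = no
}

Device {
  Name = FileRemoteStage
  Media Type = FileRemote
  Archive Device = /backup/remote-stage   # hypothetical path
  Label Media = yes
  Random Access = yes
  Automatic Mount = yes
  Removable Media = no
  Always Open = no
}
```

Each Device also needs a matching Storage resource in bacula-dir.conf that references its Media Type.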

Daniel Beas Enriquez




Well, thinking about it more, that may only solve the naming of the
volumes, not the location, as the Storage daemon would always be the
same, unless I can also choose a different storage daemon for full and
incremental backups...

Do you know if it's possible?
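It may be possible: the Run directive in a Schedule resource accepts per-run job overrides (Level, Pool, Storage, among others), so, if I read the manual right, something like this could send fulls and incrementals to different storage daemons (all resource names hypothetical):

```
Schedule {
  Name = SplitStorage
  # Full backups use the local storage daemon
  Run = Level=Full Storage=LocalSD Pool=FullLocal 1st sun at 23:05
  # Incrementals go straight to the remote storage daemon
  Run = Level=Incremental Storage=RemoteSD Pool=IncrRemote mon-sat at 23:05
}
```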

2010/6/27 Massimiliano Perantoni <massimiliano AT perantoni DOT net>:
> Got it!
> There's a sample config on the website that, if it works, lets me
> differentiate the pools by the kind of job:
>        Full Backup Pool = FullTest
>        Incremental Backup Pool = AllIncrementals
>        Differential Backup Pool = AllDifferentials
>
> If it works I can define a local pool for the full backup, which I
> can then copy elsewhere by hand, and put the other levels in separate
> pools that are also local.
> That should solve the problem: I would back up the full backup's
> files locally, move them to the other site for disaster recovery,
> and then move the rest over the wire every day.
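For what it's worth, Bacula 3.x and later can automate the "copy elsewhere by hand" step with a Copy job, which duplicates finished jobs from a source pool into the pool named by its Next Pool directive (which can live on a different storage daemon). A rough sketch, with all resource names hypothetical:

```
Pool {
  Name = FullTest
  Pool Type = Backup
  Storage = LocalSD          # hypothetical local storage resource
  Next Pool = FullOffsite    # copies are written here
}

Pool {
  Name = FullOffsite
  Pool Type = Backup
  Storage = RemoteSD         # hypothetical storage daemon at site B
}

# Copies every job in FullTest that has not been copied yet
Job {
  Name = CopyFullsToB
  Type = Copy
  Level = Full
  Client = siteA-fd          # required by the parser, ignored for selection
  FileSet = AllData          # likewise required but not used
  Messages = Standard
  Pool = FullTest            # source pool
  Selection Type = PoolUncopiedJobs
}
```

The Copy job can be scheduled, or run manually after each full completes.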
>
> Ciao Massimiliano
>
> 2010/6/27 Daniel beas <beasdaniel AT hotmail DOT com>:
>> This works for me:
>> define a Job without specifying a level
>>
>> Job {
>>    Name = diseno
>>    Type = Backup
>>    WriteBootstrap = /home/whatever.bsr
>>    Client = cdiseno
>>    FileSet = dis
>>    Pool = pdis
>>    Schedule = Weeklydis
>>    Storage = diseno
>>    RunBeforeJob = /home/whatever/creavolsdis
>> }
>> # And then define the Schedule this way
>>
>> Schedule {
>>    Name = Weeklydis
>>    Run = Full sat at 16:00
>>    Run = Incremental mon-fri at 12:00
>> }
>>
>> I'm not sure about the exact syntax right now, but this backup is
>> made on site; the script creates a volume every day, and the
>> incremental backups run during the week (about 1.5 GB mon-fri and
>> 36 GB on the weekend).
>> This works because it is the same Job with two Run directives in a
>> single Schedule resource.
>>
>> I hope it is what you are looking for
>>
>> Daniel Beas Enriquez
>>
>>
>>
>>
>>> Date: Sat, 26 Jun 2010 23:45:15 +0200
>>> Subject: Re: [Bacula-users] Backup to remote location
>>> From: massimiliano AT perantoni DOT net
>>> To: beasdaniel AT hotmail DOT com
>>>
>>> Hi, thanks to all for the quick answers! Reading them, I realize
>>> that maybe I did not explain the situation well.
>>>
>>> My actual problem is that, for disaster recovery, the limitation is
>>> bandwidth, not time. While I would have to recover from a complete
>>> failure by re-buying or (in a better scenario) reinstalling all the
>>> servers, it is acceptable for me to bring all the data back from a
>>> remote location and start working again once the infrastructure is
>>> back in place: re-buying minimal systems would take weeks, while
>>> copying the data back would cost a day. The only issue is
>>> transferring the data from A to B. On B, in fact, we have plenty of
>>> space to house the full and the incremental backups (currently the
>>> data is 4 TB and growing, while the storage area is about 12 TB).
>>>
>>> As of now the only problem is the unacceptable time to copy the
>>> full backup from A to B. A short calculation: assuming 10 Mbps (the
>>> full bandwidth between A and B), copying 4 TB of data would take
>>> approximately
>>> 4 * 1024 * 1024 * 8 / (10 * 3600) = 932 hours!!!!
>>> to transfer the full backup. That means completing the full backup
>>> job would take something like 39 days... and it would make the
>>> internet line unusable...
>>>
>>> The solution would be to make a full backup on site at A, send it
>>> to site B for storage, and then work with incrementals. My schedule
>>> would be:
>>>
>>> 1. Full once per year (or every 6 months), done locally, then moved somehow to site B
>>> 2. Differential every week, remotely
>>> 3. Incremental daily, remotely
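That plan could be expressed as a single Job with per-level pool overrides and one Schedule. A sketch, with all names hypothetical (the twice-a-year full uses month keywords, which should be double-checked against the schedule date syntax):

```
Job {
  Name = SiteA-DR
  Type = Backup
  Client = siteA-fd
  FileSet = AllData
  Storage = RemoteSD
  Schedule = DRCycle
  Pool = IncrRemote                     # default pool
  Full Backup Pool = FullLocal          # written locally, moved to B by hand
  Differential Backup Pool = DiffRemote
  Incremental Backup Pool = IncrRemote
  Messages = Standard
}

Schedule {
  Name = DRCycle
  Run = Level=Full 1st sun jan at 23:05
  Run = Level=Full 1st sun jul at 23:05
  Run = Level=Differential 2nd-5th sun at 23:05
  Run = Level=Incremental mon-sat at 23:05
}
```

Because it is one Job name, the incrementals and differentials stay anchored to the locally made full.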
>>> The only problem is that when I create a job, it has its own name,
>>> SD, and FD; when I change them, Bacula treats it as a different
>>> backup...
>>>
>>> Doug Forster has posted some schedules that look interesting.
>>> My current default schedule has the job run every day: an
>>> incremental daily, a differential weekly, and a full monthly; if an
>>> incremental does not find a full, it is upgraded to a full. Wouldn't
>>> having two schedules mean having two different jobs, which would not
>>> solve my problem?
>>>
>>> Ciao Massimiliano
>>>
>>>
>>> 2010/6/26 Daniel beas <beasdaniel AT hotmail DOT com>:
>>> > Hi Massimiliano,
>>> >
>>> > In my limited experience, you can define the job without
>>> > specifying a full or incremental level, and then in the schedule
>>> > define runs for full and for incremental backups.
>>> > I'm quite new to bacula; maybe someone can suggest a better
>>> > solution.
>>> >
>>> > "Better to be praised by the few wise than applauded by the many fools"
>>> > -- Miguel de Cervantes Saavedra
>>> >
>>> > Daniel Beas Enriquez
>>> >
>>> >
>>> >
>>> >
>>> >> Date: Fri, 25 Jun 2010 23:39:45 +0200
>>> >> From: massimiliano AT perantoni DOT net
>>> >> To: bacula-users AT lists.sourceforge DOT net
>>> >> Subject: [Bacula-users] Backup to remote location
>>> >>
>>> >> Hi!
>>> >> I'm planning a remote backup between two of my company's sites,
>>> >> which are connected at 10 Mbps. For disaster recovery, I need to
>>> >> back up all the data at "location A" to "location B". So far we
>>> >> have a local backup working like a charm with bacula; the issue
>>> >> is the remote backup. We have 4 TB of data and growing, and we
>>> >> calculated that backing it up remotely would consume a lot of
>>> >> time and bandwidth, so we would like to create a first full
>>> >> backup of "location A" (maybe yearly), send it to "location B",
>>> >> and then do only incremental backups, as automatically as
>>> >> possible. The catch, as far as I understand, is that bacula works
>>> >> on job names, not on the backed-up files, so if I back everything
>>> >> up with one job, a different job on the same fileset would start
>>> >> with a full backup.
>>> >>
>>> >> How to solve this problem?
>>> >>
>>> >> Any help would be really appreciated...
>>> >>
>>> >> Ciao Massimiliano
>>> >>
>>> >>
>>> >>
>>> >> _______________________________________________
>>> >> Bacula-users mailing list
>>> >> Bacula-users AT lists.sourceforge DOT net
>>> >> https://lists.sourceforge.net/lists/listinfo/bacula-users

