Re: [Bacula-users] Replicate bacula volumes some place else.

From: Josip Deanovic <djosip+news AT linuxpages DOT net>
To: bacula-users AT lists.sourceforge DOT net
Date: Tue, 16 Aug 2016 00:37:53 +0200
On Monday 2016-08-15 12:15:51 Michael Munger wrote:
> I have Bacula backing up all the machines on the network to computer 1
> at site 1.
> 
> I would like to replicate the backup volumes to computer 2 at site2 (so
> the data is off-site in case of fire).
> 
> 
> There is a cable internet connection and VPN available between site 1
> and site 2 with decent speeds (sustained 500KBps on average. Example
> from rsync log: 2016/08/14 06:57:34 [16471] sent 43.54M bytes  received
> 18.88G bytes  586.76K bytes/sec)
> 
> The current strategy is to have all the volumes only a 5 GB in size and
> using rsync.
> 
> Is there a better way?


Hi Michael!

Whether there is a better way or not depends on the other options and
resources available to you.

In my case replicating volumes with rsync would be a less optimal
solution, as my volumes are 10 G in size and I didn't like the idea
of monitoring the modified volumes and then moving them around with
rsync.

I wanted a safe and secure way of replicating backups, and in my
case it is based on the backup jobs and not on the modified volumes.

This is how I did it...
1. The remote server has iSCSI target support and approximately the
   same amount of backup storage space as the backup server.
2. There is a secure tunnel created between the backup server and
   the remote server so that the communication with iSCSI doesn't use
   an unsecured communication channel.
3. On the backup server the remote iSCSI volume is used as an additional
   storage device (with its own Storage and Device sections).
   Approximately the same number of volumes needs to be created on that
   secondary storage device. In my case both storage devices use the
   same Media Type (I no longer remember whether this was a
   requirement, but I think it was).
4. On the backup server for EVERY configured Backup job (or at least
   for the jobs you care to replicate to the remote server) there is
   one configured Copy job.
5. On the backup server there are two additional Admin jobs which
   take care of creating a secure tunnel to the remote server,
   properly mounting the iSCSI volume from the remote server, starting
   the Copy jobs, and then unmounting the iSCSI volume and closing
   the tunnel.
   If needed you can additionally secure your remote volume using
   LUKS (Linux Unified Key Setup), or you can choose to drop the
   encrypted tunnel completely, since with LUKS the data traveling
   over the net is already encrypted.
   If your remote server is available over the Internet and you
   are just a bit paranoid, you might choose to use both LUKS and
   the encrypted tunnel.
6. The nice thing Copy jobs offer is the SQLQuery type of selection
   of the Backup jobs that still haven't been successfully copied to
   the remote server. This is where you can use the Selection Pattern
   option to set a complex SQL query which will choose only the
   specific jobs that need to be copied to the remote server.
7. Write the scripts the previously mentioned Admin jobs are going to
   execute, and set up a Schedule for those Admin jobs.
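To make steps 3, 4 and 6 more concrete, here is a rough sketch of what
the relevant bacula-dir.conf pieces could look like. This is not my
actual configuration; all names (iSCSIStorage, RemotePool, the
password, addresses) are placeholders, and you will need to adapt it
to your own Director/SD setup:

```
# bacula-dir.conf (sketch only, names are placeholders)

# Secondary storage backed by the remote iSCSI volume (step 3).
Storage {
  Name       = iSCSIStorage
  Address    = backup-server.example.com  # the SD that sees the iSCSI disk
  SDPort     = 9103
  Password   = "sd-password"
  Device     = iSCSIDevice                # Device section in bacula-sd.conf
  Media Type = File                       # same Media Type as the primary storage
}

# The source pool points at the remote pool via Next Pool;
# Copy jobs write their copies into the Next Pool.
Pool {
  Name      = DefaultPool
  Pool Type = Backup
  Next Pool = RemotePool
}

# One Copy job per Backup job you want replicated (step 4).
Job {
  Name              = "Copy-client1"
  Type              = Copy
  Level             = Full
  Pool              = DefaultPool         # pool the original jobs are in
  Client            = backup-server-fd
  Selection Type    = SQLQuery            # step 6
  Selection Pattern = "SELECT JobId FROM Job WHERE ..."  # your query here
  Messages          = Standard
}
```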


I have been using this setup for over a year and it is working
perfectly. With it I have kept the replication of the backed-up data
under Bacula's management, and the replication is done per Backup
job instead of per modified volume (which is nice if your volumes
are big and your jobs are relatively small).

The SQL query I am using makes sure that the replicated data (the
copied jobs) can actually be restored: not only the last job is
copied, but also all the jobs from the same client since the last
full backup, including the last full backup itself if it hasn't
been copied yet.
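As a rough illustration only (this is not my exact query, and you
should check the column names against your own catalog schema and
Bacula version), a Selection Pattern implementing that idea could look
something like this:

```
-- Sketch: per client, select every successful backup job since
-- (and including) the newest Full backup, skipping jobs that
-- already have a good copy.
SELECT j.JobId
FROM Job j
WHERE j.Type = 'B'                 -- original backup jobs only
  AND j.JobStatus = 'T'            -- terminated successfully
  AND j.StartTime >= (
        SELECT MAX(f.StartTime)    -- newest Full of the same client
        FROM Job f
        WHERE f.ClientId = j.ClientId
          AND f.Type = 'B' AND f.Level = 'F' AND f.JobStatus = 'T')
  AND NOT EXISTS (                 -- no successful copy exists yet
        SELECT 1 FROM Job c
        WHERE c.PriorJobId = j.JobId
          AND c.Type = 'C' AND c.JobStatus = 'T')
ORDER BY j.JobId;
```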

This is especially useful if your data set is HUGE, where copying all
the volumes would take weeks or even months and put a heavy load on
the network. This way you can consistently copy (replicate) the last
Full, Differential and all the newest Incremental backup jobs that
haven't been copied yet, and later continue from that point.

In case some of the Backup jobs fail to replicate, they will be
selected during the next copy cycle. Appropriate e-mail messages are
sent in any case, so that the backup admin can monitor the situation
and react if needed.

In over a year (probably longer) the only change I had to make was to
the script that mounts the iSCSI volume, so that it first cleans up
possible previously failed mounts, then connects to the iSCSI target,
and finally starts the Copy jobs from a manually maintained list.

Note that in my example I have mentioned iSCSI but other protocols could
be used as well. I was using iSCSI because it's cheap and it suited my
needs.

Also, since you are going to have a remote copy of the data, I would
advise you to take some time and learn how to restore data without
the rest of the Bacula infrastructure (without a Catalog).
It might save you a lot of time some day.
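For example, Bacula ships standalone tools that read volumes directly,
without the Director or the Catalog. The volume name and device name
below are just illustrative:

```
# List what is on a volume, without Director/Catalog.
# "FileStorage" is the Device name from bacula-sd.conf.
bls -j -V Vol-0001 FileStorage      # show the jobs stored on the volume
bls -V Vol-0001 FileStorage         # list the files it contains

# Extract the volume's contents into /tmp/restore.
bextract -V Vol-0001 FileStorage /tmp/restore
```

With a bootstrap file (-b) you can narrow bextract down to a single
job instead of pulling everything off the volume.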
It is also a good idea to regularly back up the Catalog database and
the whole Bacula configuration, but that should go without saying.


Regards!

-- 
Josip Deanovic

------------------------------------------------------------------------------
_______________________________________________
Bacula-users mailing list
Bacula-users AT lists.sourceforge DOT net
https://lists.sourceforge.net/lists/listinfo/bacula-users
