Subject: Re: [Bacula-users] Remote Storage on NAS
From: Josip Deanovic <djosip+news AT linuxpages DOT net>
To: bacula-users AT lists.sourceforge DOT net
Date: Fri, 06 Jan 2017 15:01:52 +0100
On Friday 2017-01-06 10:27:54 christopher.pecriaux AT alkern DOT fr wrote:
> Hello,

Hi Christopher.

> Sorry for my English, I use google translation.

No problem, I think that the message is quite understandable.

> My server runs on Ubuntu 15.10 and my clients are on Windows 7. I would
> like to store my backups on a NAS.
> 
> My server IP is 192.168.1.91, my NAS IP is 192.168.7.20 and my client IP
> is 192.168.7.100. I would like to store the backup on the NAS without
> going through the server, because the connection speed is low.
> 
> 
> Currently I have mounted a share with Samba, but saving 100 MB takes
> 7 minutes. If I do the transfer directly to the NAS, it takes 2
> minutes.
> 
> In the file "bacula-sd.conf" if I try to put :
> Storage {
>   Name = nas.7.20-sd
>   SDPort = 9103
>   WorkingDirectory = "/var/lib/bacula"
>   Pid Directory = "/var/run/bacula"
>   Maximum Concurrent Jobs = 20
>   SDAddress = 192.168.7.20
> }
> 
> it does not work (Connection refused). There is no firewall, and I
> added read/write rights for Everyone.

It doesn't work that way.

The Bacula director daemon, the storage daemon and the catalog
(the database) can either all reside on the same server or each run
on a dedicated server, depending on your needs.

To simplify the example, let's say that all these daemons reside on
the same server, which is probably the most common Bacula setup.

Let's say that you have some production server that needs to be backed up.
That production server will have to run the Bacula file daemon (the backup
agent that actually reads the files that need to be backed up).
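
As an illustration only, a minimal bacula-fd.conf on such a client could
look roughly like the following (the resource names and the password are
made-up placeholders, and the Windows paths assume the default install
location):

# bacula-fd.conf on the machine being backed up
Director {
  Name = backup-srv-dir        # must match the director's Name in bacula-dir.conf
  Password = "fd-secret"       # must match the Password in the director's Client resource
}

FileDaemon {
  Name = win7-client-fd
  FDport = 9102                # the port the director and storage daemon connect to
  WorkingDirectory = "C:\\Program Files\\Bacula\\working"
  Pid Directory = "C:\\Program Files\\Bacula\\working"
}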

Once you have configured all the Bacula services and you run the backup
job for the production server, the Bacula director daemon contacts the
storage daemon and the file daemon in order to exchange session keys.
After that, the file daemon on the production server connects to the
storage daemon that resides on the backup server and the data stream
starts.
This is the default behavior; the alternative behavior would not change
anything in your case, so I am not going to complicate things.
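
In other words, the address the file daemon connects to is the one you
declare in the director's Storage resource. A rough sketch (the resource
names and the password are assumptions; 192.168.1.91 is your backup
server from the description above):

# bacula-dir.conf on the backup server
Storage {
  Name = File1
  Address = 192.168.1.91       # the file daemon sends its data stream here
  SDPort = 9103
  Password = "sd-secret"       # must match the Director resource in bacula-sd.conf
  Device = FileStorage         # Device resource defined in bacula-sd.conf
  Media Type = File
}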

If you wish to use your NAS to store the data backed up from the
production server, you need to mount the file system exported by your
NAS on a mountpoint on the backup server and configure the Bacula
storage daemon on the backup server accordingly.
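
A rough sketch of what that could look like, assuming the NAS exports
NFS and that the export path and mountpoint below are placeholders you
would adapt:

# on the backup server: mount the NAS export
mount -t nfs 192.168.7.20:/volume1/bacula /mnt/nas-backup

# bacula-sd.conf: point a file-based Device at that mountpoint
Device {
  Name = FileStorage
  Media Type = File
  Archive Device = /mnt/nas-backup   # volumes are written here, i.e. on the NAS
  LabelMedia = yes
  Random Access = yes
  AutomaticMount = yes
  RemovableMedia = no
  AlwaysOpen = no
}

Keep in mind that with this layout the data still flows from the client
to the backup server first and only then to the NAS over the mount, so
it does not by itself solve your bandwidth problem.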

You cannot configure the Bacula storage daemon or the Bacula client to
store the data directly on the NAS.

Alternatively, you could set up an additional dedicated server running
the Bacula storage daemon in the same network as the NAS and mount the
file system from the NAS there. In that case your production server
would send the data stream without it ever leaving your 192.168.7.X
network, and you would get the desired result.
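
If you go that route, the existing director would simply get a second
Storage resource pointing at the new machine, and the relevant job would
reference it. A rough sketch, where the 192.168.7.30 address, the names
and the password are all assumptions:

# bacula-dir.conf on the existing backup server
Storage {
  Name = NAS-SD
  Address = 192.168.7.30       # the new storage daemon, on the same network as the NAS
  SDPort = 9103
  Password = "sd2-secret"      # must match the new server's bacula-sd.conf
  Device = NASFileStorage      # Device resource defined on the new storage daemon
  Media Type = File2           # keep the media type distinct per storage daemon
}

The Job (or JobDefs) resource for that client would then use
Storage = NAS-SD, and the new server's bacula-sd.conf would get a Device
resource with its Archive Device on the mounted NAS file system, as in
the previous sketch.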

It is also possible to set up the Bacula storage daemon on the production
server itself and mount the file system from the NAS there, but this idea
is so bad from a security and data safety standpoint that it would
probably have been better not to mention it at all.

Also, check whether your NAS can export NFS or iSCSI, as both are most
likely faster than Samba.
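
For example, if the showmount tool (from the NFS client utilities) is
installed on the backup server, you can quickly check what the NAS
exports over NFS with:

showmount -e 192.168.7.20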


Regards

-- 
Josip Deanovic

