Bacula-users

Re: [Bacula-users] How to handle long, time consuming backups

2008-12-31 10:22:13
Subject: Re: [Bacula-users] How to handle long, time consuming backups
From: Massimo Schenone <mschenone AT sorint DOT it>
To: Maarten Hoogveld <m.hoogveld AT elevate DOT nl>
Date: Wed, 31 Dec 2008 16:19:41 +0100
On Wed, 2008-12-31 at 11:35 +0100, Maarten Hoogveld wrote:
> First of all, thanks for all the suggestions to solve the problem!
> And thanks for a great product like Bacula!
Of course! I forgot to underline that Bacula is "the" backup
Swiss Army knife ;) It lacks some features that commercial products
have, but the gap is closing quickly.

You are right, the network problems arise at random.. I am now trying
to sniff all the traffic between Bacula and the client..
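For the sniffing, something along these lines should be enough to capture
just the Bacula traffic; the interface name, client address and capture
file below are only examples, and 9101-9103 are the default
Director/FD/SD ports:

  tcpdump -i eth0 -s 0 -w /tmp/bacula.pcap \
      host 192.168.1.50 and tcp portrange 9101-9103

If the firewall is the one tearing the session down, RST packets should
show up in the capture in the middle of an otherwise healthy transfer.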

All I need is to get one full backup through, as a starting point for
the incremental ones.. the biggest problem is that when a backup fails
there is no entry in the File table associated with the failed job, so
I always have to restart the job from the beginning..
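One thing that at least avoids the manual restart is Bacula's
reschedule-on-error support in the Job resource; it still reruns the
whole job from scratch rather than resuming it, but it retries by
itself. A minimal sketch, with example values and a hypothetical job
name:

  Job {
    Name = "samba-full"              # hypothetical
    ...
    Reschedule On Error = yes        # requeue the job after a failure
    Reschedule Interval = 30 minutes # wait before retrying
    Reschedule Times = 3             # give up after three attempts
  }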




> 
> ----
> 
> 2008/12/31 Massimo Schenone <mschenone AT sorint DOT it>
>         Hi all,
>         I have the same question: how do you back up a remote Samba
>         server with 60 GB of file system space?
>         I get an average throughput of 220 Kb/s, but the job always
>         fails with a network error (connection reset). In the middle
>         there is a Check Point firewall, and I asked the firewall
>         admins to raise the TCP connection timeout..
> 
> If there's a network error, it seems to me that a flaky connection is
> the problem. That is nothing you can really solve with Bacula
> configuration changes. I'm not an expert, but it seems to me that, as
> long as data flows over the connection from the FD to the SD, the
> connection should not time out.
> I have had some experience myself with (ADSL) internet connections
> which lose line sync for a short moment a few times a day. This
> caused TCP connections to reset, which would explain the errors you
> get.
> If the network errors arise at random times after the start of the
> jobs, something like this might be the cause. If the errors occur
> after a fixed amount of time or amount of transferred data, the
> problem might be something else.
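One Bacula-side knob that is meant for exactly this situation (a
stateful firewall dropping connections it considers idle) is the
Heartbeat Interval directive. A minimal sketch, with 60 seconds as an
example value:

  # bacula-fd.conf on the client
  FileDaemon {
    ...
    Heartbeat Interval = 60
  }

  # bacula-sd.conf on the storage server
  Storage {
    ...
    Heartbeat Interval = 60
  }

That keeps packets flowing on connections that would otherwise sit idle
for long stretches (for example while the SD is despooling), which is
typically when a firewall's TCP timeout kicks in.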
> 
> 
>         
>         Splitting the job into 5 subjobs doesn't help.. after 2 or 3
>         hours the connection is broken..
>         After going through the docs and forums I tried to add
>         spooling and concurrency, so that the other jobs don't have to
>         wait for the Samba one to complete.. no luck! I can't see any
>         concurrency at all.
>         Here is the config:
>         
>         Director {
>          ...
>          Maximum Concurrent Jobs = 4
>         }
>         
>         JobDefs {
>          ...
>          SpoolData = yes
>         }
>         ---
>         Storage {
>          ...
>          Maximum Concurrent Jobs = 20
>         }
>         
>         Device {
>          ...
>          Spool Directory = /data/spool;
>          Maximum Spool Size = 40 GB ;
>          Maximum Job Spool Size = 1 GB;
>          ...
>         }
>         
>         I get spooling but the jobs don't run concurrently.. maybe
>         because the jobs use different pools? I also ran 3 jobs
>         interactively, all associated with the same pool..
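For the concurrency part, Maximum Concurrent Jobs usually has to be
raised in more places than the Director resource, because the Job,
Client and Storage resources each default to 1. A minimal sketch of the
usual suspects; resource contents and values here are only examples:

  # bacula-dir.conf
  Job {                          # or the JobDefs the jobs inherit from
    ...
    Maximum Concurrent Jobs = 4
  }
  Client {
    ...
    Maximum Concurrent Jobs = 4
  }
  Storage {                      # the Director's Storage resource
    ...
    Maximum Concurrent Jobs = 4
  }

  # bacula-sd.conf
  Storage {
    ...
    Maximum Concurrent Jobs = 20
  }

As far as I know the pool itself doesn't forbid concurrency, but with a
single tape drive, jobs that need volumes from different pools cannot
write at the same time, since the drive can only have one volume
mounted; only jobs sharing the mounted volume can interleave.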
>         
>         Spooling itself is a serial process: mount the tape, write
>         data to the spool until it reaches the maximum size, destage
>         to tape, then start writing to the spool again...
>         I started with a 10 GB maximum spool size but always got a
>         broken pipe error; with 1 GB it is working (about 22 seconds
>         to despool, at 45 MB/s).
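As a sanity check on those figures (taking 1 GB as roughly 1024 MB):

  1024 MB / 45 MB/s ≈ 23 s per despool cycle

which matches the ~22 seconds observed.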
>         
>         I want to solve the problem the Bacula way, not by rsyncing
>         data between the client and the server.. do you think setting
>         up a file-based storage daemon on the client and then
>         migrating the volumes to the main storage is reasonable?
>         Thanks a lot!
>         Massimo
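On the migration idea: as far as I know a Migration job moves data
between devices attached to the same Storage daemon, so an SD running
on the client plus a later migration to the central SD may not work the
way you hope. For completeness, a minimal sketch of what a migration
setup looks like; pool and job names here are made up:

  Pool {
    Name = ClientFilePool          # hypothetical source pool
    ...
    Next Pool = MainTapePool       # where migrated data ends up
  }

  Job {
    Name = "migrate-client-files"  # hypothetical
    Type = Migrate
    Pool = ClientFilePool          # pool whose jobs get migrated
    Selection Type = Volume
    Selection Pattern = ".*"       # match every volume in the pool
    ...
  }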
>
>         On Tue, 2008-12-30 at 19:38 -0500, John Drescher wrote:
>         > > A large, mostly static data set, and low bandwidth
>         > > connection: this is rsync's specialty!  If you are not
>         > > familiar with it, look here:
>         > >
>         > > http://samba.anu.edu.au/rsync/
>         > >
>         > > I suggest you use rsync to sync your Windows client to a
>         > > copy at your main site (costs you 20GB of disk space
>         > > somewhere) then do your full Bacula backup from there.
>         > >
>         > > There will no doubt be some issues with timing/scheduling,
>         > > and possibly permissions, to work out.  And restores will
>         > > be two-stage.  But I think it would be worth it.
>         > >
>         >
>         > I do that for a remote backup of a few GB and it works great.
>         >
>         > John
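For reference, the rsync leg of that setup boils down to something like
the line below on the staging host; user, host and paths are made up,
and on a Windows client you would need rsync via cygwin or similar, or
pull from the Samba share instead:

  rsync -avz --delete backupuser@fileserver:/export/data/ /srv/staging/fileserver/

A normal local Bacula FileSet pointed at /srv/staging/fileserver/ then
does the actual backup.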


------------------------------------------------------------------------------
_______________________________________________
Bacula-users mailing list
Bacula-users AT lists.sourceforge DOT net
https://lists.sourceforge.net/lists/listinfo/bacula-users