Subject: Re: [Bacula-users] bacula 3.0.3 maintain ctime on restored files?
From: Steve Costaras <stevecs AT chaven DOT com>
To: bacula-users AT lists.sourceforge DOT net
Date: Fri, 15 Jan 2010 15:45:35 -0600

Yes, unfortunately the directory names and the amount of data in them vary dramatically (image file processing for each project), so there is no real way to break it apart without a large chance of missing something. Ideally I would like to have the SD multiplex the files to different tape drives (i.e. get a linear improvement with the number of drives), but I haven't looked into it, as it's only a problem with full backups/restores, which /should/ be infrequent, except right now when we're having other problems.
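
For what it's worth, as far as I know the SD won't stripe a single job
across drives, so the closest I can see is one Device resource per drive
and two (or more) jobs running concurrently, one per drive. A rough,
untested sketch; the device names, paths, addresses and passwords below
are just placeholders:

  # bacula-sd.conf: one Device resource per physical LTO4 drive
  Device {
    Name = LTO4-Drive-0
    Media Type = LTO4
    Archive Device = /dev/nst0             # placeholder drive path
    AutomaticMount = yes
    AlwaysOpen = no
    RemovableMedia = yes
    RandomAccess = no
  }

  Device {
    Name = LTO4-Drive-1
    Media Type = LTO4
    Archive Device = /dev/nst1             # placeholder drive path
    AutomaticMount = yes
    AlwaysOpen = no
    RemovableMedia = yes
    RandomAccess = no
  }

  # bacula-dir.conf: one Storage resource per drive so two jobs can
  # write to different drives at the same time
  Storage {
    Name = Tape-Drive-0
    Address = sd.example.com               # placeholder SD address
    SDPort = 9103
    Password = "sd-password"               # placeholder
    Device = LTO4-Drive-0
    Media Type = LTO4
    Maximum Concurrent Jobs = 1
  }

  Storage {
    Name = Tape-Drive-1
    Address = sd.example.com
    SDPort = 9103
    Password = "sd-password"
    Device = LTO4-Drive-1
    Media Type = LTO4
    Maximum Concurrent Jobs = 1
  }

Of course that only buys anything if the data can be split into jobs of
roughly similar size, which is exactly the hard part here.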



On 01/15/2010 09:02, Bob Hetzel wrote:
  
>> In the case here it's rather a large issue (granted, it's due to other
>> problems like hardware or software issues that require restores of data,
>> and it is NOT something that I want to continue, but we are living in an
>> imperfect world). To give you an idea, the dataset size is about
>> 30-40 TiB, with restores anywhere from 2 TiB up to a full restore.
>> Backing that up again not only takes a LOT of tapes, which is costly,
>> it also takes a LOT of time (days/weeks) during which nothing else can
>> be run.
>>
>> What I have been doing, which is just painful, is a full restore, then
>> a full backup right after, and only then continuing, so a restore
>> actually takes about 2x the time of a full (for a full set this is
>> about 9-10 days on LTO4). This has happened about 3 times so far in
>> the past 2 months.
> Have you considered breaking up such a big dataset into pieces that each
> take up perhaps 5 tapes or less? I know it can be a pain to manage, but
> it seems like that would go a long way toward solving this. The other
> benefit is that if a tape breaks during the restore (or is just
> discovered to be unreadable) you can still get 100% of the other
> datasets restored. IMHO, any time you realize you can't back up your
> fileset in one day or even one weekend, that's a good opportunity to
> take a step back and re-think things. Perhaps you might even need to go
> as far as scheduling the fulls so they don't all run at the same time.
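>
> Something along these lines, for example (the fileset split, paths,
> schedule times, and resource names below are all made up; you'd carve it
> up however fits your project layout):
>
>   # bacula-dir.conf: the one huge fileset split into smaller ones,
>   # each with its own job and a staggered full schedule
>   FileSet {
>     Name = "Projects-Set-1"
>     Include {
>       Options { signature = MD5 }
>       File = /data/projects/set1         # hypothetical path
>     }
>   }
>
>   FileSet {
>     Name = "Projects-Set-2"
>     Include {
>       Options { signature = MD5 }
>       File = /data/projects/set2         # hypothetical path
>     }
>   }
>
>   Schedule {
>     Name = "Fulls-Week1"
>     Run = Full 1st sun at 23:05
>     Run = Incremental mon-sat at 23:05
>   }
>
>   Schedule {
>     Name = "Fulls-Week2"
>     Run = Full 2nd sun at 23:05
>     Run = Incremental mon-sat at 23:05
>   }
>
>   Job {
>     Name = "Backup-Projects-Set-1"
>     Type = Backup
>     Client = fileserver-fd               # placeholder client
>     FileSet = "Projects-Set-1"
>     Schedule = "Fulls-Week1"
>     Storage = Tape                       # your existing Storage resource
>     Pool = Default
>     Messages = Standard
>   }
>
>   Job {
>     Name = "Backup-Projects-Set-2"
>     Type = Backup
>     Client = fileserver-fd
>     FileSet = "Projects-Set-2"
>     Schedule = "Fulls-Week2"
>     Storage = Tape
>     Pool = Default
>     Messages = Standard
>   }
>
> That way a restore of one set only needs that set's tapes, and the fulls
> don't all land on the same weekend.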

> I realize that your periodic audits of your filesets to make sure you're
> not missing anything become a lot more complicated, but a painful audit
> once per quarter seems a lot better than what you've described above.




_______________________________________________
Bacula-users mailing list
Bacula-users AT lists.sourceforge DOT net
https://lists.sourceforge.net/lists/listinfo/bacula-users