Re: [Bacula-users] bacula splitting big jobs in to two

Subject: Re: [Bacula-users] bacula splitting big jobs in to two
From: Blake Dunlap <ikiris AT gmail DOT com>
To: Bryn Hughes <linux AT nashira DOT ca>
Date: Tue, 10 Feb 2015 16:06:37 -0800
Umm... just a check, but if it takes this long to back up, won't it
take just as long, if not longer, to restore?

I don't really see how this is a workable situation; perhaps you're
trying to solve the wrong problem?

-Blake

On Mon, Feb 9, 2015 at 2:01 PM, Bryn Hughes <linux AT nashira DOT ca> wrote:
> I'm in a somewhat similar boat: I don't have 70TB, but I do have 21.5TB
> and LTO3 equipment (max 80 MB/sec).  Even if it were possible to keep a
> tape drive streaming constantly, without ever having to change or unload
> tapes, it would still take me many days to finish.
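>
> (Back of the envelope: 21.5 TB at a sustained 80 MB/s is roughly
> 21,500,000 MB / 80 MB/s ≈ 269,000 seconds, a bit over three days of
> non-stop streaming; real throughput with tape changes, small files and
> filesystem overhead is well below that.)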
>
> The easiest way to start splitting things up is at the root filesystem
> level for your data directory.  I assume you don't have 70TB of files
> all thrown together in one big directory; there is probably some sort
> of organizational structure?
>
> In my case we generate a new directory at the root level each year and
> then put that year's work into it.  I create a separate FileSet for
> each individual year, and then a separate Job for each year.  Using
> JobDefs and a common schedule minimizes the configuration work; you end
> up with nothing more than a Job and a FileSet, each only a few lines in
> your config file.
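>
> For illustration, a per-year pair looks roughly like this (the JobDefs
> name, path and resource names below are just placeholders for whatever
> you already have defined):
>
>   # FileSet covering a single year's directory
>   FileSet {
>     Name = "data-2014"
>     Include {
>       Options {
>         signature = MD5
>       }
>       File = /data/2014
>     }
>   }
>
>   # Job for that year; Client, Storage, Pool, Schedule and so on are
>   # inherited from the shared JobDefs resource
>   Job {
>     Name = "backup-data-2014"
>     JobDefs = "DefaultJob"
>     FileSet = "data-2014"
>   }
>
> Adding next year's data is then just a copy of those two resources with
> the year changed.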
>
> Look for something like this in how your data is laid out.  Just
> remember to create additional jobs as new stuff is added - again, in my
> case we're only doing this once a year.  I have the root directory of
> the file shares locked so new folders can't be created by anyone but the
> admins; this lets me ensure the correct backup config is added at the
> same time.
>
> Bryn
>
>
> On 2015-02-02 06:23 AM, Rao, Uthra R. (GSFC-672.0)[ADNET SYSTEMS INC] wrote:
>> Andy,
>>
>> I have to back up about 70TB in one job.
>>
>> Uthra
>>
>> -----Original Message-----
>> From: akent04 [mailto:bacula-forum AT backupcentral DOT com]
>> Sent: Friday, January 30, 2015 9:50 AM
>> To: bacula-users AT lists.sourceforge DOT net
>> Subject: [Bacula-users] bacula splitting big jobs in to two
>>
>> I run Bacula 5.2.12 on a RHEL server which is attached to a tape library. I
>> have two LTO5 tape drives. Since the data on one of my servers has grown, the
>> backup takes 10-12 days to complete. I would like to split this job into two
>> jobs. Has anybody done this kind of setup? I need some guidance on how to go
>> about it.
>>
>>
>> "Mr. Uthra: at a glance I think the only way to do that is having to Job 
>> configurations for the same Client, with two FileSet configurations for them 
>> respectively.
>> This is very manual, since you will have to "load balance" the different 
>> backup paths for each FileSet by yourself.
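>>
>> A rough sketch of that layout (paths, Client and Storage names here are
>> placeholders, and it assumes each tape drive is defined as its own Storage
>> resource in bacula-dir.conf if you want the two jobs on separate drives):
>>
>>   FileSet {
>>     Name = "bigserver-part1"
>>     Include {
>>       Options {
>>         signature = MD5
>>       }
>>       File = /export/projects
>>     }
>>   }
>>
>>   FileSet {
>>     Name = "bigserver-part2"
>>     Include {
>>       Options {
>>         signature = MD5
>>       }
>>       File = /export/archive
>>     }
>>   }
>>
>>   Job {
>>     Name = "bigserver-part1-job"
>>     JobDefs = "DefaultJob"
>>     Client = bigserver-fd
>>     FileSet = "bigserver-part1"
>>     Storage = LTO5-Drive-0
>>   }
>>
>>   Job {
>>     Name = "bigserver-part2-job"
>>     JobDefs = "DefaultJob"
>>     Client = bigserver-fd
>>     FileSet = "bigserver-part2"
>>     Storage = LTO5-Drive-1
>>   }
>>
>> For the two jobs to actually overlap, Maximum Concurrent Jobs must allow at
>> least 2 in the Director, Client and Storage resources (and in the FD and SD
>> daemons).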
>> One more thing: unless there is a huge amount of information on this server,
>> it's not normal for a backup job to take 12 days to complete, and there may
>> be a bottleneck somewhere in your structure / configuration.
>>
>>
>> Regards,
>> ==============================================================================
>> Heitor Medrado de Faria - LPIC-III | ITIL-F
>> Jan. 26 - Feb. 06 - New Bacula telepresence training:
>> http://www.bacula.com.br/?p=2174
>> +55 61 2021-8260 / +55 61 8268-4220
>> Site: heitorfaria < at > gmail.com
>> ==============================================================================
>>
>> I would have to echo the above post with regard to bottlenecks. How much data
>> are you backing up?
>> I'm able to back up almost 500GB (over a million files) in around 4-5 hours,
>> and that is to an RDX cartridge in a Tandberg Quikstation. Twice that
>> (around 1TB) would probably only take about 8-10 hours, less than a day. If
>> you are only backing up around 1-4TB of data, 10-12 days is abnormal when
>> dealing with local hardware (not over the Internet or a VPN), and I'd look
>> into why it's taking so long instead of trying to work around it.
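>>
>> (For scale: 500GB in 4-5 hours is roughly 500,000 MB / 16,000 s ≈ 30 MB/s,
>> which is why about 1TB fits in 8-10 hours at the same rate.)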
>>
>> -Andy
>>
>
>

_______________________________________________
Bacula-users mailing list
Bacula-users AT lists.sourceforge DOT net
https://lists.sourceforge.net/lists/listinfo/bacula-users
