Subject: Re: [Bacula-users] maximum client file size
From: Kern Sibbald <kern AT sibbald DOT com>
To: Devin Reade <gdr AT gno DOT org>, bacula-users AT lists.sourceforge DOT net
Date: Fri, 22 May 2015 07:55:19 +0200
Hello Devin,

On 21.05.2015 17:23, Devin Reade wrote:
> --On Thursday, May 21, 2015 09:06:41 AM +0200 Kern Sibbald 
> <kern AT sibbald DOT com> wrote:
>
>> Bacula does keep 64 bit addresses.
> Excellent.  Not surprisingly, I'm not dealing with file sizes near 2^63,
> but I *do* need to back up files that are in the 2^39 range (from
> filesystems that are in the 2^46 range onto virtual cartridges no
> larger than 2^43).  No, these aren't database files, they're huge
> chunks of write-once data for which we need archival copies.  I'm
> still debating whether Bacula is the right tool for the job in this
> case.  Network-based copies to geographically different locations
> is a non-starter, so it's got to be a variant of sneaker-net.
>
>> On the SD
>> output end, if you do not limit your Volume size, there will surely be
>> some problems at 2^63.  Of course, who would ever want to write such a
>> large volume?
> On that note, I've traditionally gone with volume sizes in the ~500MB (2^29)
> range (for disk stores), but in this case that can push the volume
> count in the catalog to more than 512k entries once a minimum number
> of offsite copies have been made.  Have you seen installations with that
> many volumes?  If so, are there any known issues other than catalog tuning?

I think you will probably want Volume sizes that are more like 50GB, and
possibly much larger.  Bacula is not designed to handle more than a few
thousand Volumes (handling more will come later), so with 512K Volumes you
may well see performance problems.
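
For concreteness, a minimal Pool sketch that caps both the size of each
Volume and the total Volume count (the directives are standard
bacula-dir.conf Pool directives; the pool name and the 50G / 4000 figures
are purely illustrative):

   Pool {
     Name = Full-Disk                # hypothetical pool name
     Pool Type = Backup
     Maximum Volume Bytes = 50G      # close a Volume once it reaches ~50 GB
     Maximum Volumes = 4000          # keep the catalog to a few thousand Volumes
     Label Format = "Full-"          # auto-label disk Volumes
   }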

>
> I'm thinking that a larger volume size (and consequently smaller
> volume count) could be warranted (at least for the full pool), but
> I'm wondering whether many installations have pushed volume sizes
> past 2GB or 4GB, and whether there were any issues in doing so.

There are no issues except retention periods.  If you have lots of data
over, say, a week, I would not hesitate to use Volumes of several hundred GB.
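
If retention is the main concern, the relevant knobs live in the same Pool
resource; a rough sketch, with the retention value purely illustrative:

   Pool {
     Name = Full-Disk                # hypothetical pool name
     Pool Type = Backup
     Volume Retention = 6 months     # how long Volume records stay in the catalog
     AutoPrune = yes                 # prune expired Volumes automatically
     Recycle = yes                   # allow pruned Volumes to be reused
   }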

>
> My gut is saying to go with 2GB volume sizes, but I'm curious.
Probably bigger unless your dataset is tiny.
>
> (Considering that my first hard drive cost me $4000 and was 40MB, all
> the above just sounds crazy.)
Yes, it is all crazy; the good news is that, for the most part, Bacula
has scaled rather gracefully.

Best regards,
Kern
>
> Devin
>
>


