On 6/27/2015 1:37 AM, Andrew Noonan wrote:
> On Fri, Jun 26, 2015 at 2:17 PM, Ana Emília M. Arruda
> <emiliaarruda AT gmail DOT com> wrote:
>> Are you going to generate a .tar of about 250TB every day? What will
>> be the nature of your restores? Will you always need to restore the
>> whole data set, or will you occasionally need to restore a small set
>> of files?
> After the backfill, I expect the daily backups to be in the hundreds
> of gigabytes per day. During the backfill, I'll want to maximize
> writes to catch up as soon as I can. In general, we already have a
> copy of the data, but given how it syncs, this doesn't protect against
> an accidental "rm" getting synced downstream. So, other than for
> testing purposes, I'd imagine restores will be for accidental deletes
> or for some sort of massive disaster situation. This is why I figure
> each backfill job should be close to the size of the tape, though I'm
> not 100% sure.
Keep in mind that one file spanning N tapes is roughly N times more
likely to be lost: the failure of any one tape in the set loses the
entire file. For this reason, I prefer to break large data sets like
that into N separate backup jobs so that each job's full backup fits on
a single tape, if at all possible. Tapes do sometimes fail after they
have been successfully written.
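To put a rough number on that risk: if each tape fails independently
with probability p, a file spanning N tapes survives only if every one
of the N tapes survives, so the loss probability is 1 - (1 - p)^N,
which is approximately N*p when p is small. A minimal sketch (the 1%
per-tape failure rate below is an illustrative assumption, not a
measured or vendor figure):

```python
def loss_probability(p_tape: float, n_tapes: int) -> float:
    """Probability of losing a file that spans n_tapes tapes,
    assuming each tape fails independently with probability p_tape."""
    return 1.0 - (1.0 - p_tape) ** n_tapes

# Assumed 1% per-tape failure rate, purely for illustration.
p = 0.01
print(loss_probability(p, 1))   # single tape: 0.01
print(loss_probability(p, 10))  # spanning 10 tapes: ~0.096, nearly 10x
```

The same arithmetic is why splitting the set into N single-tape jobs
helps: a failed tape then costs you one job's data instead of the whole
spanned file.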