Subject: Re: [BackupPC-users] Block-level rsync-like hashing dd?
From: "Jeffrey J. Kosowsky" <backuppc AT kosowsky DOT org>
To: "General list for user discussion, questions and support" <backuppc-users AT lists.sourceforge DOT net>
Date: Tue, 12 Apr 2011 17:20:07 -0400
Timothy J Massey wrote at about 15:43:28 -0400 on Tuesday, April 12, 2011:
 > Les Mikesell <lesmikesell AT gmail DOT com> wrote on 04/11/2011 10:55:21 AM:
 > 
 > > On 4/11/2011 12:43 AM, Saturn2888 wrote:
 > > > But none of that solves the issue we're having now.
 > > 
 > > None of what?  It is hard to understand things with no context.
 > > 
 > > > How in the world do we backup the current pool of data?
 > > 
 > > Any of the ways that you can do an image copy of the raw partition or 
 > > disk holding the archive will work as a backup.  But in many cases the 
 > > best approach is to simply run an independent copy of backuppc from a 
 > > different location, connecting to the same targets over a WAN or VPN, 
 > > perhaps with the blackout periods skewed to avoid running at the same 
 > times.
 > 
 > Can someone *PLEASE* make this a sticky on the God-forsaken forum that 
 > cross-posts here?
 > 
 > To slightly expand what Les wrote:  there are 4 realistic options (for a 
 > very loose definition of "realistic"):
 > 
 > 1) rsync the pool.
As noted many times before, this fails and/or thrashes for large
archives due to the humongous number of hard links
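To see why the hard links hurt: any tool that preserves hard links (rsync -H, tar, cp -a) must remember every multiply-linked (device, inode) pair it has seen for the entire run, one entry per pooled file. A minimal Python sketch of that bookkeeping (my illustration only, not rsync's actual implementation):

```python
import os
from collections import defaultdict

def hardlink_groups(root):
    """Group file paths under `root` by (device, inode).

    A link-preserving copier has to hold a table like this in
    memory for the whole run -- one entry per pooled file in a
    BackupPC archive.
    """
    groups = defaultdict(list)
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            st = os.lstat(path)
            if st.st_nlink > 1:  # only multiply-linked files matter
                groups[(st.st_dev, st.st_ino)].append(path)
    return groups
```

On a real pool with tens of millions of links, that dict alone runs to gigabytes of RAM, which is why option 1 thrashes.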
 > 2) LVM Snapshot/dd
 > 3) Break a RAID array
 > 4) Run two separate BackupPC servers, both backing up the same server.
I think #4 is underappreciated given how cheap hardware is
nowadays. The second "backup" server can potentially get by with less
frequent runs, say just one full per week. Of course, you won't have
as granular a series of fulls and incrementals, but since this is a
reserve backup, that may be acceptable. If not, you can still make a
bit-for-bit copy of your primary BackupPC partition every so often. I
use a lowly 1.2 GHz ARM plug computer with just 500 MB of memory and
500 MB of flash, plus a 1 TB USB external drive, as my second backup
server.

#5) Use either the included BackupPC_tarPCCopy or my potentially
faster BackupPC_copyPcPool script to copy over the pool and pc trees
at the file level (this will work for archives where rsync thrashes)
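The core idea behind a file-level copy that survives the hard-link problem is to copy each inode's data exactly once and recreate every additional name as a hard link on the destination. A rough Python sketch of that technique (my illustration only -- the real BackupPC_tarPCCopy and BackupPC_copyPcPool scripts handle much more, such as attrib files and restarts):

```python
import os
import shutil

def copy_tree_with_links(src, dst):
    """Copy `src` to `dst`, copying each inode's data only once
    and recreating additional names as hard links.

    Illustrative sketch only, not the actual BackupPC code.
    """
    seen = {}  # (dev, ino) -> first destination path
    os.makedirs(dst, exist_ok=True)
    for dirpath, dirnames, filenames in os.walk(src):
        rel = os.path.relpath(dirpath, src)
        outdir = dst if rel == "." else os.path.join(dst, rel)
        for d in dirnames:  # pre-create subdirectories
            os.makedirs(os.path.join(outdir, d), exist_ok=True)
        for name in filenames:
            spath = os.path.join(dirpath, name)
            dpath = os.path.join(outdir, name)
            st = os.lstat(spath)
            key = (st.st_dev, st.st_ino)
            if key in seen:
                os.link(seen[key], dpath)   # later name: just a link
            else:
                shutil.copy2(spath, dpath)  # first name: copy the data
                seen[key] = dpath
```

Note that this still needs the in-memory (dev, ino) table, but unlike rsync -H it does no per-file network negotiation, which is where much of the win comes from.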

 > 
 > Use Google for further details.
AGREED - this topic has been discussed ad nauseam on the list to date...

_______________________________________________
BackupPC-users mailing list
BackupPC-users AT lists.sourceforge DOT net
List:    https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:    http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/
