Re: [BackupPC-users] Backup backuppc pool with rsync offsite
2011-02-23 09:04:46
Timothy J Massey wrote at about 23:13:52 -0500 on Tuesday, February 22, 2011:
> Dennis Blewett <dennis.blewett AT gmail DOT com> wrote on 02/22/2011
> 10:17:29 PM:
>
> > 13,849 items, totalling 3.8 GB
> >
> > It would appear that I have a feasible number of files. I'm not sure
> > how many more files I will have by the end of April, though.
> >
> > I've read that "rsync -H" would be a practical command to use
> > on the backuppc folder.
> >
> > What I'm also curious about is if I should be rsyncing any other
> > files, thus allowing me to restore from the offsite backup in the
> > case I lose everything and rebuild a backuppc configuration: I would
> > attempt to rsync back to my computer with the new backuppc
> > configuration and attempt to restore/recover said files.
>
> Please Google this and read the results: this question has been asked
> dozens of times, and always boils down to the same points. I will sum it
> up *briefly*:
>
> * For most people, rsync does not work to replicate a backup server
> effectively. Period. I think *no* one would suggest this as a reliable
> ongoing method of replicating a BackupPC server. Ever.
>
> * The best methods for this boil down to two camps:
> 1) Run two BackupPC servers and have both back up the hosts
> directly
> No replication at all: it just works.
> 2) Use some sort of block-based method of replicating the data
>
> * Block-based replication boils down to two methods
> 1) Use md or dm to create a RAID-1 array and rotate members of
> this array in and out
> 2) Use LVM to create snapshots of partitions and dd the partition
> to a different drive
>         (I guess 3) Stop BackupPC long enough to do a dd of the partition
>         *without* LVM)
>
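A rough way to see the cost behind the first quoted point: to replicate
hard links, rsync -H must remember every multiply-linked inode it has
seen until all of its paths have been matched up. On a BackupPC pool,
nearly every file is hard-linked, so that table covers nearly the whole
pool. A minimal sketch of estimating that table's size (the path at the
bottom is hypothetical -- point it at your own BackupPC top directory):

```python
#!/usr/bin/env python3
"""Estimate the hard-link state rsync -H must carry for a tree."""
import os

def hardlink_table_size(root):
    """Count distinct inodes with link count > 1 under root --
    a rough proxy for the table rsync -H has to keep in memory."""
    seen = set()
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            st = os.lstat(os.path.join(dirpath, name))
            if st.st_nlink > 1:
                seen.add((st.st_dev, st.st_ino))
    return len(seen)

if __name__ == "__main__":
    # Hypothetical location; adjust to your installation.
    print(hardlink_table_size("/var/lib/backuppc"))
```

On a pool with millions of files this count approaches the number of
pool files themselves, which is why plain rsync tends to fall over.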
I think there is a 3rd camp:
   3. Scripts that understand the special structure of the pool and pc
      trees and efficiently create lists of all the hard links in the
      pc directory.
      a] BackupPC_tarPCCOPY
         Included in standard BackupPC installations. It uses a Perl
         script to recurse through the pc directory, calculate (and
         cache, if you have enough memory) the file-name md5sums, and
         then uses those to create a tar-formatted file of the hard
         links that need to be created. This routine has been
         well-tested, at least on smaller systems.
      b] BackupPC_copyPcPool
         A Perl script that I recently wrote that should be significantly
         faster than [a], particularly on machines with low memory
         and/or slower CPUs. This script creates a new temporary
         inode-number-indexed pool to allow direct lookup of links and
         avoid having to calculate and check file-name md5sums. The
         pool is then rsynced (without hard links -- i.e., no -H flag)
         and then the restore script is run to recreate the hard
         links. I recently used this to successfully copy a pool of
         almost 1 million files and a pc tree of about 10 million files.
         See the recent archives to retrieve a copy.
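The idea behind [b] can be sketched as follows -- a minimal,
hypothetical illustration of the technique, not the actual
BackupPC_copyPcPool script: index the pool by inode number in one pass,
then resolve each pc-tree file to its pool counterpart with a single
lstat and dictionary lookup, instead of recomputing file-name md5sums:

```python
import os

def index_pool_by_inode(pool_root):
    """One pass over the pool: map (device, inode) -> pool path."""
    index = {}
    for dirpath, _dirnames, filenames in os.walk(pool_root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            st = os.lstat(path)
            index[(st.st_dev, st.st_ino)] = path
    return index

def list_pc_links(pc_root, index):
    """Yield (pc_path, pool_path) pairs for every pc-tree file that
    is hard-linked into the pool.  After the pool is rsynced without
    -H, the destination recreates each pair with a plain link(2)."""
    for dirpath, _dirnames, filenames in os.walk(pc_root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            st = os.lstat(path)
            pool_path = index.get((st.st_dev, st.st_ino))
            if pool_path is not None:
                yield path, pool_path
```

On the restore side, each pair becomes an os.link(pool_path, pc_path),
so no md5sum ever needs to be computed or checked.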
_______________________________________________
BackupPC-users mailing list
BackupPC-users AT lists.sourceforge DOT net
List: https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki: http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/