BackupPC-users

Re: [BackupPC-users] HowTo backup __TOPDIR__?

From: "Pedro M. S. Oliveira" <pmsoliveira AT gmail DOT com>
To: "General list for user discussion, questions and support" <backuppc-users AT lists.sourceforge DOT net>
Date: Thu, 6 Aug 2009 21:30:57 +0100
While reading Linux Journal I found the tip below... maybe it will do what you want:

You can use the dd and nc commands for exact disk mirroring from one server to another. The following commands send data from Server1 to Server2:

Server2# nc -l 12345 | dd of=/dev/sdb
Server1# dd if=/dev/sda | nc server2 12345

Make sure that you issue Server2's command first so that it's listening on port 12345 when Server1 starts sending its data.

Unless you're sure that the disk is not being modified, it's better to boot Server1 from a RescueCD or LiveCD to do the copy.
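For slow links, the same pipeline can be compressed in flight. This is a sketch, not from the article; server2, the device names, and port 12345 are the same placeholders as in the tip above. The runnable part at the bottom exercises the identical pipeline on an ordinary file instead of a raw disk:

```shell
# Compressed variant of the tip above (sketch; server2, /dev/sda, /dev/sdb
# and port 12345 are placeholders, adjust to your setup):
#   Server2# nc -l 12345 | gunzip -c | dd of=/dev/sdb bs=1M
#   Server1# dd if=/dev/sda bs=1M | gzip -c | nc server2 12345

# Harmless demo of the same gzip | gunzip | dd pipeline on a regular file:
printf 'backuppc pool data' > /tmp/src.img
gzip -c /tmp/src.img | gunzip -c | dd of=/tmp/dst.img 2>/dev/null
cmp -s /tmp/src.img /tmp/dst.img && echo "images identical"
```

Compression costs CPU on both ends, so it only pays off when the network, not the disk, is the bottleneck.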

source:

http://www.linuxjournal.com/content/tech-tip-remote-mirroring-using-nc-and-dd
http://www.linux-geex.com
__________________________


On Thu, Aug 6, 2009 at 9:19 PM, Matthias Meyer <matthias.meyer AT gmx DOT li> wrote:
Thomas Birnthaler wrote:

>> What is the best way to "synchronize" __TOPDIR__ to another location?
>> As I found in many messages, rsync isn't possible because of
>> expensive memory usage for the hardlinks.
> Since version 3.0.0 (protocol 3 on both ends) rsync uses an
> "incremental" mode to generate and compare the file lists on both sides.
> So memory usage decreased a lot, because only a small part of the list
> is in memory at any time. But the massive hardlink usage of BackupPC
> makes copying the whole structure very slow, because link creation
> on any filesystem seems to be a very expensive operation (locks?) ...
>

So it seems possible to make the initial remote backup with dd and then do
a daily rsync?
Because on a daily basis there are (hopefully) not tons of new hardlinks
that have to be created on the remote side.
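That scheme could look roughly like this. It is only a sketch: the TOPDIR path, remote host, and device names are assumptions, not from this thread. The runnable part at the end just demonstrates the hardlink property that rsync's -H option has to preserve:

```shell
# Hypothetical layout; adjust TOPDIR, host and devices to your installation.
TOPDIR=/var/lib/backuppc
REMOTE=backup2:/var/lib/backuppc

# One-time seed: block-copy the pool filesystem while BackupPC is stopped
# (or from a rescue CD, as suggested earlier in the thread):
#   dd if=/dev/sdX bs=1M | ssh backup2 'dd of=/dev/sdY bs=1M'

# Daily incremental: rsync >= 3.0 with -H keeps the pool hardlinks and,
# with protocol 3 on both ends, builds the file list incrementally:
#   rsync -aH --delete "$TOPDIR/" "$REMOTE/"

# Demo: a pool-style hardlink shares one inode (link count 2),
# which is exactly what -H must recreate on the remote side.
tmpd=$(mktemp -d)
echo data > "$tmpd/pool_file"
ln "$tmpd/pool_file" "$tmpd/pc_file"
stat -c '%h' "$tmpd/pc_file"    # prints 2
```

Note that -H still makes rsync track every hardlink it sees, so the daily run is cheap only as long as the number of new links per day stays moderate.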

>> In my opinion dd or cp -a isn't possible either because they would copy
>> all the data. That would consume too much time if I synchronize the
>> locations on a daily basis.
> Any other tool has the same time consumption if it keeps hardlinks
> ("cp" e.g. does that with option "-l").
>
> A somehow "lazy" solution would be to just copy the "pool"-Files (hashes
> as file names) by "rsync" and create a "tar" archive of the "pc"
> directory.

I would think that creating the tar archive and copying it to the other
location would consume nearly the same time, space, and bandwidth as dd or
cp. Wouldn't it?

> The time consuming process of link creation is then deferred
> to the restore case (which may never be needed).
>
> Thomas
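The "lazy" scheme Thomas describes could be sketched like this (the TOPDIR path is an assumption; the pool directory is pool/ or cpool/ depending on whether compression is enabled). The runnable part shows that tar records a hardlink as a link entry and recreates it on extraction, so the expensive link creation is deferred to restore time:

```shell
# Sketch of the "lazy" scheme (assumed paths, adjust to your installation):
#   rsync -a /var/lib/backuppc/cpool/ backup2:/var/lib/backuppc/cpool/
#   tar -C /var/lib/backuppc -cf /tmp/pc.tar pc

# Demo: tar stores the second name of a hardlinked file as a link entry
# and recreates the link on extraction.
tmpd=$(mktemp -d)
mkdir "$tmpd/pc" "$tmpd/out"
echo x > "$tmpd/pc/a"
ln "$tmpd/pc/a" "$tmpd/pc/b"
tar -C "$tmpd" -cf "$tmpd/pc.tar" pc
tar -C "$tmpd/out" -xf "$tmpd/pc.tar"
stat -c '%h' "$tmpd/out/pc/b"   # prints 2
```

This also answers the time question above in part: the tar archive stores each hardlinked file's data only once, so it should be far smaller than a dd image of the whole filesystem.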

br
Matthias
--
Don't Panic


------------------------------------------------------------------------------
Let Crystal Reports handle the reporting - Free Crystal Reports 2008 30-Day
trial. Simplify your report design, integration and deployment - and focus on
what you do best, core application coding. Discover what's new with
Crystal Reports now.  http://p.sf.net/sfu/bobj-july
_______________________________________________
BackupPC-users mailing list
BackupPC-users AT lists.sourceforge DOT net
List:    https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:    http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/
