Hi,
Jeffrey J. Kosowsky wrote on 2009-04-08 17:58:04 -0400 [Re: [BackupPC-users]
How do the files look like when transferred?]:
> John Rouillard wrote at about 20:55:14 +0000 on Wednesday, April 8, 2009:
> [...]
> > For extra credit a:
> >
> > BackupPC_import -H hostname -s share directory
> >
> > that takes a file tree located at directory and imports that filetree
> > as though it was a backup done for the share "share" on hostname would
> > be great as well. Maybe a Google Summer Of Code idea?
>
> More like Afternoon of Code - I mean it's not very hard.
I was thinking much the same, but I suddenly realized there should be a really
simple way to do this right now without much coding. It's untested, but I will
be needing it myself sometime soon, so I might as well write it down.
1. Let's suppose you've got a host named 'remote', of which you want to back
   up '/some/path/to/share' via 'rsync' (it really shouldn't matter what
   share name and transfer method you're going to use). So you'd have
   remote.pl containing something like:

       $Conf{XferMethod} = 'rsync';
       $Conf{RsyncShareName} = [ '/some/path/to/share' ];
       $Conf{RsyncClientCmd} = '$sshPath -q -x $host /usr/bin/sudo $rsyncPath+ $argList+';
       $Conf{BackupFilesExclude} = {
           '/some/path/to/share' => [ '/.Trash', '/.thumbnails' ],
       };

   Yes, "$rsyncPath+" is being pedantic ;-).
2. You've got a local copy at '/mnt/cdrom/import' on the BackupPC server,
   which you want to import (meaning that remote:/some/path/to/share/dir/file
   appears at /mnt/cdrom/import/dir/file). Let's try adding this to remote.pl:

       $Conf{XferMethod} = 'tar';
       $Conf{TarShareName} = [ '/some/path/to/share' ];
       $Conf{TarClientCmd} = '/usr/bin/sudo $tarPath -c -v -f - -C /mnt/cdrom/import --totals';
       $Conf{TarFullArgs} = '$fileList';
       $Conf{PingCmd} = '&{sub {0}}';
       $Conf{BackupFilesExclude} = {
           '/some/path/to/share' => [ '/.Trash', '/.thumbnails' ],
       };
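
   The odd-looking PingCmd deserves a word: as I read BackupPC's command
   handling, a command string beginning with '&' is eval'd as Perl rather
   than executed as a program, so '&{sub {0}}' just calls an anonymous sub
   that returns 0 and the "ping" always succeeds with a zero round-trip
   time. You can convince yourself of the Perl part like this (treat the
   eval-on-'&' behavior itself as my assumption):

   ```shell
   # Evaluate the PingCmd string the way BackupPC (by my reading) would:
   # a command starting with '&' is eval'd as Perl, not exec'd.
   perl -e 'my $status = eval q{&{sub {0}}}; print "ping status = $status\n";'
   ```

   A reported ping time of 0 also keeps the host safely below PingMaxMsec.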
Note that TarClientCmd makes use of neither $host nor $shareName - nobody
ever said it had to. Note also that BackupFilesExclude is identical. So you
basically just add the tar configuration and the PingCmd and temporarily
change the XferMethod to tar. Then run the (full!) backup. Then change
XferMethod back, remove the PingCmd, and all should be set for remote
backups.
(Actually, the PingCmd will need to either work for the remote host or be
disabled anyway. It's just that the remote host doesn't need to be currently
available for the local backup to be attempted.)
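
Spelled out as shell steps, the temporary swap could look like this (the
file location and the sed toggle are purely illustrative; on a real server
you'd edit the per-host config in place and trigger the full backup from
the web interface or with BackupPC_dump -f):

```shell
# Stand-in for the per-host config file; on a real server this would be
# something like /etc/backuppc/remote.pl (path is an assumption).
cat > remote.pl <<'EOF'
$Conf{XferMethod} = 'rsync';
EOF

# 1. temporarily switch the transfer method to tar for the import
sed -i "s/'rsync'/'tar'/" remote.pl

# 2. ...run the (full!) backup here...

# 3. switch back, so normal remote backups resume
sed -i "s/'tar'/'rsync'/" remote.pl
```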
This only works (well, should work, I think) for backups containing a single
share though. For more than one share, you'd need a wrapper script along the
lines of ...
    #!/bin/sh
    # note "$@" rather than $*, so file names containing spaces survive intact
    share=$1; shift
    case $share in
        '/share/name/1') sudo /bin/tar -c -v -f - -C /local/first/share --totals "$@"
            ;;
        '/share/name/2') sudo /bin/tar -c -v -f - -C /somewhere/else --totals "$@"
            ;;
        '/share/name/3') sudo /bin/tar -c -v -f - -C /and/so/on --totals "$@"
            ;;
        *) echo "Oops, unknown share $share !?" >&2
            exit 1
            ;;
    esac

... and

    $Conf{TarClientCmd} = '/path/to/script $shareName';
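
Before pointing TarClientCmd at such a wrapper, it's easy to sanity-check
the dispatch logic with throwaway directories (everything below is
illustrative; the real script would keep the sudo calls and the real
share paths):

```shell
# build a throwaway "share" to dump
base=$(mktemp -d)
mkdir -p "$base/first/share"
echo hello > "$base/first/share/file1"

# simplified stand-in for the wrapper above (no sudo needed for a test)
dump_share() {
    share=$1; shift
    case $share in
        '/share/name/1') tar -c -f - -C "$base/first/share" "$@" ;;
        *) echo "Oops, unknown share $share !?" >&2; return 1 ;;
    esac
}

# listing the resulting archive should show file1
dump_share '/share/name/1' . | tar -t -f -
```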
Of course, if your local copy is on 'localcopyhost' instead of the BackupPC
server itself, you can always add a '$sshPath -q -x localcopyhost' as needed.
You could even use a different host for each share if you need to.
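
For instance (hostname and script path are placeholders), the TarClientCmd
for a copy sitting on another machine might read:

```perl
$Conf{TarClientCmd} = '$sshPath -q -x localcopyhost /path/to/script $shareName';
```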
Why didn't I think of this before? It seems rather obvious. But I'd like to
know if it works just the same. If you don't test it, I will soon ;-).
Regards,
Holger
_______________________________________________
BackupPC-users mailing list
BackupPC-users AT lists.sourceforge DOT net
List: https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki: http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/