Subject: Re: [BackupPC-users] Correct rsync parameters for doing incremental transfers of large image-files
From: Tim Fletcher <tim AT night-shade.org DOT uk>
To: "General list for user discussion, questions and support" <backuppc-users AT lists.sourceforge DOT net>
Date: Sat, 12 May 2012 14:28:53 +0100
On 12/05/12 11:57, Andreas Piening wrote:
> Hi Les,
>
> I already thought about that, and I agree that the handling of large 
> image files is problematic in general. I need to make images of the 
> Windows-based virtual machines to get them back up and running when a 
> disaster happens. If I move away from BackupPC for transferring these 
> images, I don't see any benefit (maybe because I just don't know of an 
> imaging solution that solves my problems better).
> As I already use BackupPC to back up the data partitions (all 
> Linux-based), I don't want my backups to become more complex than 
> necessary.
> I can live with the amount of hard disk space the compressed images 
> will consume, and the I/O while merging the files is acceptable for me, too.
> I can tell the imaging software (partimage) to cut the image into 2 GB 
> volumes, but I doubt that this enables effective pooling, since the 
> system volume I image has temporary files, profiles, databases and so 
> on stored on it. If every image file changes (even if only a few 
> megabytes are altered), I expect the rsync algorithm to be less 
> effective than when comparing large files, where it is more likely to 
> find a long "unchanged" stretch that is not interrupted by artificial 
> file-size boundaries resulting from the 2 GB volume splitting.
>
> I hope I made my situation clear.
> If anyone has experience with handling large image files that I might 
> benefit from, please let me know!

The real question is what you are trying to achieve: do you want a 
backup (i.e. a single copy of a recent version of the image file) or an 
archive (i.e. a series of daily or weekly snapshots of the images as 
they change)?

BackupPC is designed to produce archives, mainly of small to medium 
sized files, and it stores the full file rather than changes (aka 
deltas). For large files (multi-gigabyte in your case) that change on 
every backup it is therefore much less efficient: for example, a 40 GB 
image that changes daily adds roughly a full 40 GB (before compression) 
to the pool every day, even if rsync only had to transfer a few 
megabytes of it.
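
A quick way to see the difference between what rsync sends and what 
ends up stored (file and host names here are just examples):

    # rsync's delta algorithm sends only changed blocks over the wire,
    # but the receiver still ends up with a complete copy of the file;
    # that complete copy is what BackupPC has to pool on each backup.
    rsync -av --stats vm-disk.img backuphost:/backups/
    # In the stats, "Literal data" (bytes actually sent) stays small when
    # little changed, while "Total file size" is the full image size.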

To my mind, if you already have BackupPC backing up your data 
partitions, and the issue is that you want to back up the raw disk 
images of your virtual machines' OS disks, the best approach is to 
snapshot them as you have already set up, and then simply rsync that 
snapshot to another host, which will transfer only the deltas between 
the disk images. This leaves you with BackupPC providing an ongoing 
archive of your data partitions, and a simple rsync backup of your root 
disks that at worst means you lose a day's changes in case of a total 
failure.
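
As a rough sketch of that (the LVM volume names and snapshot size are 
made up; adjust them to your setup):

    # snapshot the VM's disk so we copy a consistent, frozen image
    lvcreate --snapshot --size 5G --name vm-snap /dev/vg0/vm-disk

    # rsync won't read a block device directly, so dump it to a file first
    dd if=/dev/vg0/vm-snap of=/backup/vm-disk.img bs=4M

    # --inplace updates the remote copy block by block, so each run only
    # sends and rewrites the parts of the image that actually changed
    rsync -av --inplace /backup/vm-disk.img backuphost:/backups/

    # drop the snapshot again
    lvremove -f /dev/vg0/vm-snap

Over ssh the delta-transfer algorithm is on by default, so there is 
nothing else to enable.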

-- 
Tim Fletcher <tim AT night-shade.org DOT uk>


