Subject: Re: [BackupPC-users] Correct rsync parameters for doing incremental transfers of large image-files
From: Andreas Piening <andreas.piening AT gmail DOT com>
To: "General list for user discussion, questions and support" <backuppc-users AT lists.sourceforge DOT net>
Date: Sun, 13 May 2012 23:04:21 +0200
Hi Les,

Excuse my sloppy phrasing: real-time recovery is not what I'm looking for. I meant that I want to be able to get the system back into a workable state simply by downloading an image and restoring it. If I have to assemble the system disk manually from a file-based backup and fiddle around with a non-bootable, hard-to-debug Windows server, that's not a solution for me. Repairing a broken Windows installation is an extremely time-consuming pain compared to reacting to some "file not found" or "wrong path" error message from a Linux OS, which leads you straight to the problem.
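
To make it concrete, the workflow I have in mind is roughly this (host
name, device and paths are only examples, not my real setup):

    # pull last night's image, updating only the changed blocks in place
    rsync -av --inplace backupserver:/images/winsrv.partimg.gz /restore/
    # write the image back to the system disk
    partimage restore /dev/sda1 /restore/winsrv.partimg.gz

As far as I understand, --inplace lets rsync rewrite the existing
destination file block by block instead of building a full copy. One
caveat I am aware of: a compressed image largely defeats rsync's delta
algorithm, because a small change on the disk changes the compressed
stream from that point on; if I read the partimage man page right,
saving with -z0 (no compression) keeps the image rsync-friendly.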

I understand your concept of restoring a (not necessarily up-to-date) image and doing a file-based restore on top, but I have never tried this and don't feel comfortable with it yet.
Can you tell me more concretely which Windows version you did this with, and whether you used the "normal" NTFS file-system driver or ntfs-3g for the file-based "overlay" restore?
How often have you done this successfully? More to the point: have you ever had problems with file permissions, lost attributes, or anything else? And did you need additional steps to get the system drive bootable again?
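
For what it's worth, my naive picture of the overlay step looks like
this (device name and mount point are only placeholders):

    # after writing the base image back, mount the NTFS system partition
    ntfs-3g /dev/sda1 /mnt/windows
    # then unpack the newer files from the file-based backup over it

Is that roughly the procedure you are describing?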

Thank you very much,

Andreas

On 12.05.2012, at 16:57, Les Mikesell wrote:

On Sat, May 12, 2012 at 9:17 AM, Andreas Piening
<andreas.piening AT gmail DOT com> wrote:
> I want a backup that gives me the opportunity to get the server back up and running within a few minutes + download time of the image + restore time from partimage.
> It is OK to lose the files created since last night's backup run; I see this more as a "life insurance". The documents are already backed up with daily and weekly backup sets via BackupPC.

If you need close to real-time recovery, you need some sort of live
clustering with failover; just copying the images around will take
hours.  For the less critical things, I normally keep a 'base' image
per system type (a file for a VM, or a Clonezilla image for hardware)
that can be used as a starting point, but I don't update those very
often.  BackupPC just backs up the live files normally for everything,
so for the small (and more common) problems you can grab individual
files easily, and for a disaster you would start with the nearest
image master and do a full restore on top of it, more or less the
same as with hardware.
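
The file-level overlay can be scripted with the tools that ship with
BackupPC; a minimal sketch, assuming the host is called winsrv, the
share is C, and the restored image is mounted at /mnt/windows (all
example names):

    # stream the most recent backup (-n -1) as a tar archive and
    # unpack it on top of the mounted image
    BackupPC_tarCreate -h winsrv -n -1 -s C / | tar -xf - -C /mnt/windows

You can do the same thing interactively from the web interface.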

-- 
  Les Mikesell
    lesmikesell AT gmail DOT com
