Subject: Re: [BackupPC-users] Comments on this backup plan please
From: Chris Robertson <crobertson AT gci DOT net>
To: backuppc-users AT lists.sourceforge DOT net
Date: Tue, 26 Jan 2010 10:00:21 -0900
Tino Schwarze wrote:
> Hi,
>
> On Tue, Jan 26, 2010 at 12:22:45PM -0000, PD Support wrote:
>
>   
>> We are going to be backing up around 30 MS-SQL server databases via ADSL to
>> a number of regional servers running CentOS (about 6 databases per backup
>> server). 10 sites are 'live' as of now and this is how we have started...
>>
>> The backups are between about 800MB and 5GB at the moment and are made as
>> follows:
>>
>> 1) A stored procedure dumps the database to SiteName_DayofWeek.bak eg:
>> SHR_Mon.bak
>>
>> 2) We create a local ZIP copy eg: !BSHR_Mon.zip. The !B means the file is
>> EXCLUDED from backing up and is just kept as a local copy, cycled on a
>> weekly basis.
>>
>> 3) We rename SiteName_DayofWeek.bak (SHR_Mon.bak in the example) to
>> SiteName.bak
>>
>> 4) We split the .bak file into 200MB parts (.part1 .part2 etc.) and these
>> are synced to the backup server via backuppc
>>
>> This gives us a generically-named daily backup that we sync
>> (backupPC/rsyncd) up to the backup server nightly.
>>
>> We split the files so that if there is a comms glitch during the backing up
>> of the large database file and we end up with a part backup, the next
>> triggering of the backup doesn't have to start the large file again - only
>> the missing/incomplete bits.
>>
>> Although the zip files are relatively small, we have found that their
>> contents vary so much at the bit level from one weekly cycle to the
>> next that they take a long time to sync, so we leave them as local
>> copies only.
>>
>> Seems to work OK at the mo anyway!
>>     
>
> You might want to try gzip --rsyncable instead of ZIP and see whether it
> makes a difference. Because of the file splitting etc. I'd add a .md5
> checksum file, just to be sure. Also, there is a tool whose name I
> cannot remember that allows you to split a file and generate an
> additional error-correction file, so you get a bit of redundancy and a
> better chance of reconstructing the archive even if a part is lost.
>   

http://www.quickpar.org.uk/CreatingPAR2Files.htm

I'm sure there is a command line version as well (par2cmdline, if I
remember right).
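
Putting that together with Tino's suggestions, the nightly dump step on
the CentOS side could look roughly like this (a sketch only: the file
names follow the SiteName.bak convention from the original post, the 10%
redundancy figure is arbitrary, and it assumes gzip with the rsyncable
patch, GNU split/md5sum and par2cmdline are all installed):

  # Compress with rsync-friendly block resets, so small changes in the
  # dump don't rewrite the whole compressed archive.
  gzip --rsyncable -c SiteName.bak > SiteName.bak.gz

  # Split into 200MB parts, so an interrupted transfer only resends
  # the missing/incomplete parts.
  split -b 200m -d SiteName.bak.gz SiteName.bak.gz.part

  # Checksum the parts, so corruption is caught on the far side.
  md5sum SiteName.bak.gz.part* > SiteName.bak.gz.md5

  # Generate ~10% PAR2 recovery data, so a lost part can be rebuilt.
  par2 create -r10 SiteName.bak.gz.par2 SiteName.bak.gz.part*

On the backup server, md5sum -c SiteName.bak.gz.md5 verifies the parts,
par2 repair SiteName.bak.gz.par2 rebuilds a damaged or missing one, and
cat SiteName.bak.gz.part* > SiteName.bak.gz reassembles the archive.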

> Disabling compression in BackupPC for these hosts might speed things up
> since the files cannot be compressed anyway.
>
> Tino.
>   
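
Agreed - and since compression is a per-host setting in BackupPC, for
these hosts it should just be a matter of adding something like the
following to the host's override file (pc/HOST.pl; the exact path
depends on the install - an untested sketch):

  $Conf{CompressLevel} = 0;  # parts are already compressed, store as-is

New backups of that host then skip BackupPC's compression pass entirely.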

Chris


_______________________________________________
BackupPC-users mailing list
BackupPC-users AT lists.sourceforge DOT net
List:    https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:    http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/