Subject: Re: [BackupPC-users] Backup only new file(s)
From: "Jeffrey J. Kosowsky" <backuppc AT kosowsky DOT org>
To: "General list for user discussion, questions and support" <backuppc-users AT lists.sourceforge DOT net>
Date: Thu, 11 Jun 2009 13:30:04 -0400
Mirco Piccin wrote at about 19:13:42 +0200 on Thursday, June 11, 2009:
 > Hi, thanks for the reply.
 > 
 > >  > > Every day (except Sunday) a procedure stores in this folder a 120GB
 > >  > > file.
 > >  > > The name of the file is the day name.
 > >  > >
 > >  > > So, in a week, I have 6 different files generated (about 720 GB).
 > >  > > Every week the files are overwritten by the procedure.
 > >  > >
 > >  > > I'd like to back up only the newest file, and not the whole folder.
 > >  > > The problem is that I suppose I must have a full backup of the folder
 > >  > > (720 GB), because $Conf{FullKeepCnt} must be >= 1, plus
 > >  > > incremental backups.
 > ...
 > >  > > and so on, for a total of 1440 GB (twice the disk space
 > >  > > effectively needed).
 > ...
 > > Couldn't you just do daily full backups (with no incrementals) while
 > > setting $Conf{FullKeepCnt}=1? Then as long as you made sure that
 > > BackupPC_nightly didn't run in the middle, you would effectively just
 > > be adding one new backup to the pool each day and later when
 > > BackupPC_nightly runs you would be erasing the entry from 8 days
 > > earlier, so you would never have more than 720+120=840 GB in the
 > > pool. Now this wouldn't be particularly bandwidth efficient, since you
 > > are always doing fulls rather than incrementals, but it would work...
 > >
 > > However, if you really are only trying to backup a single new 120GB
 > > file every day, I wonder whether you might be better off just using a
 > > daily 'rsync' cron job. It seems like that would be simpler, more
 > > reliable, and more efficient.
 > >
 > > Also, is each daily file completely distinct from the previous one, or
 > > is it just incrementally changed? Because if it is just incrementally
 > > changed, you may want to first rsync against the previous day's backup
 > > to reduce network bandwidth.
 > 
 > My BackupPC server runs on a VIA processor; its maximum throughput is
 > less than 5 MB/s :-(
 > So backing up 840 GB each time is not the best solution ...
 > (this is the reason I did not configure the backup as you suggested)
 > 
 > Anyway, each daily file is quite similar to the previous one, so rsync
 > (or a custom script) should be the better way to do the job.
 > 
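
For completeness, the daily-full idea above would only take a couple of lines
in the host's config.pl. A rough, untested sketch (the WakeupSchedule remark
is just the usual way of keeping BackupPC_nightly out of the backup window):

    $Conf{FullPeriod}  = 0.97;   # just under one day, so a full is attempted daily
    $Conf{FullKeepCnt} = 1;      # keep only the newest full; as argued above, the
                                 # older data is reclaimed when BackupPC_nightly runs
    # BackupPC_nightly starts at the first hour listed in $Conf{WakeupSchedule},
    # so make sure that hour falls outside the backup window.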

Rsync seems like what you want, especially since there is so much file
similarity. 
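
Something along these lines is the kind of daily cron job I have in mind
(completely untested; the host name, the paths, and the exact day-name file
naming are placeholders to adapt to your setup):

    #!/bin/sh
    # Untested sketch -- host name, paths and day-name naming are placeholders.
    SRC=fileserver:/data/dumps       # where the procedure writes the daily 120GB file
    DST=/var/backups/dumps           # destination on the backup box

    TODAY=$(date +%A)                    # assumes the file is named after the weekday
    YESTERDAY=$(date -d yesterday +%A)   # GNU date syntax

    # Seed today's target from yesterday's copy (a cheap local copy), so that
    # rsync's delta algorithm only has to send the blocks that actually changed.
    [ -e "$DST/$TODAY" ] || cp "$DST/$YESTERDAY" "$DST/$TODAY" 2>/dev/null

    # --inplace avoids writing a second 120GB temporary file on the destination;
    # --partial lets an interrupted transfer resume instead of starting over.
    rsync -v --inplace --partial "$SRC/$TODAY" "$DST/$TODAY"

Run it from cron every day except Sunday (e.g. "30 2 * * 1-6
/usr/local/bin/dump-rsync.sh") and only the changed blocks of the 120GB file
should have to cross the wire.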
