BackupPC-users

Subject: [BackupPC-users] Large file stalls backup
From: David Birnbaum <davidb AT pins DOT net>
To: backuppc-users AT lists.sourceforge DOT net
Date: Wed, 9 Apr 2008 09:20:13 -0400 (EDT)
Greetings,

I've been using BackupPC for several years now, but one problem I've never 
found a good answer for is a single large file that is too big to transfer 
completely before the backup times out.  For example, a 10M local datafile 
backed up over a 768k upstream DSL link ends up stalling the backup, because 
the run can never get past that file.

Does anyone have a workaround or fix for this?  Is it possible to change 
BackupPC so it doesn't discard the in-progress file, but instead copies the 
partial transfer into the pool so rsync can pick up where it left off last 
time?  There doesn't seem to be any downside to resuming the transfer where 
it stopped.
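For what it's worth, one partial workaround I can imagine (this is a sketch, 
not something I've verified -- whether BackupPC's File::RsyncP layer actually 
honors these rsync flags is an assumption, and the host name is hypothetical) 
would be to raise the per-client timeout and ask rsync to keep partial files, 
in config.pl or a per-host override:

```
# Hypothetical excerpt from config.pl (or a per-host .pl override).

# BackupPC-side stall timeout, in seconds; raise it so one slow
# file does not abort the whole run.
$Conf{ClientTimeout} = 72000;

# --partial asks rsync to keep partially transferred files so a
# later run can resume them; --timeout is rsync's own I/O timeout.
# ASSUMPTION: File::RsyncP may not support every rsync option --
# check against your BackupPC version before relying on this.
$Conf{RsyncArgs} = [
    '--numeric-ids', '--perms', '--owner', '--group', '-D',
    '--links', '--hard-links', '--times', '--block-size=2048',
    '--recursive',
    '--partial', '--timeout=600',
];
```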

There is one other problem related to this (and big backups in general): 
sometimes there is enough delay that one of the rsync connections times out 
because the firewall sees no activity on the socket.  Is there a way to force 
some sort of traffic (a bogus keepalive) on whichever socket is idle, to make 
sure it isn't prematurely severed?
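Assuming the transfers run over ssh (as in the usual rsync-over-ssh setup), 
one common way to keep an idle connection alive through a stateful firewall 
is ssh's own application-level keepalive, set for the BackupPC user on the 
server -- e.g. in ~/.ssh/config (the host name below is a placeholder):

```
# Hypothetical ~/.ssh/config entry for the backuppc user.
# ServerAliveInterval sends an encrypted keepalive probe after
# 60 seconds of inactivity; ServerAliveCountMax allows 10 missed
# replies before ssh gives up on the connection.
Host client.example.com
    ServerAliveInterval 60
    ServerAliveCountMax 10
```

This only keeps the ssh control connection busy, though; I don't know whether 
it helps if the firewall is tracking a separate idle data socket.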

Any advice would be appreciated.

David.

_______________________________________________
BackupPC-users mailing list
BackupPC-users AT lists.sourceforge DOT net
List:    https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:    http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/