Re: [BackupPC-users] Splitting up large directories.

From: Robin Lee Powell <rlpowell AT digitalkingdom DOT org>
To: "General list for user discussion, questions and support" <backuppc-users AT lists.sourceforge DOT net>
Date: Wed, 19 May 2010 11:14:48 -0700
On Tue, May 18, 2010 at 09:30:43PM +0000, John Rouillard wrote:
> On Tue, May 18, 2010 at 02:04:46PM -0700, Robin Lee Powell wrote:
> > A customer we're backing up has a directory with ~500 subdirs
> > and hundreds of GiB of data.  We're using BackupPC in rsync+ssh
> > mode.
> > 
> > As a first pass at breaking that up, I made a bunch of separate
> > host entries like /A/*0, /A/*1, ... (all the dirs have numeric
> > names).
> > 
> > That seems to select the right files, but it doesn't work
> > because BackupPC ends up running a bunch of them at once,
> > hammering that customer's machine.
> 
> You can set it up so only a few of those hosts will run at the
> same time using
> 
>   $Conf{UserCmdCheckStatus} = 1;
> 
> and a $Conf{DumpPreUserCmd}/$Conf{DumpPostUserCmd} that know the
> host names and implement a counting semaphore to make sure only
> some number of them are running at the same time. I posted a
> longer sketch of how I limit the number of parallel backups to
> remote sites in the archives some time ago.
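[The counting-semaphore approach John describes above could be sketched roughly as follows. This is an illustration, not his posted script: the script name, lock directory, slot count, and helper names are all assumptions. BackupPC substitutes $host into the pre/post commands, and with $Conf{UserCmdCheckStatus} = 1 a non-zero exit from the pre-command aborts that dump so BackupPC retries it later.]

```shell
# Counting semaphore built from lock directories: at most MAX_SLOTS
# dumps against this customer run at once. All paths/names are
# illustrative. Hooked up from config.pl roughly like:
#   $Conf{DumpPreUserCmd}  = '/usr/local/bin/slots.sh acquire $host';
#   $Conf{DumpPostUserCmd} = '/usr/local/bin/slots.sh release $host';
#   $Conf{UserCmdCheckStatus} = 1;

MAX_SLOTS=3                               # concurrency limit (assumed)
LOCKDIR=${LOCKDIR:-/tmp/backuppc-slots}   # one semaphore dir per customer

# Take the first free slot, or fail (non-zero exit) if all are busy.
acquire_slot() {
    host=$1
    mkdir -p "$LOCKDIR"
    i=1
    while [ "$i" -le "$MAX_SLOTS" ]; do
        # mkdir is atomic, so it serves as a test-and-set primitive
        if mkdir "$LOCKDIR/slot$i" 2>/dev/null; then
            echo "$host" > "$LOCKDIR/slot$i/owner"
            return 0
        fi
        i=$((i + 1))
    done
    return 1                              # all slots busy; dump is skipped
}

# Free whichever slot this host holds (called from the post-command).
release_slot() {
    host=$1
    for d in "$LOCKDIR"/slot*; do
        [ -d "$d" ] || continue
        if [ "$(cat "$d/owner" 2>/dev/null)" = "$host" ]; then
            rm -rf "$d"
            return 0
        fi
    done
    return 0                              # nothing held; stay quiet
}
```

[Using lock *directories* rather than lock files avoids a race: mkdir either creates the directory or fails, in one atomic step, so two pre-commands can't both grab the same slot.]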

That's a fantastic idea!  I don't even need to do anything
complicated; just use "lockfile /tmp/backuppc" or something like
that (OSLT), since I only care about not overloading single hosts.
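[Robin's simpler variant, a single mutex per customer so the split-up "hosts" run strictly one at a time, might look like the following config.pl fragment. The lock path and retry flag are assumptions; `lockfile` ships with procmail, and with -r 0 it fails immediately instead of blocking, so a busy dump is skipped and retried on the next wakeup rather than tying up a BackupPC child.]

```perl
# One customer-wide lock instead of a counting semaphore (illustrative):
$Conf{DumpPreUserCmd}     = '/usr/bin/lockfile -r 0 /tmp/backuppc-custA.lock';
$Conf{DumpPostUserCmd}    = '/bin/rm -f /tmp/backuppc-custA.lock';
$Conf{UserCmdCheckStatus} = 1;   # non-zero exit from the pre-command skips the dump
```

[One caveat with any pre/post lock scheme: if a dump dies without running the post-command, the stale lock has to be cleaned up by hand or by a cron job.]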

Thanks!

-Robin

------------------------------------------------------------------------------

_______________________________________________
BackupPC-users mailing list
BackupPC-users AT lists.sourceforge DOT net
List:    https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:    http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/