On Wed, Jun 18, 2008 at 11:36 AM, mark k <mkopenski AT gmail DOT com> wrote:
> I have set up a backup job that way and split it into 15 individual
> jobs, but each job still covers a lot of files.
>
> I am going to try sending a SIGSTOP and a SIGCONT to the rsync job
> on the client and see if it works. If it does, I don't think it
> would be very hard to incorporate into the BackupPC interface.
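>
> Something like this rough, untested sketch is what I have in mind for
> the client side (the pgrep match and the sleep length are just
> placeholders for whatever the real pause window would be):
>
>     #!/usr/bin/perl
>     # pause-rsync.pl - suspend the running rsync, resume it later.
>     use strict;
>     use warnings;
>
>     # Find the rsync process(es) started for the backup; adjust the
>     # match for your setup.
>     chomp(my @pids = `pgrep -x rsync`);
>     die "no rsync process found\n" unless @pids;
>
>     # SIGSTOP freezes the processes but leaves the TCP/ssh connection
>     # open, so this only works if nothing times out in the meantime.
>     kill 'STOP', @pids or die "SIGSTOP failed: $!\n";
>     print "paused rsync: @pids\n";
>
>     sleep 8 * 3600;    # e.g. wait out the business day
>
>     # SIGCONT lets the processes pick up exactly where they left off.
>     kill 'CONT', @pids or die "SIGCONT failed: $!\n";
>     print "resumed rsync: @pids\n";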
>
>
>
> On Wed, Jun 18, 2008 at 11:26 AM, Les Mikesell <lesmikesell AT gmail DOT com> wrote:
>> mark k wrote:
>>>
>>> Wondering if anyone has found a way to pause or suspend a backup
>>> job instead of stopping it.
>>>
>>> I am backing up several servers with large LUNs, 750 GB to 2 TB in
>>> size, containing millions of tiny files.
>>>
>>> So every time a new rsync job kicks off it has to rebuild the file
>>> list, and the only time a backup completes is on the weekend, when
>>> the jobs can run 24/7; on weekdays they can only run about 8 hours
>>> a day.
>>>
>>> Is there a way to pause the rsync on the client and then resume it
>>> later, so that it will at least get a base full done over a couple
>>> of nights? Incrementals should work normally after that.
>>
>> I don't think this is possible, but you might want to look at the
>> file distribution on the target. If you could split this into some
>> number of directories that are backed up separately, plus perhaps a
>> catch-all run that excludes the ones backed up individually, it
>> might go a lot faster.
>>
>> Also, it might help a lot to add RAM to the server, or to run fewer
>> concurrent jobs, if it is swapping due to the size of the directory.
>> If you have a current version of BackupPC, it should save partial
>> full runs and accumulate the parts until a full completes. However,
>> even incrementals have to transfer the entire directory structure
>> before starting, so that may continue to be a problem.
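>>
>> Something like this in the per-host config file, for example. This
>> is just a sketch (BackupPC host configs are Perl, and the paths here
>> are invented stand-ins for your real layout):
>>
>>     # pc/bigserver.pl - hypothetical per-host config.
>>     # Back up the two biggest trees as separate shares, then a
>>     # catch-all share that excludes them.
>>     $Conf{XferMethod} = 'rsync';
>>     $Conf{RsyncShareName} = [ '/data/maildir', '/data/images', '/data' ];
>>
>>     # BackupFilesExclude is keyed by share name; the paths are
>>     # relative to that share.
>>     $Conf{BackupFilesExclude} = {
>>         '/data' => [ '/maildir', '/images' ],
>>     };
>>
>> Each share then builds its own, much smaller file list, and an
>> interrupted full has less to redo on the next attempt.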
>>
>> --
>> Les Mikesell
>> lesmikesell AT gmail DOT com
>>
>>
>>
>
>
>
> --
> Walt Disney - "I love Mickey Mouse more than any woman I have ever known."
>