Subject: Re: [BackupPC-users] rsync never starts transferring files (but does something)
From: Markus <universe AT truemetal DOT org>
To: "General list for user discussion, questions and support" <backuppc-users AT lists.sourceforge DOT net>
Date: Wed, 05 Dec 2012 20:06:01 +0100
On 05.12.2012 17:47, Les Mikesell wrote:
> On Wed, Dec 5, 2012 at 8:12 AM, Timothy J Massey <tmassey AT obscorp DOT com> 
> wrote:
>>
>> Wow.  25 *million* files saved in home directories?  That kind of defeats 
>> the purpose of shared data!  I thought my users were bad about that...  :)
>
> Probably mostly browser-cache files that don't need to be backed up...
> Or checked-out workspaces from the same VCS.

Actually, this is a dedicated server on the internet and the customer 
hosts a couple of thousand customers' websites on it; most of these 
files are generated by a statistics/SEO package. All of the files are 
relevant (unfortunately).

BTW, the backup ran for 9.8 days but didn't finish properly for some 
reason. In that client's log for the day it got interrupted I just have 
hundreds of "Botch on admin job" messages.

I changed ClientTimeout to 2880000 just in case and am waiting for the 
next backup to get started.
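
For reference, that's just the usual per-client override, i.e. 
something like this in the host's pc config file (the exact path 
depends on your install):

    $Conf{ClientTimeout} = 2880000;    # seconds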

I've got a partial backup with 15 million files / 1.3 TB now (out of 
about 20 million in total; I managed to reduce the count a bit).

Afterwards, BackupPC_link ran but never finished according to the logs. 
I've got a "Running BackupPC_link" log entry for that client but no 
"Finished" entry. However, there is a BackupPC_link command queued for 
that client, so I guess it's trying again.

BTW, Les, about the previous message:

"'defunct' processes normally have exited and cleaned up their
resources except that there is an entry left in the process table
until the parent process wait()s to collect the status.  Are you sure
it wasn't some still-running process consuming the memory?"

For every share, one defunct process (two, actually: rsync + ssh) is 
created once the backup of that share is done, while the "overall" 
backup is still running, i.e. while the remaining shares are still 
being backed up.
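
Just to illustrate what you describe, here is a minimal standalone 
sketch (plain Perl, nothing BackupPC-specific): the child exits right 
away but stays <defunct> until the parent reaps it.

    #!/usr/bin/perl
    # Child exits immediately, but remains <defunct> until the parent waitpid()s.
    my $pid = fork();
    exit 0 if $pid == 0;                     # child: exit right away
    sleep 2;
    system("ps -o pid,stat,args -p $pid");   # STAT column shows 'Z' (zombie)
    waitpid($pid, 0);                        # reaping removes the entry

So presumably the dump process only reaps its per-share rsync/ssh 
children at the end of the run, which would explain why they pile up 
while the remaining shares are still being backed up.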

I've tried it again and got the same effect. I guess BackupPC just 
isn't designed to back up hundreds or even thousands of shares of a 
single client in a single backup run. But I still believe that if the 
defunct processes weren't created, my script approach (1 directory on 
the client = 1 share in the client's config) would work, and possibly 
much faster.

I think you can re-create what I describe by backing up a client with 
several shares and then just watching 'ps aux' once the first share has 
been backed up. You will see new defunct processes appear for every 
share until the whole backup finishes.
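
To watch them pile up I just count the zombies every now and then, 
e.g. with something like:

    ps -eo stat | grep -c '^Z'

The count should go up by two (rsync + ssh) for every share that 
finishes.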

Unfortunately I don't know how to confirm that these defunct processes 
still consume CPU or memory, other than the fact that the whole backup 
crashes with the aforementioned out-of-memory message once a few 
hundred shares have been backed up. 'ps' is telling me that the defunct 
processes are still using CPU, though; memory is just at "0". So I 
don't know if 'ps' is trustworthy there.
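
Next time I'll also look at the parent dump process itself, e.g. with 
something like:

    ps -o pid,stat,rss,vsz,args -C BackupPC_dump

since, if I understand your explanation right, a zombie is really just 
a process-table entry, so whatever memory gets eaten should show up on 
a process that is still alive.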

Thanks!
Markus

