Re: [BackupPC-users] Out of memory, backuppc_dump with rsyncd
2014-05-07 09:07:41
On Wed, May 7, 2014 at 1:22 AM, Philippe MALADJIAN
<pmaladjian AT hilaire DOT fr> wrote:
>>>>
>>> My backuppc server has 6 GB of RAM, and I have 118,751 files (9.1 GB
>>> of data) to save.
>>>
>> Rsync will send the entire directory tree to the server before both
>> systems walk the lists for the comparison, so it does take some
>> memory, but your numbers look reasonable. Just guessing - could the
>> target system have some other filesystems mounted or perhaps have
>> filesystem corruption or directory links that loop infinitely?
>>
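(For scale, a back-of-envelope sketch of that file-list memory cost -- the
~100 bytes per entry used here is an assumption, not a measured figure;
actual usage varies with rsync version and path lengths:)

```python
# Rough estimate of rsync's in-memory file-list size for this backup.
# bytes_per_entry is an assumed average, not a measured value.
num_files = 118_751
bytes_per_entry = 100
est_mb = num_files * bytes_per_entry / (1024 * 1024)
print(f"~{est_mb:.0f} MB for the file list")  # -> ~11 MB
```

So on its face, ~119k files should only need tens of megabytes, nowhere
near 6 GB -- which is why a much larger hidden tree is worth suspecting.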
> My structure is :
>
> +-------------+                  +-----------------+
> |  backuppc   | <--- rsyncd ---- |  saved server   |
> +-------------+                  +-----------------+
>        |                                  |
>    NFS (ext3)                         NFS (ext3)
>        |                                  |
> +------------+     +-------------------------------------+
> |  /backup   |     |       /www/dev/ on NetApp NAS       |
> +------------+     +-------------------------------------+
>
> I do not think it is a loop, because the timeout occurs randomly and
> never on the same file or folder.
Is BackupPC walking into the .snapshot directories on the NetApp?
Try a 'find . | wc -l' on the path rsyncd exports to get some idea of
the number of directory entries it sees and how long it takes.
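To see why those hidden trees matter, here is a small demonstration on a
throwaway directory (all paths below are invented for illustration, not
your actual export). NetApp .snapshot directories hold read-only copies of
every file, so a walk that descends into them can see many times the
expected entry count:

```shell
# Build a tiny tree that mimics a NetApp export with one snapshot.
mkdir -p /tmp/demo/www/dev/.snapshot/hourly.0
touch /tmp/demo/www/dev/index.html \
      /tmp/demo/www/dev/.snapshot/hourly.0/index.html

# Without pruning, find walks into .snapshot and counts the copies too:
find /tmp/demo/www/dev | wc -l
# -> 5

# Pruning .snapshot gives the count the backup should actually see:
find /tmp/demo/www/dev -name .snapshot -prune -o -print | wc -l
# -> 2
```

If the counts differ wildly on the real export, an exclude for .snapshot
in the rsyncd module (or BackupPC's BackupFilesExclude) should keep the
file list down to the real data.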
--
Les Mikesell
lesmikesell AT gmail DOT com
_______________________________________________
BackupPC-users mailing list
BackupPC-users AT lists.sourceforge DOT net
List: https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki: http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/