Subject: [BackupPC-users] Rsync Xfer with large file system
From: Laverne Schrock <schr1230 AT morris.umn DOT edu>
To: backuppc-users AT lists.sourceforge DOT net
Date: Wed, 23 Mar 2016 16:22:39 -0500
Hi,

I have a file system that I am trying to back up with BackupPC using the rsync Xfer method. We are running BackupPC version 3.2.1, and the file server has rsync version 3.0.9. The logs show that the transfer uses protocol 28.
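If I understand it right, BackupPC 3.x talks the rsync protocol through the File::RsyncP Perl module, which only goes up to protocol 28 even though rsync 3.0.9 itself supports protocol 30, and protocol 30 is the one that added incremental file-list transfer. That would explain why the whole file list gets built up front here. Checking the two ends (the version line below is illustrative):

    # On the file server (and the BackupPC server, for comparison)
    rsync --version | head -1
    # rsync  version 3.0.9  protocol version 30

    # On the BackupPC server, the Perl module that actually speaks the protocol
    perl -MFile::RsyncP -e 'print "$File::RsyncP::VERSION\n"'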

When we ran the backup, it started successfully and then died after around 8 hours. The only error message we could find was one in /var/log/BackupPC/LOG that said "out of memory".
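In case it helps, here is roughly how I went looking for the cause (the log path and the `backuppc` user name are from our install and may differ elsewhere):

    # BackupPC's own log files
    grep -i 'out of memory' /var/log/BackupPC/LOG*

    # Did the kernel OOM killer fire on either machine?
    dmesg | grep -iE 'oom|out of memory'

    # Any per-process limits on the BackupPC user that could be the ceiling?
    su -s /bin/sh -c 'ulimit -a' backuppc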

From what I've read, this is most likely caused by rsync running out of memory while building its in-memory file list. Here are the stats on the file system: it contains 924G of data as reported by `df`, and `find . | wc -l` reports 31,983,221 files (about 32 million). Some have suggested breaking the backup into several runs, but this is a single home folder shared via NFS, so there is no easy division here (I've sketched the closest thing I can see to a split near the end of this message).
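A back-of-the-envelope check using those numbers, with the commonly quoted figure of roughly 100 bytes per file-list entry for the older protocols (BackupPC's own Perl-side copy of the list presumably costs more per entry than that):

    files=31983221
    # ~100 bytes per entry is the usual rule of thumb for rsync's in-memory list
    echo "$(( files * 100 / 1024 / 1024 )) MiB"    # prints: 3050 MiB

So rsync alone would want around 3 GB just for the list, before any of BackupPC's per-file bookkeeping, which makes the "out of memory" entirely plausible.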

Is there anything we can do to make this work with the rsync method, or is this simply too many files? I'd really prefer rsync over tar.
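The closest thing to a split I can see would be several BackupPC "hosts" that all point at the same file server via ClientNameAlias, each limited to a slice of the home directories with BackupFilesOnly, so each run builds a much smaller list. A sketch of one slice (the file name, alias, and glob are made up, and I haven't verified exactly how BackupPC 3.x turns these patterns into rsync includes):

    # Contents of a hypothetical per-slice host config,
    # e.g. /etc/BackupPC/pc/fileserver-am.pl
    $Conf{ClientNameAlias} = 'fileserver';   # the real machine
    $Conf{XferMethod}      = 'rsync';
    $Conf{RsyncShareName}  = '/home';
    $Conf{BackupFilesOnly} = { '/home' => [ '/[a-m]*' ] };   # home dirs a..m

If that's the sort of thing people mean by splitting it up, I can live with it, but I'd welcome a cleaner option.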

Thanks,

L. Schrock