shalauras writes:
> I'm making daily backups from several servers, but I have a problem with
> one server. The client takes up around 50GB, with almost all of it in a
> single directory (49GB).
>
> When I try to make a backup, it fails while copying the 49GB directory
> with the error:
> "Out of memory during "large" request .... /usr/...backuppc/.../FileZIO.pm"
Can you include the full error? Also, what versions of zlib and
Compress::Zlib are you using?
How many files are in this directory?
BackupPC stores all the file attribute information for each directory
in a single file, and it needs to be able to decompress that
information and hold it in memory. However, that should be at most a
few hundred bytes per file, so a single directory would need a huge
number of files for this to be a problem.
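For a rough sanity check, counting the entries in that directory gives an
upper bound on the attrib data BackupPC has to hold. A minimal Perl
sketch (the 300-bytes-per-file figure is just an assumed ceiling, and the
directory path is a placeholder you supply):

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Placeholder: substitute the 49GB directory on the client.
    my $dir = shift @ARGV or die "usage: $0 <directory>\n";

    opendir my $dh, $dir or die "opendir $dir: $!\n";
    # Count real entries, skipping . and ..
    my $count = grep { $_ ne '.' && $_ ne '..' } readdir $dh;
    closedir $dh;

    # Assume ~300 bytes of attribute data per file as a rough ceiling.
    my $est = $count * 300;
    printf "%d entries => roughly %.1f MB of attrib data in memory\n",
           $count, $est / (1024 * 1024);

Unless that prints a number in the gigabytes, the directory size alone
probably isn't the explanation.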
Another possibility is that a compressed file is corrupted and is
causing the uncompress step to fail.
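If you suspect corruption, you can at least verify that a compressed
file inflates cleanly. The sketch below feeds a plain zlib stream
through Compress::Zlib; note that BackupPC's FileZIO layer writes its
own variant of the format, so treat this only as an illustration of how
a damaged or truncated stream surfaces as an inflate error, not as an
exact pool-file checker:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use Compress::Zlib;

    # Placeholder: path to a file containing a plain zlib stream.
    my $file = shift @ARGV or die "usage: $0 <file>\n";

    open my $fh, '<', $file or die "open $file: $!\n";
    binmode $fh;

    my ($inf, $status) = inflateInit();
    die "inflateInit failed (status $status)\n" unless $status == Z_OK;

    my $total = 0;
    while (read($fh, my $chunk, 65536)) {
        # inflate() consumes $chunk and returns what it decompressed.
        (my $out, $status) = $inf->inflate($chunk);
        $total += length($out) if defined $out;
        last if $status == Z_STREAM_END;
        die "inflate error (status $status) after $total bytes\n"
            unless $status == Z_OK;
    }
    print "OK: stream inflated to $total bytes\n";
    close $fh;

A corrupted stream typically dies partway through with Z_DATA_ERROR
rather than reaching Z_STREAM_END.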
Craig