Everyone,
I am running a BackupPC installation with a pool of around 3.5TB. The
biggest share in this pool is around 2TB and resides on the BackupPC
server itself, so I use the tar transfer method on localhost to back it
up. A full backup of this share takes around 4 days to complete, and
during that time it only seems to create roughly 10 small files every
20 seconds in the pc/localhost/new directory. This seems very slow.
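For reference, I estimated that rate with a quick loop along these
lines (with /var/lib/backuppc standing in for the real $TopDir, which
may differ on other installs):

    # count files that have appeared under new/ so far, every 20 seconds
    while true; do
        find /var/lib/backuppc/pc/localhost/new -type f | wc -l
        sleep 20
    done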
I don't believe it is a disk or hardware throughput problem, because
a) I am not using compression, and b) there is no disk activity. The
CPU usage of BackupPC_tarExtract is constantly at or near 100%, while
the tar process that is creating the tar stream is largely idle.
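This is roughly how I checked: iostat shows the array essentially idle
while top shows the extract process pinned (the sampling intervals are
arbitrary):

    # per-device utilisation every 5 seconds -- %util stays near zero
    iostat -x 5
    # per-process CPU every 5 seconds -- BackupPC_tarExtract at ~100%,
    # tar near 0%
    top -bc -d 5 | grep -E 'BackupPC_tarExtract|tar'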
The disk subsystem is a RAID 5EE array with 13 non-parity disks,
formatted with ext3.
So my question is twofold: what could be causing this, and is there a
way of profiling BackupPC_tarExtract so that the bottleneck can be
identified?
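The only approach I could think of is running the script under a Perl
profiler such as Devel::NYTProf, along these lines (the install path
and the arguments I pass are guesses on my part; the real invocation
would be whatever BackupPC_dump normally uses):

    # profile a single extract run; paths/arguments are assumptions
    perl -d:NYTProf /usr/share/backuppc/bin/BackupPC_tarExtract \
        localhost /share 0 < share-dump.tar
    nytprofhtml   # convert the resulting nytprof.out to an HTML report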
Thanks in advance,
Chris.
--
Dr Christopher Adamson, PhD (Melb.), B Software Engineering (Hons., Monash)
Research Officer
Developmental and Functional Brain Imaging, Critical Care and Neurosciences
Murdoch Childrens Research Institute
The Royal Children’s Hospital
Flemington Road Parkville Victoria 3052 Australia
T 9345 4306
M XXXX XXX XXX
E chris.adamson AT mcri.edu DOT au
www.mcri.edu.au