BackupPC-users

Re: [BackupPC-users] Slow Rsync Transfer?

2011-04-29 14:23:28
From: "Dan Lavu" <Dan.Lavu AT rivermine DOT com>
To: "General list for user discussion, questions and support" <backuppc-users AT lists.sourceforge DOT net>
Date: Fri, 29 Apr 2011 14:07:01 -0400
Resolved. 

After looking at the file list, we found a 102GB log file. rsync doesn't cope 
well with very large files, and there are a ton of threads about why. 
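For anyone hunting a similar culprit, a quick way to scan a share for oversized files is something like the sketch below (the path and size threshold are placeholders, not the actual share from this thread; requires GNU find):

```shell
#!/bin/sh
# Hedged sketch: list the five largest files under a directory so an
# oversized file (like the 102GB log above) stands out immediately.
dir="${1:-.}"
find "$dir" -xdev -type f -printf '%s\t%p\n' 2>/dev/null | sort -rn | head -n 5
```

Run it against the share's root on the client being backed up; the first line is the largest file by byte size.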

Troubleshooting steps that actually isolated the issue: 

strace -p $PID (the output looked like rsync was just cat'ing the file) 
lsof -f | grep rsync (and the following to confirm which file was open)
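On Linux there is also an lsof-free way to do the same confirmation: the files a process holds open appear as symlinks under /proc/<pid>/fd. A hedged sketch (here $$ , this shell's own PID, stands in for the rsync PID):

```shell
#!/bin/sh
# Hedged alternative to lsof on Linux: list the open files of a process
# via procfs. Substitute the rsync PID for $$ in real use.
pid=$$
ls -l "/proc/$pid/fd"
```

Each line shows a file descriptor and the path it points at, so a huge log file being read will show up directly.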

I hope this helps anybody else who might have this issue. 

Cheers,

 
_______________________________________________
Dan Lavu
System Administrator  - Emptoris, Inc.
www.emptoris.com Office: 703.995.6052 - Cell: 703.296.0645


-----Original Message-----
From: Adam Goryachev [mailto:mailinglists AT websitemanagers.com DOT au] 
Sent: Friday, April 29, 2011 12:46 AM
To: backuppc-users AT lists.sourceforge DOT net
Subject: Re: [BackupPC-users] Slow Rsync Transfer?

-----BEGIN PGP SIGNED MESSAGE-----
Hash: SHA1

On 29/04/11 04:08, Dan Lavu wrote:
> Gerald,
> 
>  
> 
> Not the case with me. If you look at the host ras03, you see that the 
> average speed is .92MB/s while other hosts are significantly faster. It 
> is taking 40 hours to do 110GB, while other hosts are doing it in 
> about an hour. I'm about to patch this box and reboot it; it's been up 
> for 200+ days and I haven't had a good backup for over a week now. So any
> input will be helpful; again, thanks in advance.

One thing I've seen which can really slow down rsync backups: a single large 
file with changes will be much slower to back up than a number of small files 
(of the same total size) containing the same amount of changes.

I back up disk images. The original method was to just back up the image, but 
this was too slow. The new method is:
use split to divide the file into a series of 20M or 100M chunks
back up these individual files

I also do the same with database exports and other software backup files larger 
than around 100M ... they just back up quicker, and a failed backup will 
continue from the most recent chunk (in a full backup) instead of restarting 
the whole file. Also, the effective timeout is shorter because it is reset after 
each chunk.
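The split step above could look like the following sketch (disk.img is a placeholder filename, not from the thread; BackupPC is not involved in this step, it just sees the resulting chunks):

```shell
#!/bin/sh
# Hedged sketch of the pre-backup split step. 100M numbered chunks mean
# rsync only re-transfers the chunks whose content changed, and an
# interrupted full backup resumes from the last completed chunk.
split -b 100M -d disk.img disk.img.part.
# The backup then covers disk.img.part.00, .01, ... instead of disk.img.
```

Reassembly for a restore is just `cat disk.img.part.* > disk.img`, since the numeric suffixes sort in order.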

Regards,
Adam

- --
Adam Goryachev
Website Managers
www.websitemanagers.com.au
-----BEGIN PGP SIGNATURE-----
Version: GnuPG v1.4.11 (GNU/Linux)
Comment: Using GnuPG with Mozilla - http://enigmail.mozdev.org/

iEYEARECAAYFAk26QpkACgkQGyoxogrTyiXMlgCgghJ14sMasOdtJi28os6rBj4U
GeYAnRxasxrFgpSZ442w0+HKDNHJFsZZ
=d8vA
-----END PGP SIGNATURE-----

------------------------------------------------------------------------------
WhatsUp Gold - Download Free Network Management Software The most intuitive, 
comprehensive, and cost-effective network management toolset available today.  
Delivers lowest initial acquisition cost and overall TCO of any competing 
solution.
http://p.sf.net/sfu/whatsupgold-sd
_______________________________________________
BackupPC-users mailing list
BackupPC-users AT lists.sourceforge DOT net
List:    https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:    http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/