BackupPC-users

Re: [BackupPC-users] Large Linux backup Help

From: Adam Goryachev <mailinglists AT websitemanagers.com DOT au>
To: "General list for user discussion, questions and support" <backuppc-users AT lists.sourceforge DOT net>
Date: Wed, 10 Feb 2010 06:53:00 +1100
-----BEGIN PGP SIGNED MESSAGE-----
Hash: SHA1

Colin Blower wrote:
> Hello everyone,
>
> I was hoping for a solution to my latest problem, and also for advice
> on my general backup setup. My latest problem is that the backup
> seems to be running, but has stalled.
>
> I see the ssh process on the server and the rsync process on the
> client, but in the web interface the Xfer PID has only one PID. It
> has been this way for ~12 hours.
>
> Background: I am in the process of backing up a very large (~1TB,
> maybe 12 million files?) filesystem using BackupPC. It is a Linux
> setup using rsync. Originally the RsyncShareName was '/home', but
> that backup failed to transfer any files after an out-of-memory
> error, and the server killed the backup process.
>
> I split up the RsyncShareName into
> '/home/aa','/home/ab',...,'/home/zy','/home/zz'. I ran another
> backup, hit a filename-too-long error, and removed that weird
> directory from the client.
>
> The latest backup has been running for 24 hours, but for the last
> 12 it has been stuck on /home/hm. This directory is one of the empty
> directories and should not take more than a second to complete.
>
> Could this be related to the ~400 zombie processes this backup has
> created? Also, if I were to stop this full backup, should I try
> again with another full or an incremental? (I have no full
> backups in BackupPC.)
I would advise you to split this into different hosts, where each host
has only aa-az, or ba-bz, etc.
This way, when a share fails to back up for some reason, only a small
portion of the overall backup will fail instead of the whole lot.
Use ClientNameAlias (from memory) to point hostname-a, hostname-b,
hostname-c, etc. at the real hostname.
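A sketch of what that per-host split might look like. This assumes BackupPC-style per-host Perl config files; the host names (bigserver-a etc.), file paths, and the exact share slices are all hypothetical, so adapt them to your own layout:

```perl
# /etc/backuppc/bigserver-a.pl -- hypothetical per-host config file.
# "bigserver-a" is one of several alias hosts, all of which point at
# the same real client machine via ClientNameAlias.
$Conf{ClientNameAlias} = 'bigserver';

# This alias host only backs up the first slice of /home; a sibling
# bigserver-b.pl would carry '/home/ba' .. '/home/bz', and so on.
$Conf{RsyncShareName} = [
    '/home/aa', '/home/ab', '/home/ac',   # ... through '/home/az'
];
```

You would then list bigserver-a, bigserver-b, ... as separate entries in the hosts file, so a failure in one slice leaves the other slices' completed backups intact.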

Also check that your number of concurrent backups is low; you don't
want multiple backups hitting the same 'host' at the same time. Some
people have written scripts which check for this and cause the current
backup attempt to fail if another backup is already running. (Search
the archives/Google with keywords like concurrent, lock, backup, etc.,
or ask and someone might have a better pointer.)
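One minimal way to build such a check is an atomic mkdir-based lock. This is only a sketch, not from the BackupPC docs: the script name, lock path, and the idea of wiring it in as a pre-dump command (with the config option that makes a non-zero exit abort the dump) are assumptions you would need to verify against your BackupPC version:

```shell
#!/bin/sh
# backup-lock.sh -- hypothetical pre-backup check: refuse to start if
# another backup for the same client appears to be running.
# mkdir is atomic, so two concurrent runs cannot both create the
# directory; exactly one of them wins the lock.
LOCKDIR="/tmp/backuppc-${1:-bigserver}.lock"

if mkdir "$LOCKDIR" 2>/dev/null; then
    echo "lock acquired: $LOCKDIR"
    exit 0
else
    echo "backup already running for this client, aborting" >&2
    exit 1
fi
```

A matching post-backup script would need to `rmdir "$LOCKDIR"` to release the lock, and stale lock directories must be cleaned up by hand after a crash; that simplicity-vs-robustness trade-off is why mkdir locks are only a starting point.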

Regards,
Adam
-----BEGIN PGP SIGNATURE-----
Version: GnuPG v1.4.9 (GNU/Linux)
Comment: Using GnuPG with Mozilla - http://enigmail.mozdev.org

iEYEARECAAYFAktxvRsACgkQGyoxogrTyiVm5QCfT6VS9/s4pZYc6PgpW30pGHRh
H4gAoI5nRH8wWhwdTvlyIBifWe5fLL0E
=NHUe
-----END PGP SIGNATURE-----


_______________________________________________
BackupPC-users mailing list
BackupPC-users AT lists.sourceforge DOT net
List:    https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:    http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/
