Subject: Re: [Bacula-users] Backup of many many files
From: "T. Horsnell" <tsh AT mrc-lmb.cam.ac DOT uk>
To: bacula-users AT lists.sourceforge DOT net
Date: Wed, 03 Sep 2008 15:42:09 +0100
Kjetil Torgrim Homme wrote:
> Chris Howells <chris AT transitive DOT com> writes:
> 
>>I suggest that you try this first:
>>
>>time tar cf /dev/null /the/directory/you/are/backing/up
>>
>>Then you should be able to work out whether it's the Netapp limiting
>>the backup speed, rather than bacula.
> 
> 
> be careful with this test, at least some versions of GNU tar will
> optimise away the reading of the files when the output is directed at
> /dev/null!  use /dev/zero to be sure.
> 

Seconded. You might even try

time tar cf /dev/your_backup_device /the/directory/you/are/backing/up

if you have an opportunity. This would help isolate bacula's overheads. 
Make sure you use a nice big blocksize (the -b option in tar), say 128 
or 256.
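
For instance (just a sketch: /dev/nst0 is an assumed name for your tape
drive, and -b 256 means 256 x 512-byte records, i.e. 128K blocks):

time tar -c -b 256 -f /dev/nst0 /the/directory/you/are/backing/up

Comparing that elapsed time with what bacula takes for the same fileset
should show roughly how much overhead bacula itself adds.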

FYI, I just backed up 36 million files, 6.2 Tbytes, to an LTO-4 in 2 days 
8 hours. The filesystem being backed up and the tape drive both live on 
the same host, and the system was doing nothing else. The catalog now 
occupies about 12 Gbytes and contains indexes for 2 full backups.
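
(If I've done the sums right, 6.2 Tbytes in 56 hours works out to roughly 
30 Mbytes/sec averaged, which is well below what LTO-4 can stream natively, 
around 120 Mbytes/sec, so the drive itself was presumably not the 
bottleneck.)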

I haven't (yet) tried the 'tar -cf /dev/zero' experiment :)
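
If anyone does try it, something along these lines should do. Only a 
sketch, assuming GNU tar, and keeping the same blocking factor so the 
comparison is fair:

time tar -c -b 256 -f /dev/zero /the/directory/you/are/backing/up

Because the archive isn't /dev/null, tar should actually read the file 
data, so the elapsed time reflects the raw read speed of the filesystem 
rather than tar's bookkeeping.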

Cheers,
Terry
