Bacula-users

Subject: Re: [Bacula-users] Large maildir backup
From: Boris Kunstleben onOffice Software GmbH <b.kunstleben AT onoffice DOT de>
To: Alan Brown <ajb2 AT mssl.ucl.ac DOT uk>
Date: Thu, 27 Nov 2008 14:17:24 +0100
Hi Alan,

any idea whether there is a better filesystem for this? I'm using ext3 on the clients and XFS
on the director.

Kind Regards Boris Kunstleben



-- 
--------------------------------------------------------------------------------------
onOffice Software GmbH
Feldstr. 40
52070 Aachen
Tel. +49 (0)241 44686-0
Fax. +49 (0)241 44686-250
Email: b.kunstleben AT onOffice DOT com
Web: www.onOffice.com
--------------------------------------------------------------------------------------
Registered court: Amtsgericht Aachen, HRB 12123
Management: Stefan Mantl, Torsten Kämper, Stefan Becker
--------------------------------------------------------------------------------------

----- Original Message -----
From: Alan Brown <ajb2 AT mssl.ucl.ac DOT uk>
Sent: Thursday, 27 November 2008 12:46:24
To: Boris Kunstleben onOffice Software GmbH <b.kunstleben AT onoffice DOT de>
Cc: <bacula-users AT lists.sourceforge DOT net>
Subject: Re: [Bacula-users] Large maildir backup

On Thu, 27 Nov 2008, Boris Kunstleben onOffice Software GmbH wrote:

> I have been doing exactly that since last Thursday.
> I have about 1.6 TB in Maildirs and a huge number of small files. I have to
> say it is awfully slow. Backing up a directory with about 190 GB of Maildirs
> took "Elapsed time: 1 day 14 hours 49 mins 34 secs".
> On the other hand, a server with documents and images (about 700 GB)
> took much less time.
> All the servers are virtual environments (Virtuozzo).
>
> Any ideas would be appreciated.
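
(For scale: 190 GB in 1 day 14 hours 49 mins 34 secs is roughly 139,800 seconds, which
works out to an average of only about 1.4 MB/s.)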

I have filesystems here of similar size with wildly varying file sizes.

The 1 TB partition (80% full) with 8000 files in it backs up quickly.

The 1 TB partition (50% full) with 7 million files in it takes 5 times
longer.

There is a fixed filesystem time cost for opening each file, so the smaller
the files, the lower the average throughput. Having said that, most
filesystems also get slow when there are thousands of files in a single
directory.
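
A rough way to see that per-file overhead, independent of Bacula, is to time a plain
walk-and-read over a Maildir tree. The Python sketch below is only illustrative, and the
path in it is a placeholder to replace with your own:

    # Rough timing of per-file open/read overhead on a directory tree.
    # The path below is a placeholder; point it at a real Maildir.
    import os
    import time

    def walk_and_read(root):
        """Open and read every file under root; return (files, bytes, seconds)."""
        files = 0
        total = 0
        start = time.time()
        for dirpath, _dirnames, filenames in os.walk(root):
            for name in filenames:
                path = os.path.join(dirpath, name)
                try:
                    with open(path, "rb") as f:
                        total += len(f.read())
                    files += 1
                except OSError:
                    pass  # skip files that vanished or are unreadable
        return files, total, time.time() - start

    if __name__ == "__main__":
        files, total, secs = walk_and_read("/var/mail/example")  # placeholder path
        secs = max(secs, 1e-9)
        print("%d files, %.1f MB in %.1f s -> %.2f MB/s, %.2f ms/file"
              % (files, total / 1e6, secs, total / 1e6 / secs,
                 1000.0 * secs / max(files, 1)))

On a Maildir-heavy tree the MB/s figure tends to be dominated by the ms/file cost rather
than by raw disk bandwidth, which is the effect described above.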

AB



_______________________________________________
Bacula-users mailing list
Bacula-users AT lists.sourceforge DOT net
https://lists.sourceforge.net/lists/listinfo/bacula-users