Hi all,
can you give me some tips for backing up a filesystem with
very many small files on it?
The problem is that I have to back up mail servers with about
300 GB of data and 1-2 million small files (Maildir).
The File table in the Bacula database grows to more than
3 GB, and the backups take more than 30 hours.
Is this a reasonable time, or is there a way to optimize it,
e.g. by running more than one concurrent job on the client?
If I turn on concurrent jobs, do I have to enable data spooling?
That would make it impossible to restore single files quickly.
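For reference, here is roughly what I am considering (a sketch of the relevant bacula-dir.conf directives as I understand them; the resource names like backup-dir and mail1-fd are just examples from my setup):

```
# bacula-dir.conf (sketch; names are illustrative)
Director {
  Name = backup-dir
  Maximum Concurrent Jobs = 4   # let several jobs run at once
}

Client {
  Name = mail1-fd
  Address = mail1.example.org
  Maximum Concurrent Jobs = 4   # also needed per client?
}

Job {
  Name = "mailserver-backup"
  Client = mail1-fd
  Spool Data = yes              # spool to disk before writing to tape
}
```

As I understand it, the Maximum Concurrent Jobs limit has to be raised in the Director, Client, and Storage resources for jobs to actually interleave, which is why spooling comes into play.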
If you need more information, please let me know.
Many thanks and greetings,
Daniel Betz
Platform Engineer
___________________________________
domainfactory GmbH
Oskar-Messter-Str. 33
85737 Ismaning
Germany
Phone: +49 (0)89 / 55266-364
Fax: +49 (0)89 / 55266-222
E-mail: dbetz AT df DOT eu
Internet: www.df.eu
Commercial register: Amtsgericht München,
HRB 150294; Managing directors: Tobias
Marburg, Jochen Tuchbreiter