Hello.
As keeping multiple full backups of the whole data set is very expensive, I
think it's wise to minimize the size of the full backups. The simple idea for
doing that is to separate files/folders into "active" and "inactive" ones.
Active files/folders would then get backed up into multiple full volumes,
while inactive files would be kept in a single copy only, e.g. by doing only
incremental backups of them. If the amounts are something like 1 TB of active
files and 2 TB of inactive files, the saving would be noticeable: 2 TB times
the number of full backups. :)
My question about Bacula is: is there any way to achieve this without
scripting filesets? Currently I've done it with a simple find script, but
recently I tried it on a server with about 1 TB of data. When I let the
script exclude every old file from the fileset, the incremental job took
about 16 hours, even though the backup itself was only 1 GB. I guess that's
because of the enormous number of old files.
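For reference, the per-file scan is essentially this (a minimal Ruby sketch of what my find script does; the 180-day cutoff and the function name are just my choices):

```ruby
require 'find'

# Files not modified within the cutoff count as "inactive".
# 180 days is an assumed cutoff, not anything Bacula-specific.
CUTOFF_SECONDS = 180 * 24 * 3600

# Return every regular file under +root+ older than the cutoff.
# The resulting list is what gets turned into FileSet excludes,
# one entry per file -- which is exactly what blows up on large trees.
def stale_files(root, cutoff = CUTOFF_SECONDS)
  limit = Time.now - cutoff
  stale = []
  Find.find(root) do |path|
    stale << path if File.file?(path) && File.mtime(path) < limit
  end
  stale
end
```

With a million old files this produces a million exclude entries, which is where the 16-hour run seems to come from.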
I've been trying to write a smarter Ruby script that builds the optimal
list of old files/folders to exclude, but it turned out to be much more
difficult than I initially expected.
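The shape I'm aiming for is: if every file under a directory is stale, exclude the directory itself rather than listing each file. A rough sketch (the collapse rule and all names are my assumptions; empty directories deliberately block collapsing here):

```ruby
# Build a minimal exclude list for +root+: stale files individually,
# but a whole directory as one entry when everything under it is stale.
# +limit+ is the Time before which a file counts as stale.
def minimal_excludes(root, limit)
  children = Dir.children(root).sort.map { |c| File.join(root, c) }
  excludes = []
  all_stale = !children.empty?
  children.each do |path|
    if File.directory?(path)
      sub = minimal_excludes(path, limit)
      excludes.concat(sub)
      # Subtree collapsed to a single entry only if it was fully stale.
      all_stale &&= (sub == [path])
    elsif File.mtime(path) < limit
      excludes << path
    else
      all_stale = false   # a fresh file keeps this directory in the backup
    end
  end
  all_stale ? [root] : excludes
end
```

So a fully-aged archive/ tree becomes one exclude line instead of thousands, while mixed directories still list their stale files one by one.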
Is anyone else doing something like this?
--
Silver
_______________________________________________
Bacula-users mailing list
Bacula-users AT lists.sourceforge DOT net
https://lists.sourceforge.net/lists/listinfo/bacula-users