Re: handling unreasonably large, non-static directories
2006-01-11 19:23:02
On Wednesday 11 January 2006 18:20, Cameron Matheson wrote:
>Hi,
>
>Amanda has been working wonderfully for me ever since I started using
> it about a year ago. I do have one question though, that plagues me
> every time I try to confront it:
>
>I have one directory on one of my boxes that holds files for customers
> (each customer gets a subdirectory, which seems to just be a customer
> number... so it's mostly sequential unless a customer gets deleted).
> The size of these directories varies widely (anywhere from a few
> megabytes to 15 gigabytes). All in all there is a little under
> 200GB of data that needs to be backed up. Initially I had just been
> going through the list of directories myself and compiling 15GB
> chunks of them to be backed up, but due to the ever-changing nature
> of these directories it's kind of a pain to keep up w/ that. Is
> there any way I could have amanda automatically split this directory
> up into chunks to be backed up? Or, does anyone else have any keen
> ideas on how one might approach this problem?
>
The only solution that I can think of is a DLE (disklist entry) per
customer. But if you have thousands of customers, I don't know whether
Amanda has been tested at that scale. One thing it would do is help
isolate the customers from each other, and that can only be good from a
security standpoint.
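Since the customer list keeps changing, a per-customer disklist could be
regenerated automatically (say, from cron) so new and deleted customers
are picked up. A minimal sketch, assuming one subdirectory per customer;
the host name "fileserver" and the dumptype "comp-user-tar" are
placeholders, not anything Amanda defines for you:

```shell
#!/bin/sh
# Hypothetical sketch: emit one Amanda DLE per customer subdirectory.
# "fileserver" and "comp-user-tar" are placeholder names -- substitute
# your real client host and a dumptype defined in your amanda.conf.
gen_dles() {
    custdir=$1
    for d in "$custdir"/*/; do
        [ -d "$d" ] || continue
        # one disklist line per customer directory, trailing slash stripped
        printf 'fileserver %s comp-user-tar\n' "${d%/}"
    done
}

# Example cron usage (overwrite the disklist nightly before amdump runs):
# gen_dles /data/customers > /etc/amanda/DailySet1/disklist
```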
Otherwise, just group the subdirectories into chunks that would average
1 or 2 GB each, using an include pattern. But that's beyond my level of
'expertise'.
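For what it's worth, grouping like that can be sketched with GNU tar
include patterns right in the disklist (they're shell-style globs rather
than true regexes). The host name, paths, and dumptype below are
examples only:

```
# disklist -- split /data/customers into DLEs by customer-number prefix.
# "fileserver" and "comp-user-tar" are placeholder names.
fileserver /customers-0 /data/customers {
    comp-user-tar
    include "./0*"
}
fileserver /customers-1 /data/customers {
    comp-user-tar
    include "./1*"
}
```

Each entry backs up the same directory but only the subdirectories
matching its pattern, so the chunks stay roughly balanced as long as the
customer numbers are spread evenly.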
>Thanks,
>Cameron Matheson
--
Cheers, Gene