Amanda-Users

Subject: Re: Multi-Gb dumps using tar + software compression (gzip)?
From: Toralf Lund <toralf AT procaptura DOT com>
To: Alexander Jolk <alexj AT buf DOT com>
Date: Tue, 19 Oct 2004 12:30:44 +0200
Alexander Jolk wrote:

Toralf Lund wrote:
  1. Dumps of directories containing several Gb of data (up to roughly
     20Gb compressed in my case).
  2. Use dumptype GNUTAR.
  3. Compress data using "compress client fast" or "compress server fast".

If you do, what exactly are your amanda.conf settings? And can you
actually extract *all* files from the dumps?
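For reference, a minimal amanda.conf dumptype along those lines might look like the sketch below. The dumptype name and comment are placeholders, not settings taken from either of our actual configs:

```
define dumptype comp-tar {
    comment "GNU tar with fast client-side compression"
    program "GNUTAR"
    compress client fast
    index yes
}
```

Switching "compress client fast" to "compress server fast" moves the gzip work from the client to the tape server; the rest of the definition stays the same.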

Yes, I'm doing this, and I've never had problems recovering all files,
except once when the tape was failing.
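For what it's worth, a quick sanity check once a dump has been restored to a local file is to list the whole compressed archive; if gzip or tar hits corruption anywhere in the stream, the listing fails with a non-zero exit status. A sketch (the filename is a placeholder, not an actual Amanda chunk name):

```shell
# List every member of the compressed tar archive without extracting it.
# A non-zero exit status means gzip or tar hit corrupt data somewhere.
tar tzf dump.1.tar.gz > /dev/null && echo "archive OK"
```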

Good...

 I'll send you my amanda.conf
privately.

OK, thanks. I don't right away see any significant differences from what I'm doing, but I'll study it more closely...

Oh, one thing, by the way: I notice that you use "chunksize 1Gb". So do I, right now, but for a while the holding-disk data wasn't split into chunks at all, and I've been wondering if that may have been the problem.
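In case it helps anyone else on the list: chunksize is a per-holding-disk setting in amanda.conf. A sketch of the relevant block, with placeholder directory and sizes:

```
holdingdisk hd1 {
    comment "main holding disk"
    directory "/var/amanda/holding"
    use 100 Gb
    chunksize 1 Gb
}
```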

 BTW which version are you using?  I'm at version
2.4.4p1-20030716.
I've used the "release" version of 2.4.4p1 for some time, but I'm testing 2.4.4p3 right now.

(I'm doing roughly 500GB a night on two sites, one of them has dumps up
to 80GB compressed, and takes a little less than 24h to finish, after my
exclude lists have been adapted.)

Alex