Multi-Gb dumps using tar + software compression (gzip)?
2004-10-19 03:06:12
Since I'm still having problems gunzip'ing my large dumps (see the separate
thread), I was just wondering:
Some of you out there must be doing the same kind of thing, i.e. you:
1. Have dumps of directories containing several GB of data (up to roughly
20 GB compressed in my case).
2. Use dumptype GNUTAR.
3. Compress the data with "compress client fast" or "compress server fast".
If you do, what exactly are your amanda.conf settings? And can you
actually extract *all* files from the dumps?
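For reference, the setup I'm describing would look roughly like the dumptype below. This is only a sketch of my assumptions, not a tested configuration; the dumptype name is made up, and the exact option set in your amanda.conf may differ:

```
# Hypothetical dumptype: GNU tar with fast client-side compression.
# (Swap "compress client fast" for "compress server fast" to compress
# on the server instead.)
define dumptype comp-user-tar {
    program "GNUTAR"        # use GNU tar instead of dump
    compress client fast    # gzip --fast on the client
    index yes               # keep an index for amrecover
}
```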
- Toralf