Re: Multi-Gb dumps using tar + software compression (gzip)?
2004-10-20 07:11:11
Michael Schaller wrote:
Hi Toralf,
I've had nearly the same problem this week.
I found out that this was a problem with my tar.
I backed up with GNUTAR and "compress server fast".
AMRESTORE restored the file but TAR (on the server!) gave some
horrible messages like yours.
I transferred the file to the original machine ("client") and all
worked fine.
I guess this is a problem with different tar versions ...
Did you run your tests on the client or on the server?
If the answer is "server", then transfer the restored archive to your
client and untar it there!
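In other words, something like this (hostname and filename are only
placeholders; the dd step assumes the restored image still has the
32k Amanda header in front):

# copy the restored dump image from the tape server to the client
scp 00010.client._scanner4.7 client:/var/tmp/
# on the client: skip the 32k header, decompress, untar
ssh client 'cd /var/tmp && dd if=00010.client._scanner4.7 bs=32k skip=1 | gzip -dc | tar -xvf -'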
I've tried both. In fact, I've tested just about every combination of
tar, gzip, filesystem, host and recovery source (tape, disk dump,
holding disk...) I could think of, and I always get the same result.
I'm thinking this can't possibly be a tar problem, though, or at least
not only that, since gzip reports errors, too. I get:
dd if=00010.raid2._scanner4.7 bs=32k skip=1 | gzip -t
124701+0 records in
124701+0 records out
gzip: stdin: invalid compressed data--crc error
gzip: stdin: invalid compressed data--length error
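The tar side of the test is just the same pipeline fed into tar
instead of gzip -t, i.e. roughly

dd if=00010.raid2._scanner4.7 bs=32k skip=1 | gzip -dc | tar -tvf -

and that's where tar starts complaining.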
Greets
Michael
Toralf Lund wrote:
Since I'm still having problems gunzip'ing my large dumps - see the
separate thread - I was just wondering:
Some of you people out there are doing the same kind of thing, right?
I mean, you:
1. Have dumps of directories containing several GB of data (up to
roughly 20GB compressed in my case).
2. Use dumptype GNUTAR.
3. Compress the data using "compress client fast" or "compress server
fast".
If you do, what exactly are your amanda.conf settings? And can you
actually extract *all* files from the dumps?
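For concreteness, I mean a dumptype along these lines (the name and
the extra options are just an illustration, not my literal config):

define dumptype big-gnutar-fast {
    comment "GNU tar dumps with fast compression"
    program "GNUTAR"
    compress client fast    # or: compress server fast
    index yes
}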
- Toralf