Subject: Re: Multi-Gb dumps using tar + software compression (gzip)?
From: Michael Schaller <bwkstuttgart AT yahoo DOT de>
To: amanda-users AT amanda DOT org
Date: Tue, 19 Oct 2004 15:39:57 +0200
Hi Toralf,

I've had nearly the same problem this week.
I found out that it was a problem with my tar.
I backed up with GNUTAR and "compress server fast".
AMRESTORE restored the file, but TAR (on the server!) gave some horrible messages like yours. I transferred the file to the original machine (the "client") and everything worked fine.
I guess this is a problem with different tar versions ...

Did you run your tests on the client or on the server?
If the answer is "server", transfer the restored archive to your client and untar it there!
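
In other words, something like this; the tape device, host name and file name below are just placeholders for whatever your setup uses:

    # On the Amanda server: pull the dump image off the tape.
    amrestore /dev/nst0 client.example.com /home

    # amrestore writes the image to a file named something like
    # client.example.com._home.20041019.0
    # Listing it with the server's tar may fail with errors like yours:
    tar -tvf client.example.com._home.20041019.0

    # Compare the tar versions on both machines:
    tar --version

    # Copy the image to the client and extract it there, with the
    # same tar that originally created it:
    scp client.example.com._home.20041019.0 client.example.com:/var/tmp/
    ssh client.example.com 'cd /var/tmp && tar -xvf client.example.com._home.20041019.0'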

Greets
Michael

Toralf Lund wrote:
Since I'm still having problems gunzip'ing my large dumps (see separate thread), I was just wondering:

Some of you out there are doing the same kind of thing, right? I mean, you:

  1. Have dumps of directories containing several GB of data (up to
     roughly 20GB compressed in my case).
  2. Use dumptype GNUTAR.
  3. Compress data using "compress client fast" or "compress server fast".

If you do, what exactly are your amanda.conf settings? And can you actually extract *all* files from the dumps?
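
To be concrete, by "settings" I mean a dumptype roughly along these lines; the dumptype name and the extra options here are only illustrative:

    define dumptype comp-user-tar {
        comment "directories dumped with GNU tar, compressed on the client"
        program "GNUTAR"
        compress client fast
        index yes
    }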

- Toralf