Amanda-Users

Re: Multi-Gb dumps using tar + software compression (gzip)?

From: Toralf Lund <toralf AT procaptura DOT com>
To: Alexander Jolk <alexj AT buf DOT com>
Date: Tue, 19 Oct 2004 16:32:31 +0200
Alexander Jolk wrote:

> Joshua Baker-LePain wrote:
>
>> I think that OS and utility (i.e. gnutar and gzip) version info would
>> be useful here as well.
>
> True, forgot that.  I'm on Linux 2.4.19 (Debian woody), using GNU tar
> 1.13.25 and gzip 1.3.2.  I have never had problems recovering files
> from huge dumps.
>
> Alex
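For anyone else posting numbers, the standard version switches are enough to collect them (a quick sketch; GNU tar understands --version, vendor tars may not):

    uname -a          # OS and kernel release
    tar --version     # gnutar release
    gzip --version    # gzip release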

I'm using Red Hat Linux 9 with kernel 2.4.20 on the server, and I have clients running Linux and SGI IRIX (version 6.5.16f). The tar version is 1.13.25 on both platforms; gzip is 1.3.3 on Linux and 1.2.4a on IRIX. I'm mainly having problems with the IRIX clients, since that's where the large filesystems are attached. The dumps get corrupted with server compression as well as client compression, i.e. I've tried both gzip versions.
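For what it's worth, one way to sanity-check a dump image layer by layer is something like the following (a rough sketch; dump.tar.gz stands in for the actual dump file):

    # Test the gzip layer first; a CRC error here means the
    # compressed stream itself is damaged.
    gzip -t dump.tar.gz && echo "gzip stream OK"

    # Then walk the tar index; gnutar complains about bad headers
    # even when the stream decompresses cleanly.
    gzip -dc dump.tar.gz | tar -tf - > /dev/null && echo "tar index OK"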
