Subject: Re: [BackupPC-users] BackupPC_tarCreate Problem
From: Stephen Gelman <sgelman AT bluestarenergy DOT com>
To: "General list for user discussion, questions and support" <backuppc-users AT lists.sourceforge DOT net>
Date: Wed, 11 Aug 2010 17:03:47 -0500
When I try to extract the tar file using pax, I get many errors
similar to the following:

pax: checksum error on header record : 4m^[.^UUWD/1
pax: test.tar : This doesn't look like a tar archive
pax: test.tar : Skipping to next file...
pax: checksum error on header record : CE/9j^G:ly4O
pax: test.tar : Skipping to next file...
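
pax reports a checksum error when a 512-byte block that should be a
tar header fails its checksum, i.e. it isn't a valid header at all.
One way to look at the raw blocks directly (a sketch, assuming GNU dd
and od; replace N with the 512-byte block number near the failure) is:

    dd if=test.tar bs=512 skip=N count=1 2>/dev/null | od -c | head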

On 08/11/2010 03:35 PM, John Rouillard wrote:
> On Wed, Aug 11, 2010 at 01:44:19PM -0500, Stephen Gelman wrote:
>> On 08/11/2010 12:33 PM, John Rouillard wrote:
>>> On Wed, Aug 11, 2010 at 10:23:59AM -0500, Stephen Gelman wrote:
>>>> I am running BackupPC 3.1.0 on Nexenta.  It seems to be working for the
>>>> most part, but I am having a problem with BackupPC_tarCreate.  I am
>>>> trying to create a tar of a 30 GB backup.  The tar I create ends up
>>>> being 30 GB, but when extracted it only takes up 5 GB and is missing a
>>>> lot of files.  I can restore the missing files using the web interface,
>>>> so I know that they are being backed up and that BackupPC has permission
>>>> to access them.  Does anyone have any idea what's going on?  The only
>>>> clue I have is that I repeatedly get "tar: Skipping to next header"
>>>> when untarring the file.
>>> Which tar are you using to do the restore: native Solaris /usr/bin/tar
>>> (or /usr/sbin/static/tar), GNU tar, or pax?  How are you supplying the
>>> 30 GB file to the restoring tar: via stdin, as a file on the command
>>> line, ...?
>>>
>>> Do you have any compression in the picture?  Also, are you moving
>>> between architectures, or between little- and big-endian machines?
>>>
>> Using GNU tar.  This happens both if I pipe the output of
>> BackupPC_tarCreate directly to tar and if I untar from the file.  More
>> specifically, the tar command I am using is "tar -xf - -C MYDIRECTORY".
>> No compression, and the archive is staying on the same server.
> I assume you are doing this locally on the Nexenta box without ssh etc.
> That should rule out blocking issues and network corruption issues.
> But have you tried setting the blocking factor to 20 explicitly in GNU
> tar (I think that's what BackupPC_tarCreate uses)?
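>
> For example, the explicit blocking factor would look like this (an
> untested sketch; MYDIRECTORY is your target directory as above):
>
>     tar -b 20 -xf - -C MYDIRECTORY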
>
> Does:
>
>     tar -xvf - -C MYDIRECTORY
>
> change anything?
>
> What are your arguments to BackupPC_tarCreate? You aren't redirecting
> errors from BackupPC_tarCreate onto stdout using 2>&1, are you?
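>
> A safe pattern (a sketch; "myhost" and "/share" are placeholders, not
> your real arguments) sends stderr to a file so it cannot pollute the
> archive stream:
>
>     BackupPC_tarCreate -h myhost -n -1 -s /share . > test.tar 2> tarCreate.err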
>
> Can you restore a subset of the 30 GB of data using:
>
>     BackupPC_tarCreate ... -s share directory/path
>
> where directory/path is a directory that is not being restored in the
> 30 GB backup but is present in the BackupPC web interface?
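>
> Concretely, that test might look like this (hypothetical host, share,
> and path; piping straight into tar avoids the intermediate file):
>
>     BackupPC_tarCreate -h myhost -n -1 -s /share some/small/dir | tar -xvf - -C /tmp/test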
>
> If you have pax installed, does using it in place of tar produce better
> diagnostics (e.g., why it doesn't look like a tar header)?
>
> If you can get a smaller subset of restored files to produce the error,
> then you could look at the tar file and try to figure out what is
> confusing tar.  If you see a lot of NULs where there shouldn't be any,
> maybe it's a bad blocking factor, despite what I said above?
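>
> One way to spot long NUL runs (a sketch, relying on GNU od collapsing
> repeated blocks into a lone "*" line, and GNU grep's -B option) is to
> print the offset just before each collapsed run:
>
>     od -A d -c test.tar | grep -B1 '^\*'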
>
