Subject: Re: [BackupPC-users] more efficient: dump archives over the internet or copy the whole pool?
From: Craig Barratt <cbarratt AT users.sourceforge DOT net>
To: "Frank J. Gómez" <frank AT crop-circle DOT net>
Date: Thu, 4 Nov 2010 23:02:08 -0700
Frank,

> Aha!  I'm not insane!

Definitely not.

This appears to be a bug, and I'd like to get to the bottom of it.
I suspect there is some metadata, most likely a file size, that
isn't encoded correctly.
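For context, a tar member's size is stored as an octal string at
bytes 124-135 of its 512-byte ustar header, so a mis-encoded size
throws off where tar expects the next header to start.  A minimal
sketch of checking that field by hand (the file name here is
invented for the example):

```shell
# Write a 12-byte file and archive it; the archive's first 512 bytes
# are the ustar header for that member.
printf 'hello world\n' > example.txt
tar -cf one.tar example.txt

# The size field is an octal string at bytes 124-135 of the header.
size_octal=$(dd if=one.tar bs=1 skip=124 count=12 2>/dev/null | tr -d '\0 ')
echo $((8#$size_octal))   # the member's size in bytes (12 here)
```

If the recorded size ever disagrees with the member's actual data
length, tar misreads the following block as a header, which is
exactly the kind of symptom being reported here.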

Let's take this off list.  No doubt the tar file is very large.
You should try to find the smallest tar file that shows the
problem.

Rather than extract the archive, use tar -tvf instead.  Scroll
through the output, and find the first error, most likely "tar:
Skipping to next header".  Look at the few file names prior to
that.  Try generating a tar archive with just that directory, eg,
something like this:

    /usr/share/backuppc/bin/BackupPC_tarCreate -t -h 62z62l1 -n -1 \
        -s win7home /AppData/Local/Temp/Low | tar -tvf -

Try to zero in on the deepest path that still gives an error.
If the tar file is small enough, please compress and email it
to me.
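To illustrate the narrowing-down procedure without a huge archive,
here is a sketch that deliberately corrupts a header in a tiny tar
file; the file names are invented, and the scribbled bytes are just
a stand-in for whatever metadata BackupPC_tarCreate might be
mis-encoding:

```shell
# Build a two-member archive: a.txt's header at byte 0, its data
# block at 512, b.txt's header at 1024 (tar works in 512-byte blocks).
printf 'first\n'  > a.txt
printf 'second\n' > b.txt
tar -cf good.tar a.txt b.txt

# Scribble over b.txt's header to simulate bad metadata; the stored
# checksum no longer matches, so tar rejects the header.
cp good.tar bad.tar
printf 'XXXXXXXX' | dd of=bad.tar bs=1 seek=1024 conv=notrunc 2>/dev/null

# -t lists headers without extracting; the last name printed before
# the error message is where to start narrowing the path.
tar -tvf bad.tar || true
```

Here a.txt lists cleanly and the error lands on the member after
it, which is the same signal to look for in the real listing.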

Craig

_______________________________________________
BackupPC-users mailing list
BackupPC-users AT lists.sourceforge DOT net
List:    https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:    http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/