Subject: Re: [BackupPC-users] Manual restore
From: "Carlos Galeano" <cgaleano AT logise.com DOT gt>
To: "'General list for user discussion, questions and support'" <backuppc-users AT lists.sourceforge DOT net>
Date: Tue, 6 May 2014 08:47:13 -0600
Hi,

Thanks for your help. I really don't know how the archive host
feature works; I will investigate it a little more.

Thanks to everyone.





-----Original Message-----
From: Les Mikesell [mailto:lesmikesell AT gmail DOT com]
Sent: Monday, May 05, 2014 4:18 PM
To: General list for user discussion, questions and support
Subject: Re: [BackupPC-users] Manual restore

On Mon, May 5, 2014 at 4:53 PM, Carlos Galeano <cgaleano AT logise.com DOT gt>
wrote:
> Okay, I'll explain so it makes sense.
> I need to store a copy of the backup off the server, and if it is one
> large file that is difficult to do; that's why I need to split the tar
> file into multiple files.

You could use the 'archive host' feature of BackupPC to create the
split tar archive in a directory.   On the other hand, you have to go
through the web interface to start it each time, whereas the command-line
version can run automatically from cron.
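In case it helps, here is a minimal sketch of that command-line route.
The host name, share, output directory, piece size, and the
BackupPC_tarCreate install path are all assumptions for illustration;
adjust them for your setup (`-n -1` asks for the most recent backup):

#!/bin/sh
# Sketch: dump the most recent backup of one host as a tar stream,
# compress it, and split it into ~5GB pieces for off-server storage.
# Run as the backuppc user, e.g. from cron.

HOST=myclient                    # hypothetical BackupPC client name
SHARE=/                          # share name as configured for that host
OUTDIR=/mnt/offsite/$HOST        # hypothetical destination for the pieces

mkdir -p "$OUTDIR"

# BackupPC_tarCreate writes a tar stream to stdout; gzip compresses it,
# and split cuts the compressed stream into numbered/lettered pieces.
/usr/share/backuppc/bin/BackupPC_tarCreate -h "$HOST" -n -1 -s "$SHARE" . \
    | gzip \
    | split -b 5G - "$OUTDIR/$HOST.tar.gz."

Dropped into the backuppc user's crontab, something like this produces
the same kind of split pieces the archive host would, without the web
interface.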

> The problem is that if I need to restore some files from this backup,
> reattaching the parts can be a time-consuming task.
> A single tar file can be up to 50GB, split into 5GB parts.

> Restoring a file from the web works if the backup still exists on the
> server, but to save space I delete backups that are over 60 days old.
> So my question is, can I restore files from the tar files without
> joining them again?

There's no way to tell which part would contain which files - and if you
compressed before splitting, you have to uncompress from the beginning.  (Tar
isn't that great at resyncing either, although it
should try).   You probably already know you can cat/zcat the stream
piped into the tar extract, but I'll mention it just in case you were
rebuilding the whole file on disk first.   If you know the file structure
you might be able to create separate tars of specific paths to make the
likely restore targets easier to reach.
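For the record, the streaming extract looks something like this (the
piece names, scratch directory, and member path are made-up examples;
the exact member path depends on how the archive was created):

# List the archive contents; tar has to read from the first piece onward.
# The shell glob expands split's suffixes (aa, ab, ...) in the right order.
cat myclient.tar.gz.* | tar -tzf - | grep important.doc

# Extract only the paths you need into a scratch directory, streaming the
# concatenated pieces instead of rebuilding the 50GB file first.
mkdir -p /tmp/restore
cat myclient.tar.gz.* | tar -xzf - -C /tmp/restore ./home/user/important.doc

zcat on the pieces piped into plain `tar -xf -` works the same way.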

My recommendation is to keep backups online longer so that using the
archived copies stays rare.    All of the old files that still exist
unchanged will be pooled anyway and won't take additional space.

-- 
   Les Mikesell
    lesmikesell AT gmail DOT com



_______________________________________________
BackupPC-users mailing list
BackupPC-users AT lists.sourceforge DOT net
List:    https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:    http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/