Subject: Re: [BackupPC-users] General question about backuppc
From: Les Mikesell <lesmikesell AT gmail DOT com>
To: "General list for user discussion, questions and support" <backuppc-users AT lists.sourceforge DOT net>
Date: Mon, 05 Jan 2009 12:09:53 -0600
marian.thieme AT arcor DOT de wrote:
> Hi everybody,
> 
> I am interested in BackupPC too, and I did a test installation. I also 
> played around a bit with the configuration.
> Now I am not sure anymore whether it is suited to me. I have the following 
> scenario:
> 
> A network with several lab computers that generate data (lab_pc1, lab_pc2, 
> ...).
> A central file server should collect all the data generated by the several 
> lab computers.
> For instance, all the data are stored centrally in:
> /data/lab_pc1/...
> /data/lab_pc2/...
> ...
> The data should be automatically copied to the server once a day.
> 
> Afterwards I want to access the data via the file server. It would be nice 
> if the server could collect the data automatically, fetching only the data 
> that are new. Furthermore, data that are removed from a lab PC should still 
> be kept by the server.
> 
> Do you think BackupPC is suited for that?
> I looked in the directory where the files are backed up and found a directory 
> structure like this: backuppc/pc/localhost/0/f%2fetc/.
> That's not what I want. I simply want a mirror of my data directory on 
> lab_pcXX.

If you browse the backups through the web interface, you'll see what appears 
to be a normal directory copy for each run.  Or you can use the command-line 
tool BackupPC_zcat to write a normal copy of a stored file to stdout.
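
For example, assuming a Debian-style install where the tools live in 
/usr/share/backuppc/bin and the pool is under /var/lib/backuppc (the host, 
backup number and file name below are just placeholders):

  # write one stored file, uncompressed, to a plain copy
  /usr/share/backuppc/bin/BackupPC_zcat \
      /var/lib/backuppc/pc/lab_pc1/0/f%2fdata/fresults.csv > results.csv

The f%2f... names are just BackupPC's encoding of the original paths (each 
component gets an "f" prefix and "/" becomes %2f); the web interface hides 
all of that and can also hand you a tar or zip of a whole directory.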

> Is it possible to configure BackupPC to simply mirror/sync certain data 
> directories? (But keeping files that were removed.)

No - BackupPC actually builds a common pool directory, and the 
directory trees you see for each host/run just contain links to the 
pooled instances.  If you have duplicate files across the hosts this 
will save a lot of space.  If you don't, a different approach may be 
better.
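
You can see the pooling at work from the hard-link counts; with BackupPC 3.x 
and compression on, the shared copies live under cpool (again assuming the 
usual /var/lib/backuppc layout):

  # %h is the hard-link count, %n the file name (GNU stat);
  # each pool file has one link for the pool entry plus one per backup
  # that stores it, so higher counts mean more sharing
  stat -c '%h  %n' /var/lib/backuppc/cpool/0/0/0/* | head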

> Can you recommend other tools that run under Linux (Debian)?

You can simply run "rsync -a" yourself in a script that loops through 
all the targets if you want to accumulate all files.  If you want to 
reflect the current directory contents but keep old files elsewhere, you 
can use rsync's --backup-dir option.
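
Something along these lines (host names and paths are only an example; run 
it once a day from cron):

  #!/bin/sh
  # Mirror /data from each lab PC into /data/<host> on the server.
  # Files that were removed or changed on a client are moved into a
  # dated "removed" area instead of being thrown away.
  for host in lab_pc1 lab_pc2 lab_pc3; do
      rsync -a --delete \
          --backup --backup-dir=/data/removed/$host/$(date +%F) \
          "$host:/data/" "/data/$host/"
  done

That gives you a plain, browsable mirror per host plus a history of anything 
that disappeared, without BackupPC's pooled storage layout.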


-- 
   Les Mikesell
    lesmikesell AT gmail DOT com
