BackupPC-users

Re: [BackupPC-users] BackupPC as a personal backup solution?

From: Tim Connors <tconnors AT rather.puzzling DOT org>
To: "General list for user discussion, questions and support" <backuppc-users AT lists.sourceforge DOT net>
Date: Tue, 25 Jun 2013 17:46:10 +1000 (EST)
On Sun, 23 Jun 2013, Daniel Carrera wrote:

> Hi all,
>
> I'd like to ask an opinion question: Do you think BackupPC is a
> sensible backup solution for personal, non-enterprise,
> only-one-computer use?

Of course.  Quite easy to set up, but the others have already given you
the caveat to make sure there's some redundancy.

> I am an astronomer.

Heh.  I once lost a CPU-year of simulation data when I did an rm -rf in
the source rather than destination tree while trying to rsync the run
across from the supercomputer.  The supercomputer was running backups (an
Enterprisey one, where Enterprise==money and !=quality, as usual) that
silently failed to back up a path of ridiculous length[1].  Rest assured
that backuppc doesn't suffer from this problem :)

> I produce a fair amount of data that I want backed
> up and I frequently rename directories that contain 60 - 300 GB of
> data. Obviously, I don't want all of that to be re-transmitted and
> re-copied just because I moved a directory.

After a move, the data will still be read and copied across the network
(how else can it verify the file is the same?  git still reads the file
before comparing with the pool too), but the pool collision will be
detected, and the file linked to the new location.  Sounds like backuppc
4.0.0alpha0 (just released - I suggest you don't try it just yet :) will
not need the network transfer even after a move, which will be extremely
nifty when backing up your mum's computer from across the country - just
get her to send a USB stick of all her photos, copy them into a temporary
location, back them up, then back up her computer.
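The pool-collision handling described above can be sketched roughly like
this -- a simplified illustration of content-addressed pooling, not
BackupPC's actual implementation (which also handles hash collisions and
compression).  The function names and layout here are hypothetical:

```python
import hashlib
import os


def store_in_pool(src_path, pool_dir):
    """Store src_path in a content-addressed pool, deduplicating by hash.

    Returns (pool_path, is_new): is_new is False when an identical file
    was already pooled, in which case we just hardlink to it.
    """
    h = hashlib.md5()
    with open(src_path, "rb") as f:
        # Hash in 1 MiB chunks so large simulation files don't
        # need to fit in memory.
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    pool_path = os.path.join(pool_dir, h.hexdigest())
    if os.path.exists(pool_path):
        # Pool collision: identical content already stored once.
        return pool_path, False
    os.link(src_path, pool_path)  # first copy: hardlink into the pool
    return pool_path, True
```

This is why renaming a 300 GB directory costs no extra disk: the paths
change but the content hashes don't, so every file lands on an existing
pool entry and only a new hardlink is created.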

But my own 300GB thesis directory only takes about 10 hours to back up
onto a 3-disk (SATA-II) ZFS pool on a small NAS box with a 4-core 64-bit
Atom CPU.  It gets done in the background, so I don't notice.

> I'm not crazy about using Git for backups, but I suppose I could.
> BackupPC sounds great, but I realize that it is an enterprise solution
> that expects to run on its own separate server, probably have a local
> disk for backups, and so on. I suppose I could run Apache on my
> workstation and run BackupPC on top. I hope to get access to a file
> server next week. I don't know if it will be an NFS mount or an SSH
> login. I suppose that an NFS mount would work best for BackupPC.

I've done backuppc over NFS before, but it will be slower.  Turn on
async, and turn off atimes (turn off atimes on your backuppc partition
regardless: you don't need atimes on /var/lib/backuppc whether or not
you need them anywhere else).
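Concretely, that means something like the following -- the server name,
export path, and option sets here are placeholder examples, so adapt
them to your setup (async trades crash safety for write speed):

```
# /etc/exports on the file server: async lets the server acknowledge
# writes before they hit disk (faster, but data can be lost if the
# server crashes mid-backup)
/export/backuppc  backuppc-host.example.com(rw,async)

# /etc/fstab on the BackupPC host: noatime avoids a metadata update
# on every read of the pool
fileserver:/export/backuppc  /var/lib/backuppc  nfs  rw,noatime  0  0
```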


[1] e.g.
/home/tconnors/thesis/data/tconnors/nfs_cluster_cosmic_tconnors/magellanic/papergrid/lnlam1/2.5-0.0Gyr/lmcmass2.0e10Mo/smcmass3.0e9Mo-3.0e9Mo/rad7.0d,7.0hkpc/theta45/phi210/rot0/1.5d:1.5h/0.6vd/0.2t/2.0l/part200000d,200000h/high-timeres/galout/output/data/pview_columns/g000002.processed.followsmc.columns-xproj:yproj:zproj:xscreen:yscreen:vlsr:vgsr:vsub:rp:th_lmc:ph_lmc:x_smc:y_smc:z_smc:x_mw:y_mw:z_mw:x_sun:y_sun:z_sun.pview.dat


-- 
Tim Connors

_______________________________________________
BackupPC-users mailing list
BackupPC-users AT lists.sourceforge DOT net
List:    https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:    http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/