I was wondering how viable it would be to back up the BackupPC pool using tar?
example: tar --preserve-permissions --check-links -cvf /mnt/backups/backup2014-04-08.tar /var/lib/backuppc
I know this would become somewhat problematic on large data pools, so I guess my question is: under what circumstances would this method become an issue?
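For what it's worth, a quick way to convince yourself that tar copes with the pool's hard links is a small experiment (the /tmp paths are made up for illustration; this assumes GNU tar, where hard-linked members are stored once and --check-links warns if not every link to a dumped file made it into the archive):

```shell
# Hypothetical test tree; BackupPC's pool relies on hard links like this.
mkdir -p /tmp/pooltest
echo "chunk" > /tmp/pooltest/a
ln -f /tmp/pooltest/a /tmp/pooltest/b        # second link to the same inode

# GNU tar stores b as a link entry, not a second copy of the data.
tar --check-links -cf /tmp/pooltest.tar -C /tmp pooltest
tar -tvf /tmp/pooltest.tar | grep 'link to'  # b is listed as a link, not file data
```

So the archive should stay roughly pool-sized rather than ballooning to the sum of all the per-backup trees.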
I have a small client for whom I am using BackupPC as a backup solution. Unfortunately, setting up an archive host is not an option, so at the moment I am just using rsync to copy the pool across to an external USB 3.0 disk. This works fine; the issue is deleting old backups. Each copy of the pool goes into its own dated folder, and deleting one of these takes forever (due to the hard links, no doubt :p).

I was thinking of dumping the pool to an uncompressed tar file instead, so that retiring an old copy means deleting a single file rather than thousands of files and even more hard links. In a DR scenario, as long as the data is restorable, restore time is not so big an issue.
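To sketch the two approaches side by side (paths, dates, and the rsync flags are my assumptions about the setup; -H is the rsync option that preserves the pool's hard links, and unlinking each of those links one by one is exactly what makes removing a dated folder so slow):

```shell
# Assumed current approach: one dated folder per pool copy, hard links kept.
# Removing an old folder means unlinking millions of directory entries.
rsync -aH /var/lib/backuppc/ /mnt/backups/pool-2014-04-08/
rm -rf /mnt/backups/pool-2014-03-08/        # slow: walks every link

# Tar alternative: one file per pool copy, so retiring a copy is one unlink.
tar --check-links -cf /mnt/backups/backup-2014-04-08.tar -C /var/lib backuppc
rm /mnt/backups/backup-2014-03-08.tar       # fast: a single metadata operation
```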
At the moment their pool is sitting at around 100GB.
Any ideas?
Regards,
Jurie Botha
_______________________________________________
BackupPC-users mailing list
BackupPC-users AT lists.sourceforge DOT net
List: https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki: http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/