[BackupPC-users] Breaking up large backups
2017-02-08 22:10:23
We all know that rsync's weakness is very large file sets: the full file list
has to be held in memory on both client and server to determine the change set
to be transmitted. So it's reasonable to want to break up a large backup
client into multiple backup jobs by splitting subdirectories into their own
backups and adding exclusions to the original backup. It would also be nice to
be able to break up the latest snapshot from the original backup to create a
seed backup for the new subdirectory jobs. Has anyone tried to automate this?
I have a couple of machines that regularly fail their backups because the huge
file sets make them take so long, so I'm about to embark on breaking them up
to deal with it.
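For what it's worth, the split itself can be expressed in BackupPC's per-host config overrides. The sketch below is a hypothetical example (hostnames and paths are made up): the original host excludes the large subtree via $Conf{BackupFilesExclude}, and a second "virtual host" entry, pointed at the same physical machine with $Conf{ClientNameAlias}, backs up only that subtree.

```perl
# pc/bigserver.pl -- original host's override file.
# Exclude the subtree that is being split off into its own job.
# ("bigserver" and "/srv/media" are hypothetical names.)
$Conf{RsyncShareName}     = ['/'];
$Conf{BackupFilesExclude} = { '/' => ['/srv/media'] };

# pc/bigserver-media.pl -- new virtual host for the split-off subtree.
# ClientNameAlias makes this host entry connect to the same machine;
# "bigserver-media" also needs a line in the hosts file.
$Conf{ClientNameAlias} = 'bigserver';
$Conf{RsyncShareName}  = ['/srv/media'];
```

That handles the splitting, but not the seeding part of the question, which would still need some way to carry the existing snapshot over into the new host's backup history.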
_______________________________________________
BackupPC-users mailing list
BackupPC-users AT lists.sourceforge DOT net
List: https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki: http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/
From: Kenneth Porter