Subject: Re: [BackupPC-users] tar is needed, but deleted files not needed
From: Les Mikesell <lesmikesell AT gmail DOT com>
To: "General list for user discussion, questions and support" <backuppc-users AT lists.sourceforge DOT net>
Date: Sat, 7 Jan 2012 15:54:41 -0600
On Sat, Jan 7, 2012 at 2:25 PM, Daniel <dandadude AT gmail DOT com> wrote:
>
> Well all I see is that incremental backup with rsync can take 3 hours for
> 750 MB (30 mbit/sec bandwidth, strong computers), which is absolutely crazy.
> With tar this is 5 minutes at max.

That does seem a bit extreme.  What has to happen is that the target
machine has to send its entire file list before anything starts, then
the backuppc server walks both its previous full of that machine and
the in-RAM list from the target looking for differences.  In an
incremental, it will skip anything where the filename, timestamp, and
length all match.  Where they don't match, the two ends walk through
the file contents exchanging block checksums to find the differences.
Each directory entry takes a small but finite amount of time, and
updating large files with changes can be slow because the new copy is
built partly by uncompressing and copying the existing version and
partly by copying the changed blocks over the network.  You might
improve things a bit if you can find something to exclude (tmp/cache
areas with a lot of files, database files that are better handled in
other ways, etc.).
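
(For reference, the excludes go in config.pl, or in the per-host
config, via $Conf{BackupFilesExclude}.  A minimal sketch; the share
name and paths below are only examples, not a recommendation:)

    # Per-share exclude list: the hash key is the share being backed
    # up and the paths are interpreted relative to that share.
    # Example paths only; adjust to whatever the host actually has.
    $Conf{BackupFilesExclude} = {
        '/' => [
            '/tmp',
            '/var/tmp',
            '/var/cache',
        ],
    };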

But even so, 3 hours would still be a reasonable backup window for a lot of purposes.

> So you can see my concerns :-( And I have a server where the full backup
> took about 72 hours.

The first full is a special case since you copy everything, and the
2nd full has to uncompress everything on the server to do the
block-checksum compares.  If you use checksum caching, the 3rd full
will actually be the one to time.
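
(Checksum caching in 3.x is turned on by adding --checksum-seed=32761
to the rsync argument lists in config.pl.  A sketch, assuming the
stock config where those arrays are already defined earlier in the
file:)

    # Append the checksum-seed option to both the backup and restore
    # argument lists already defined in config.pl.
    push @{$Conf{RsyncArgs}},        '--checksum-seed=32761';
    push @{$Conf{RsyncRestoreArgs}}, '--checksum-seed=32761';

    # Fraction of cached checksums re-verified on each full; 0.01 is
    # the shipped default.
    $Conf{RsyncCsumCacheVerifyProb} = 0.01;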

> I wanted the backuppc machine (with 6 TB HDD) to be able to backup all my
> servers (there are many), but if backing up of 1 server can take days with
> incremental, then this is not a solution.

If you have sufficient RAM, you can run at least a couple of backups
in parallel.  And you can stagger the fulls/incrementals so they
don't all end up doing the long runs on the same day.
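
(The knobs for that are $Conf{MaxBackups} for how many run at once,
plus the full/incremental periods.  The numbers below are just
illustrations:)

    # Server-wide cap on concurrent backups.
    $Conf{MaxBackups} = 2;

    # Days between fulls and incrementals (6.97/0.97 are the shipped
    # defaults).  Giving hosts slightly different FullPeriod values in
    # their per-host config.pl keeps the long full runs from all
    # landing on the same day.
    $Conf{FullPeriod} = 6.97;
    $Conf{IncrPeriod} = 0.97;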

> The truth is that I have been experimenting with many backup solutions in
> the past, and they are either very complicated to handle (bacula), or aren't
> reliable. BackupPC was the numero uno, until I discovered this. It can even
> work together with autoloader

You could try Amanda if you are going to tape.  It knows how to use
the --listed-incremental feature of GNU tar.

-- 
   Les Mikesell
    lesmikesell AT gmail DOT com
