Subject: Re: [BackupPC-users] improving the deduplication ratio
From: Ludovic Drolez <ldrolez AT debian DOT org>
To: Tino Schwarze <backuppc.lists AT tisc DOT de>
Date: Wed, 16 Apr 2008 16:23:26 +0200
On Wed, Apr 16, 2008 at 03:23:43PM +0200, Tino Schwarze wrote:
> While it would work (and I thought about that myself), I'm not keen on
> having even more directories in the BackupPC file system. We're talking
> about hundreds of thousands of *additional* files here, if the chunk
> size is, say 64k. The File::Rsync stuff would need rewriting though (but
> I'm no developer, I'm just guessing).

Yes, and that's why a database would be more efficient than hard
links...

I've just read that Diligent ProtecTIER can store all the links for a
1 PB pool (yes! petabytes!) in just 4 GB of memory. With the links in
memory, they can achieve very high backup speeds and do on-the-fly
deduplication.
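To make the idea concrete, here is a rough sketch (not BackupPC code;
the chunk size, file names, and index layout are all my own
assumptions) of keeping the chunk "links" in a plain in-memory hash
instead of hard links:

  #!/usr/bin/perl
  # Sketch only: an in-memory chunk index for on-the-fly dedup.
  use strict;
  use warnings;
  use Digest::MD5 qw(md5);

  my %index;                 # chunk digest -> offset in the pool file
  my $chunk_size = 65536;    # 64k, as discussed above
  my $pool_offset = 0;

  open my $pool, '>>', 'pool.dat' or die "pool: $!";
  binmode $pool;

  sub store_chunk {
      my ($data) = @_;
      my $digest = md5($data);          # 16-byte key per chunk
      unless (exists $index{$digest}) { # new chunk: append to pool
          print {$pool} $data;
          $index{$digest} = $pool_offset;
          $pool_offset += length $data;
      }
      return $index{$digest};           # duplicate: reuse old offset
  }

  # Deduplicate a file chunk by chunk.
  open my $in, '<', $ARGV[0] or die "$ARGV[0]: $!";
  binmode $in;
  my @offsets;
  while (read($in, my $buf, $chunk_size)) {
      push @offsets, store_chunk($buf);
  }
  close $in;
  printf "%d chunks, %d unique\n", scalar @offsets, scalar keys %index;

Note that a naive per-chunk index like this costs tens of bytes per
entry (16-byte digest plus an offset, before Perl's hash overhead), so
a 1 PB pool at 64k chunks (~16 billion chunks) would need hundreds of
gigabytes of RAM. ProtecTIER's 4 GB figure must rely on a much coarser
or more compact index than this sketch.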

Maybe that's the way to go for BackupPC?

Cheers,

-- 
Ludovic Drolez.

http://www.palmopensource.com               - The PalmOS Open Source Portal
http://www.drolez.com      - Personal site - Linux, Zaurus and PalmOS stuff

_______________________________________________
BackupPC-users mailing list
BackupPC-users AT lists.sourceforge DOT net
List:    https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:    http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/