Subject: Re: Wishlist Item
From: Doug Thorneycroft <dthorneycroft AT LACSD DOT ORG>
To: ADSM-L AT VM.MARIST DOT EDU
Date: Tue, 14 Sep 2004 07:38:51 -0700
There are products out there that take it one step further by
storing the data in "chunks" rather than whole files, and keeping only
one copy of each chunk. They use an algorithm on the clients to
identify the chunks, and claim excellent performance.
I've been to a presentation from Avamar, but I don't know anyone using
it, so I can't speak to its reliability or performance.
http://www.avamar.com/
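
For what it's worth, here's a rough sketch of the chunk-store idea in
Python. This is only an illustration, not Avamar's actual algorithm:
the chunk size, the hash, and the in-memory dict are made up for the
example, and such products reportedly use variable-size,
content-defined chunks rather than the fixed-size ones used here.

    import hashlib

    # Illustration of chunk-level deduplication: split each file into
    # chunks, hash each chunk, and store a chunk's bytes only the
    # first time that hash is seen. Fixed-size chunks and the
    # in-memory dict are simplifications for the sketch.

    CHUNK_SIZE = 64 * 1024    # 64 KB; an arbitrary choice
    chunk_store = {}          # digest -> chunk bytes (stands in for the server)

    def backup_file(path):
        """Return the list of chunk digests that reconstructs the file."""
        recipe = []
        with open(path, "rb") as f:
            while chunk := f.read(CHUNK_SIZE):
                digest = hashlib.sha256(chunk).hexdigest()
                if digest not in chunk_store:   # only new chunks get stored
                    chunk_store[digest] = chunk
                recipe.append(digest)
        return recipe

    def restore_file(recipe, path):
        """Rebuild a file by concatenating its chunks in order."""
        with open(path, "wb") as f:
            for digest in recipe:
                f.write(chunk_store[digest])

Since identical chunks hash to the same digest, two clients holding the
same file (or even partially overlapping files) add nothing new to the
store beyond the first copy of each chunk.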

Doug Thorneycroft
County Sanitation Districts of Los Angeles County
(562) 699-7411 Ext. 1058
FAX (562) 699-6756
dthorneycroft AT lacsd DOT org



-----Original Message-----
From: Coats, Jack [mailto:Jack.Coats AT BANKSTERLING DOT COM]
Sent: Monday, September 13, 2004 1:25 PM
To: ADSM-L AT VM.MARIST DOT EDU
Subject: Wishlist Item


Yes, I know I am dreaming, but...



An open source program on SourceForge, pcbackup or BackupPC or
something like that, has a very nice feature.



If a file is already backed up, it keeps only one copy of that file for
ALL its clients!  What technique does it use to figure out whether the
files are identical without comparing them byte for byte?  I didn't
research it that far, but I assume it uses something like the file size
plus a checksum of some kind.
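
Something like the following Python sketch is what I have in mind.
This is just a guess at the technique, not what BackupPC actually
does; the store and function names are invented for the example.

    import hashlib
    import os

    # A guess at file-level deduplication keyed on (size, checksum).
    # It shows how a size + strong-hash key lets a server keep one
    # copy of each unique file across all clients.

    file_store = {}   # (size, digest) -> the single stored copy

    def dedup_key(path):
        """Identify a file by its size plus a SHA-256 checksum."""
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for block in iter(lambda: f.read(1 << 20), b""):  # 1 MB reads
                h.update(block)
        return (os.path.getsize(path), h.hexdigest())

    def backup(path):
        """Store the file's contents only if no client has sent it before."""
        key = dedup_key(path)
        if key not in file_store:
            with open(path, "rb") as f:
                file_store[key] = f.read()
        return key   # each client's catalog records just this key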



Anyway, if the client computers you are backing up are largely
identical, keeping one copy rather than N copies beats any compression
known to man!  (For instance, 100 clients each holding the same 2 GB of
operating system files would take 2 GB instead of 200 GB.)  It would
mean another field or so in the database for every file, but it might
be worth it!  At least as an option.



... Jack
