Subject: Re: [ADSM-L] Data Deduplication
From: Curtis Preston <cpreston AT GLASSHOUSE DOT COM>
To: ADSM-L AT VM.MARIST DOT EDU
Date: Wed, 29 Aug 2007 12:35:39 -0400
First, I would say the only thing this post shows is that Diligent had a
better de-dupe ratio with this customer's data -- not that Diligent's
de-dupe is better overall.  The different vendors use VERY DIFFERENT ways
to scan the incoming data and identify redundant pieces of data.  Those
different ways will work better or worse for different environments and
different types of backup software and backed-up data.
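
To put a rough number on "very different ways," here's a toy, vendor-neutral
Python sketch.  The chunk sizes, the simulated file, and the nightly-change
pattern are all made-up assumptions, not anyone's actual algorithm -- the
only point is that the same set of nightly fulls can de-dupe quite
differently depending on nothing more than how the stream is chunked before
fingerprinting.

# Toy sketch: same backups, two chunk sizes, two very different ratios.
import hashlib
import random

def dedupe_ratio(streams, chunk_size):
    """Logical bytes received vs. unique chunk bytes actually stored."""
    store, logical, physical = set(), 0, 0
    for stream in streams:
        for i in range(0, len(stream), chunk_size):
            c = stream[i:i + chunk_size]
            logical += len(c)
            fp = hashlib.sha256(c).hexdigest()
            if fp not in store:          # chunk not seen before: costs disk
                store.add(fp)
                physical += len(c)
    return logical / physical

random.seed(1)
base = bytes(random.getrandbits(8) for _ in range(1 << 18))   # ~256 KB "file"

# Seven nightly full backups of the same file, each night with a few
# scattered 1 KB regions overwritten (a crude stand-in for daily change).
nights = []
for n in range(7):
    copy = bytearray(base)
    for r in range(8):
        off = ((n * 31 + r * 17) * 1024) % (len(copy) - 1024)
        copy[off:off + 1024] = bytes(1024)
    nights.append(bytes(copy))

for chunk_size in (4096, 65536):
    ratio = dedupe_ratio(nights, chunk_size)
    print("chunk size %6d bytes -> de-dupe ratio %.1f : 1" % (chunk_size, ratio))

Smaller chunks catch more of the unchanged data around each edit, at the
cost of a bigger fingerprint index; real products balance that trade-off
(and detect redundancy) in very different ways, which is part of why results
vary so much from one environment to another.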

I've tested a number of these products in a number of environments and
TRUST ME: your mileage will vary.  The best product for one environment is
NOT the best product for another.

---
W. Curtis Preston
Backup Blog @ www.backupcentral.com
VP Data Protection, GlassHouse Technologies 

-----Original Message-----
From: ADSM: Dist Stor Manager [mailto:ADSM-L AT VM.MARIST DOT EDU] On Behalf Of
Paul Zarnowski
Sent: Wednesday, August 29, 2007 8:39 AM
To: ADSM-L AT VM.MARIST DOT EDU
Subject: Re: [ADSM-L] Data Deduplication

Any idea why Diligent's dedup ratio is better?  What's different
about the dedup algorithm that makes it work better?


At 06:29 PM 8/28/2007, Curtis Preston wrote:
>That sounds about right.  Data Domain's a good product with a lot of
>happy customers, but TSM customers who are only backing up files and
>only keeping 3 versions aren't going to be among them. ;)  You've got to
>back up database/app/email type data that does recurring full backups
>and/or keep a whole lot more than 3 versions to have de-dupe make sense
>for you.  That's not a Data Domain thing.  That's just how de-dupe
>works.
>
>In addition, it won't work if you use it as you would normally use a
>disk pool (1-2 days of backups and then move to tape).  There won't be
>anything to de-dupe against, and you'll get close to nothing.  You need
>to leave your onsite backups permanently on it for de-dupe to work.
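
To make that retention point concrete, here's a toy Python sketch (invented
sizes and change rates, not a claim about Data Domain or anyone else): the
ratio comes from redundancy across the backups you keep on the box, so a
pool that gets flushed to tape every day or two leaves new chunks with
almost nothing to match against.

# Toy sketch: same 14 nightly fulls, with and without keeping them on the pool.
import hashlib
import random

CHUNK = 4096

def add_backup(stream, store):
    """Add one backup to a pool; return bytes that actually hit disk."""
    written = 0
    for i in range(0, len(stream), CHUNK):
        c = stream[i:i + CHUNK]
        fp = hashlib.sha256(c).hexdigest()
        if fp not in store:              # only previously unseen chunks cost space
            store[fp] = len(c)
            written += len(c)
    return written

random.seed(2)
base = bytes(random.getrandbits(8) for _ in range(1 << 18))      # ~256 KB "file"
fulls = []
for night in range(14):                                          # 14 nightly fulls
    copy = bytearray(base)
    copy[night * CHUNK:(night + 1) * CHUNK] = bytes(CHUNK)       # one block changes per night
    fulls.append(bytes(copy))

logical = sum(len(f) for f in fulls)

# Case 1: every nightly full stays on the de-dupe pool for its whole life.
pool = {}
physical_kept = sum(add_backup(f, pool) for f in fulls)

# Case 2: the pool is flushed to tape after each night, so each full
# arrives with no earlier chunks left to de-dupe against.
physical_flushed = sum(add_backup(f, {}) for f in fulls)

print("backups kept on pool : %4.1f : 1" % (logical / physical_kept))
print("pool flushed daily   : %4.1f : 1" % (logical / physical_flushed))

With the backups kept on the pool, night two onward is mostly made of chunks
that are already there; with the pool flushed daily, every chunk looks new
and the ratio stays at roughly 1:1.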
>
>---
>W. Curtis Preston
>Backup Blog @ www.backupcentral.com
>VP Data Protection, GlassHouse Technologies
>
>-----Original Message-----
>From: ADSM: Dist Stor Manager [mailto:ADSM-L AT VM.MARIST DOT EDU] On Behalf Of
>Dirk Kastens
>Sent: Tuesday, August 28, 2007 9:18 AM
>To: ADSM-L AT VM.MARIST DOT EDU
>Subject: Re: [ADSM-L] Data Deduplication
>
>Hi,
>
>Jon Evans wrote:
> > Dirk
> >
> > I also tried Data Domain and was not impressed. I now use Diligent's
> > ProtecTIER and it's far more impressive. It's scalable, reasonably
> > priced, achieves throughput of 200 MB per second and better, and
> > factoring ratios of over 10 to 1.
>
>We mainly back up normal files and only use 3 backup versions, so the
>compression will not be more than 3:1 or 5:1. The best results can be
>achieved with databases and application data like Exchange. That's what
>the people from DataDomain said. I'm just running another test with
>MySQL and Domino data. Let's wait and see :-)
>
>--
>Regards,
>
>Dirk Kastens
>Universitaet Osnabrueck, Rechenzentrum (Computer Center)
>Albrechtstr. 28, 49069 Osnabrueck, Germany
>Tel.: +49-541-969-2347, FAX: -2470


--
Paul Zarnowski                            Ph: 607-255-4757
Manager, Storage Services                 Fx: 607-255-8521
719 Rhodes Hall, Ithaca, NY 14853-3801    Em: psz1 AT cornell DOT edu
