Hi Eric,
I agree that today, on slower machines, these techniques have a
performance impact.
But on the other hand, HW improves. Data growth is tremendous, and on a
midrange timescale I see an end to a powerful concept like
"incremental backup". We are getting flooded with data, and we will
spend a lot of resources managing it.
I also agree with Nicholas Cassimatis, who asked why we should back up
the n-th version if we have software distribution.
In theory he is right, but in the PC world there is no concept of a
clear separation between the base software and the customized portion
of a software package. For example, are you really sure you know where
your Word settings are (Registry, INI files, .DOT and .DOC files,
settings for printing, etc.)?
It is not like OS/390 (formerly called MVS)! Do you really know which
files to restore after a fresh install of Word? And what about all the
other products....?
As long as this point is not solved by the SW vendors, we are not able
to separate (seen from the backup point of view) the data we could
rebuild via SW-Distribution from the data we have to recover from
saves. So we are forced to use the "brute force" method to be sure we
get everything. Solving that point would tremendously reduce the amount
of data we have to back up.
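
If the vendors did give us such a split, the selection logic itself
would be trivial. Here is a minimal sketch in Python, assuming purely
hypothetical path lists (no vendor defines these today, which is
exactly the problem):

# Hypothetical split: paths SW-Distribution could rebuild vs. user
# data that must come from the backup saves.  The prefixes below are
# illustrative assumptions only.
REBUILDABLE_PREFIXES = ("C:/Program Files/", "C:/Windows/")

def must_back_up(path):
    """True if the file cannot be rebuilt by software distribution."""
    return not path.startswith(REBUILDABLE_PREFIXES)

files = ["C:/Program Files/Winword/winword.exe",
         "C:/My Documents/letter.doc"]
to_save = [f for f in files if must_back_up(f)]
# Only letter.doc would be saved; winword.exe would be reinstalled
# via SW-Distribution instead of restored from tape.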
Another approach is concepts like NC (Network Computers), but there too
we see limitations from the SW and HW vendors.
I still believe that block-level and/or checksum processing will be a
solution; maybe not today, but in 2-3 years it should be.
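
To make that concrete, here is a minimal sketch of checksum-based
block-level change detection in Python. The block size and the manifest
handling are my own assumptions, not any product's actual
implementation:

import hashlib

BLOCK_SIZE = 64 * 1024  # assumed block size; a real product would tune this

def block_checksums(path):
    """One MD5 digest per fixed-size block of the file."""
    sums = []
    with open(path, "rb") as f:
        while True:
            block = f.read(BLOCK_SIZE)
            if not block:
                break
            sums.append(hashlib.md5(block).hexdigest())
    return sums

def changed_blocks(old_sums, new_sums):
    """Indices of blocks that differ from the previous backup run."""
    return [i for i, s in enumerate(new_sums)
            if i >= len(old_sums) or s != old_sums[i]]

# Only the changed blocks (plus the new checksum list) would be sent to
# the backup server, instead of the whole file on every incremental run.

The win: a small change inside a huge file costs one block on the wire,
not the whole file.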
------------------------------------------
Kind Regards
Andreas Buser
Tel: ++41 61 285 73 21 Fax: ++41 61 285 70 70
Email: Andreas.Buser AT Basler DOT ch
Address:
Basler Versicherungsgesellschaft
Andreas Buser
Abt. Informatik
Aeschengraben 21
4002 Basel
Switzerland