Subject: [Veritas-bu] Checking to see if millions of files are backed up?
From: pkoster at ci.grand-rapids.mi.us (Koster, Phil)
Date: Wed, 28 Mar 2007 08:24:25 -0400
Justin,

Not sure what your timeline is, but we run a script daily that pulls the
backup status and compiles it into a database.  That way there is not much
overhead, since each run only covers the last 24 hours of backups, while
the database gives us the flexibility to run reports, etc.
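
For what it's worth, a rough sketch of that kind of nightly collector is
below.  It is only an illustration, not our actual script: the bpimagelist
flags, the SQLite database, and the file names are assumptions you would
adapt to your own environment.

# Minimal sketch of a nightly status collector, assuming bpimagelist is on
# the PATH of the master server and SQLite stands in for "a db".  Output
# is stored as raw lines; reporting queries can parse them later.
import sqlite3
import subprocess
from datetime import date

DB = "backup_status.db"                        # hypothetical path
CMD = ["bpimagelist", "-hoursago", "24", "-U"]

conn = sqlite3.connect(DB)
conn.execute("""CREATE TABLE IF NOT EXISTS daily_status (
                    run_date TEXT,
                    line     TEXT)""")

# Pull only the last 24 hours of images, which keeps the overhead low.
out = subprocess.run(CMD, capture_output=True, text=True, check=True)

rows = [(date.today().isoformat(), line)
        for line in out.stdout.splitlines() if line.strip()]
conn.executemany("INSERT INTO daily_status VALUES (?, ?)", rows)
conn.commit()
conn.close()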

Thanks.

Phil
456-3136

-----Original Message-----
From: veritas-bu-bounces at mailman.eng.auburn.edu
[mailto:veritas-bu-bounces at mailman.eng.auburn.edu] On Behalf Of Justin
Piszcz
Sent: Monday, March 26, 2007 4:27 PM
To: Veritas-bu at mailman.eng.auburn.edu
Subject: [Veritas-bu] Checking to see if millions of files are backed
up?

If one were to create a script to ensure that the files on a filesystem
are backed up before removing them, what is the best data-store model for
doing so?

Obviously, if you have > 1,000,000 files in the catalog and you need to
check each of them, you do not want to run bplist -B -C -R 999999
/path/to/file/1.txt once per file.  However, you do not want to grep "1"
one_gigabyte_catalog.txt for each file either; there is too much overhead
in either case.

I have a few ideas that involve neither of these, but I was wondering if
anyone out there has already done something similar that performs well.
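
For reference, one pattern along those lines (a sketch only, with a
placeholder client name and database, not something I am claiming anyone
runs in production) is to dump the client's file list once with bplist,
load it into an indexed table, and then answer each per-file check with an
indexed lookup instead of another catalog query or a grep:

# Illustrative only: one catalog pass to build a local index, then cheap
# membership checks against it.
import sqlite3
import subprocess

CLIENT = "someclient"                        # hypothetical client name
DB = "catalog_index.db"                      # hypothetical path

conn = sqlite3.connect(DB)
conn.execute("CREATE TABLE IF NOT EXISTS backed_up (path TEXT PRIMARY KEY)")

# Recursively list everything the catalog has for this client.
out = subprocess.run(["bplist", "-C", CLIENT, "-R", "999999", "/"],
                     capture_output=True, text=True, check=True)

conn.executemany("INSERT OR IGNORE INTO backed_up VALUES (?)",
                 ((line,) for line in out.stdout.splitlines() if line))
conn.commit()

def is_backed_up(path):
    # Indexed lookup instead of a bplist run or a grep per file.
    cur = conn.execute("SELECT 1 FROM backed_up WHERE path = ?", (path,))
    return cur.fetchone() is not None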

Justin.
_______________________________________________
Veritas-bu maillist  -  Veritas-bu at mailman.eng.auburn.edu
http://mailman.eng.auburn.edu/mailman/listinfo/veritas-bu