A couple of other suggestions from someone who has to back up 55 million
files with an average size of 50K:
1. Disable OTM (Open Transaction Manager) if possible.
2. Make sure the job tracker is disabled on the client machine.
I use a separate policy for each subset of files I back up, which gives me
more control over what runs when. It also contains failures: with a single
policy, a failed backup means restarting the whole thing, but if you break
it up, you can just rerun the failed portion.
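To answer your question directly: either approach works. If you'd rather
keep a single policy, NetBackup can also split the file list into separate
streams with the NEW_STREAM directive, as long as "Allow multiple data
streams" is checked in the policy attributes. Each stream is scheduled as
its own job, so a failed stream can be rerun by itself. A rough sketch of
the file list (the paths are made up, substitute your own):

    NEW_STREAM
    D:\data\groupA
    NEW_STREAM
    D:\data\groupB
    NEW_STREAM
    D:\data\groupC

I still lean toward separate policies, though, since they also let you give
each chunk its own schedule and backup window.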
Matt
-----Original Message-----
From: Gardner, Jesse [mailto:Jesse.Gardner AT storaenso DOT com]
Sent: Thursday, December 12, 2002 11:32 AM
To: (veritas-bu AT mailman.eng.auburn DOT edu)
Subject: [Veritas-bu] Breaking up images for optimization of many small
files
NetBackup 4.5 with the latest patches on Win2000 SP3, using an HP SureStore
E LTO Ultrium tape library with 2 drives and 20 slots.
We've got a situation where we're backing up almost 6 million files, about
150GB. It takes over 24 hours for a full backup. Right now we need to
restore everything from the production box to a new test server, and it is
just horrendous. I found several discussions on this mailing list that
suggest breaking up the file list into smaller chunks, so that NetBackup
doesn't have to crunch all 6 million files at one time.
My question is: To do this, do I create several entries in the file list
portion of one policy, or do I have to create multiple policies?
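For instance (these paths are just illustrative, not our real layout),
would one policy whose file list reads

    D:\prod\share\a_to_h
    D:\prod\share\i_to_p
    D:\prod\share\q_to_z

actually be processed as independent chunks, or does each entry need its
own policy before it can run (and be rerun) on its own?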
Jesse Gardner
510 High Street
Wisconsin Rapids, WI 54495
(715) 422-1516
_______________________________________________
Veritas-bu maillist - Veritas-bu AT mailman.eng.auburn DOT edu
http://mailman.eng.auburn.edu/mailman/listinfo/veritas-bu