Veritas-bu

Subject: [Veritas-bu] Netbackup Database size and speed
From: Patrick Boyle PBoyle AT esri DOT com
Date: Tue, 17 Oct 2000 18:41:26 -0700
Compression is the issue here, I believe.
Our NBU database is over 85 GB, and searching for restores
isn't an issue until we start searching the portion of the database
that is compressed. We compress after 60 days (the "Delay to Compress
Database" setting), so for most restores compression isn't a factor
and restores/searches are quick. And my master is only a Sun Ultra 2!

To keep the database file size manageable, we have used symbolic links
within the DB to move portions off to smaller disks.
We are using nine disks of 9 or 18 GB each.
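The symlink trick above can be sketched roughly as follows. This is a hypothetical illustration only: on a real master the catalog lives under /usr/openv/netbackup/db/images, but the demo uses temporary directories and a made-up client name so it can run anywhere.

```python
import os
import shutil
import tempfile

# Stand-ins for the catalog disk and the smaller spill-over disk
# (hypothetical; a real setup would use the actual mount points).
root = tempfile.mkdtemp()
spill = tempfile.mkdtemp()

images = os.path.join(root, "images")
os.makedirs(os.path.join(images, "client-a"))
open(os.path.join(images, "client-a", "catalog-file"), "w").close()

# Move one client's image directory to the other disk, then leave a
# symlink behind so the original catalog path still resolves.
moved = shutil.move(os.path.join(images, "client-a"),
                    os.path.join(spill, "client-a"))
os.symlink(moved, os.path.join(images, "client-a"))

# Anything reading the original path still finds the files.
still_there = os.path.isfile(
    os.path.join(images, "client-a", "catalog-file"))
print(still_there)  # True
```

The point of the sketch is only that the catalog path stays stable while the data physically lives on another disk; that is also why the DB backup path list and the recovery procedure have to account for the links.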

This has given us the ability to break the DB backup into two pieces.
It does complicate the path list and modify the recovery process,
but it is well worth the little bit of extra complexity.
If you want details on that, please ask.

Further, Sony AIT-2 provides ample tape cartridge capacity for
the DB backup if we choose to do it all in one shot: 150 GB with normal
compression, and we get far better than the advertised compression with
AIT drives.

Hope this provides some help.

Patrick Boyle
ESRI Systems Administration
Voice:          [909] 793-2853  Ext. 1-1461
Numeric Pager:  [909] 777-2791
E-mail:         PBoyle AT esri DOT com


-----Original Message-----
From:   veritas-bu-admin AT eng.auburn DOT edu
[mailto:veritas-bu-admin AT eng.auburn DOT edu] On Behalf Of Jonathan Geibel
Sent:   Tuesday, October 17, 2000 11:41 AM
To:     veritas-bu AT mailman.eng.auburn DOT edu
Subject:        [Veritas-bu] Netbackup Database size and speed

Hello,

Question for you all:

Has anyone had trouble with the NetBackup database size getting out
of hand and creating severe performance problems? Has anyone come
up with a workable solution to this problem?

With 13 TB of data being backed up every week, our NetBackup db/images
directory has grown to an obscene size (60 GB of flat-file data, compressed).

Because NetBackup doesn't use a real database to hold this information
(just flat files), the scalability of this design is rather weak.

These huge flat-file databases are causing us major performance problems
when trying to do simple restores. The simple act of looking for a
single file can take several hours to complete as it chugs through all of
its backup files.

We've indexed everything, but that doesn't seem to help.

I've heard there are tools to dump this information into an actual SQL
database, which could be extremely useful for hunting down the
backups of specific files.

Has anyone done this? We were thinking of possibly dumping the data once
a day into a database to run quick searches against.
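A minimal sketch of that once-a-day dump idea, assuming the catalog records have already been exported to text (a real export might come from a tool like bpimagelist; the field layout, client names, and paths below are invented for the demo):

```python
import sqlite3

# Invented export format for illustration: client, date, schedule, path.
records = """\
client-a 2000-10-16 full /export/home/jon/report.txt
client-b 2000-10-16 incr /var/log/syslog
client-a 2000-10-17 incr /export/home/jon/report.txt
"""

db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE images
              (client TEXT, backed_up TEXT, sched TEXT, path TEXT)""")
# The index the flat-file catalog effectively lacks.
db.execute("CREATE INDEX idx_path ON images(path)")

rows = [line.split(None, 3) for line in records.splitlines()]
db.executemany("INSERT INTO images VALUES (?, ?, ?, ?)", rows)

# Finding every backup of a single file is now one indexed query
# instead of a multi-hour scan through flat files.
hits = db.execute("SELECT client, backed_up FROM images WHERE path = ?",
                  ("/export/home/jon/report.txt",)).fetchall()
print(hits)  # [('client-a', '2000-10-16'), ('client-a', '2000-10-17')]
```

The dump would go stale between loads, so this only suits searches, not authoritative restores; the actual restore would still go through NetBackup against its own catalog.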

I'd be curious to hear whether anyone else has run into these performance
problems.

Jon


=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=
Walt Disney Feature Animation      The Secret Lab
Senior Systems Engineer              818.526.3051
=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=





_______________________________________________
Veritas-bu maillist  -  Veritas-bu AT mailman.eng.auburn DOT edu
http://mailman.eng.auburn.edu/mailman/listinfo/veritas-bu
