Subject: Re: [Networker] writing to tape
From: Stan Horwitz <stan AT TEMPLE DOT EDU>
To: NETWORKER AT LISTSERV.TEMPLE DOT EDU
Date: Tue, 29 Apr 2008 12:21:55 -0400
On Apr 29, 2008, at 11:40 AM, mallard20 wrote:

I currently use an HP StorageWorks 1/8 Autoloader Ultrium LTO-1 jukebox. My data is encrypted, and it currently takes 4+ hours to write the data to each tape. Any ideas why it is taking so long to write this data to tape? Any help is greatly appreciated.
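
As a rough sanity check on that 4+ hour figure (an illustrative aside, assuming roughly full cartridges: LTO-1 holds about 100 GB native and writes at about 15 MB/s native):

    # Back-of-envelope check: time to fill a 100 GB (native) LTO-1 cartridge
    # at the drive's 15 MB/s native rate, versus the rate implied by 4 hours/tape.
    NATIVE_RATE_MB_S = 15
    NATIVE_CAPACITY_MB = 100 * 1024

    minutes_at_native_speed = NATIVE_CAPACITY_MB / NATIVE_RATE_MB_S / 60
    print(f"Full tape at native speed: ~{minutes_at_native_speed:.0f} min")   # ~114 min

    implied_rate = NATIVE_CAPACITY_MB / (4 * 3600)
    print(f"Rate implied by 4 hours/tape: ~{implied_rate:.1f} MB/s")          # ~7.1 MB/s

At native speed a full cartridge should take just under two hours, so 4+ hours suggests only about 7 MB/s is reaching the drive; at that rate a streaming LTO drive repeatedly stops and repositions while it waits for data (shoe-shining), which slows things down further.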

Without knowing anything about the rest of your NetWorker data zone, no, I have no idea. What kind of network are you backing up over, and how much data is involved? Which NetWorker version and edition are you using? What kind of client operating systems are you backing up? How is the encryption being done? What you should do is go to EMC's Powerlink site (http://powerlink.emc.com) and download the performance tuning guide. That document has several tips on how to optimize your backups.
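
How the encryption is being done matters in particular: if the data is encrypted in software on the client before it is sent, the client's encryption rate can cap the whole backup. A minimal way to gauge that, as a sketch assuming Python with the third-party "cryptography" package (NetWorker's own aes directive may use a different cipher and implementation):

    # Measure raw AES encryption throughput on the client to see whether
    # software encryption alone could starve a 15 MB/s LTO-1 drive.
    import os
    import time
    from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

    key = os.urandom(32)             # 256-bit key, illustrative only
    iv = os.urandom(16)
    encryptor = Cipher(algorithms.AES(key), modes.CBC(iv)).encryptor()

    chunk = os.urandom(1024 * 1024)  # 1 MB of random data
    total_mb = 256                   # encrypt 256 MB and time it

    start = time.monotonic()
    for _ in range(total_mb):
        encryptor.update(chunk)
    encryptor.finalize()
    elapsed = time.monotonic() - start

    print(f"AES-256-CBC throughput: ~{total_mb / elapsed:.0f} MB/s")

If that number comes out well below 15 MB/s, the client CPU, not the drive or the network, is the likely bottleneck.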

--
Stan Horwitz
Temple University
Enterprise Systems Group
stan AT temple DOT edu

