Subject: Re: [Networker] clones spread across 2 tapes instead of 1 problem.
From: "robert.maiello AT thomson DOT com" <robert.maiello AT THOMSON DOT COM>
To: NETWORKER AT LISTMAIL.TEMPLE DOT EDU
Date: Tue, 10 Feb 2004 11:48:25 -0500
Yes, NetWorker will do that. There is not much one can do to control it. If
you search the archives there was mention of it and an RFE that you could
add your name to.

In version 7.1.1 there is LGTpa 49963, "Mounting unnecessary volumes
during cloning operations" ...but I don't know if that is the same thing.
With version 7.1.1 I have now seen it clone to one tape instead of two.
I can't say it does that every time, though, and I can't say there is
any control over it. Has anyone else noticed this with the new version?

Robert Maiello
Thomson Healthcare

On Mon, 9 Feb 2004 16:05:40 -0500, Evan Gold <egold AT FSA DOT COM> wrote:

>Hello,
>I run Legato 7 with LTO2 tapes that hold 400 GB per tape, and I have 6 tape
>drives.
>Sometimes when my clones for a group kick off (I run them automatically),
>Legato will load 2 tapes and put the clones onto 2 tapes when they would
>easily fit onto one tape. The total size of the clones is only about 150 GB.
>
>Is there a way to make the clones only use the minimum number of tapes
>necessary?
>
>thank you in advance.
>
>Evan
>

--
Note: To sign off this list, send a "signoff networker" command via email
to listserv AT listmail.temple DOT edu or visit the list's Web site at
http://listmail.temple.edu/archives/networker.html where you can
also view and post messages to the list.
=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=
